{"id":94987,"date":"2023-11-29T08:00:00","date_gmt":"2023-11-29T16:00:00","guid":{"rendered":""},"modified":"2025-05-27T19:49:01","modified_gmt":"2025-05-28T02:49:01","slug":"exploring-draggan-implementation-using-onnx-runtime","status":"publish","type":"post","link":"https:\/\/opensource.microsoft.com\/blog\/2023\/11\/29\/exploring-draggan-implementation-using-onnx-runtime\/","title":{"rendered":"Exploring DragGAN implementation using ONNX Runtime"},"content":{"rendered":"\n<p>Generative Adversarial Networks (GANs) are deep learning architectures that generate high quality synthetic images of people, animals or objects around us. These networks have enabled us to provide text-based prompts for generating realistic images, modifying existing images and to complete missing information in training datasets. StyleGAN is a type of GAN designed to control the style of the generated images for high quality and detailed results. Once such realistic images are created, any minor tweaks or updates can be performed through the DragGAN model, which allows minor edits to the generated images without having to recreate the images again.<\/p>\n\n\n\n<p>In this Blog we will describe our implementation of the <a href=\"https:\/\/vcai.mpi-inf.mpg.de\/projects\/DragGAN\/\" target=\"_blank\" rel=\"noreferrer noopener\">DragGAN<\/a><sup>2 <\/sup>algorithm, based on <a href=\"https:\/\/github.com\/NVlabs\/stylegan2-ada-pytorch\" target=\"_blank\" rel=\"noreferrer noopener\">StyleGAN<\/a><sup>1<\/sup>, using <a href=\"https:\/\/onnxruntime.ai\/\">ONNX Runtime<\/a>. We will give a technical overview of the architectures, describe the motivation and discuss challenges and their resolution. We have released Python code for navigating the implementation and included a C# example for integrating the models into a native Windows application. 
We invite readers to explore ONNX Runtime On-Device Training through <a href=\"https:\/\/github.com\/microsoft\/onnxruntime-training-examples\/tree\/master\/DragGAN\" target=\"_blank\" rel=\"noreferrer noopener\">this example<\/a> and leverage it for other image scenarios on edge devices.<\/p>\n\n\n\n<h2 class=\"wp-block-heading\" id=\"stylegan\">StyleGAN<\/h2>\n\n\n\n<p>The StyleGAN decoder (generator) takes as input a latent vector and a set of learned style vectors, which control various aspects of the image&#8217;s appearance, such as its geometry, texture, and color. Through a series of convolutional layers and non-linear operations, the decoder transforms these inputs into a high-resolution image, giving fine-grained control over its visual attributes.&nbsp;<\/p>\n\n\n\n<p>Figure 1 shows example StyleGAN code (mapper and decoder) that takes a random vector, maps it to a latent vector, and generates a detailed image. 
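As a rough illustration of this flow, the sketch below uses toy NumPy stand-ins; the dimensions, the one-layer "networks," and the random weights are illustrative only, not StyleGAN's actual architecture:

```python
import numpy as np

DIM = 512     # latent dimensionality (StyleGAN2 uses 512)
NUM_WS = 14   # one style vector per synthesis layer (depends on resolution)

rng = np.random.default_rng(71)  # seed 71, as in Figure 1
W_map = rng.standard_normal((DIM, DIM)) / np.sqrt(DIM)          # toy mapper weights
W_syn = rng.standard_normal((3 * 64 * 64, DIM)) / np.sqrt(DIM)  # toy decoder weights

def mapping(z):
    """Toy stand-in for the mapping network: random vector z -> latent w."""
    return np.tanh(W_map @ z)

def synthesis(ws):
    """Toy stand-in for the decoder: per-layer styles -> RGB image in [-1, 1]."""
    return np.tanh(W_syn @ ws.mean(axis=0)).reshape(3, 64, 64)

z = rng.standard_normal(DIM)    # random input vector
w = mapping(z)                  # mapped latent vector
ws = np.tile(w, (NUM_WS, 1))    # broadcast: one style vector per layer
image = synthesis(ws)           # image tensor of shape (3, 64, 64)
```

In the real pipeline, the mapping and synthesis steps are the pretrained StyleGAN networks loaded from a stylegan2-ada-pytorch checkpoint.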
Converting it to ONNX format is straightforward.<\/p>\n\n\n<figure class=\"wp-block-image aligncenter size-full is-resized\"><img decoding=\"async\" src=\"https:\/\/opensource.microsoft.com\/blog\/wp-content\/uploads\/2023\/11\/OS-Image-1-2023-11-21-145113.webp\" alt=\"a screen shot of a man\" class=\"wp-image-94990 webp-format\" style=\"aspect-ratio:2.424802110817942;width:800px;height:auto\" srcset=\"\" data-orig-src=\"https:\/\/opensource.microsoft.com\/blog\/wp-content\/uploads\/2023\/11\/OS-Image-1-2023-11-21-145113.webp\"><figcaption class=\"wp-element-caption\"><em>Figure 1 StyleGAN code (left) that produces the image on the right from the metfaces dataset using the seed 71.&nbsp;<\/em><\/figcaption><\/figure>\n\n\n\n<h2 class=\"wp-block-heading\" id=\"draggan\">DragGAN<\/h2>\n\n\n\n<p>If the image generated by StyleGAN is mostly right and just needs minor tweaks to be perfect, the DragGAN algorithm lets the user supply constraints that modify the image into the desired form. If the latent vector created by StyleGAN is close to the user\u2019s needs, DragGAN can optimize the latent vector so that the resulting image matches the user\u2019s intent. 
The user can specify the constraints as pairs of points (source and target), and the DragGAN optimization moves the source points toward their respective target locations (Figures 2a, 2b, 2c).<\/p>\n\n\n\n<div class=\"wp-block-group is-layout-constrained wp-block-group-is-layout-constrained\">\n<figure class=\"wp-block-image aligncenter size-full is-resized\"><img loading=\"lazy\" decoding=\"async\" width=\"484\" height=\"484\" src=\"https:\/\/opensource.microsoft.com\/blog\/wp-content\/uploads\/2023\/11\/image-1-OS-1.jpg\" alt=\"a man wearing a suit and tie looking at the camera\" class=\"wp-image-95020\" style=\"width:300px\" \/><figcaption class=\"wp-element-caption\">Figure 2a<\/figcaption><\/figure>\n<\/div>\n\n\n\n<div class=\"wp-block-group is-layout-constrained wp-block-group-is-layout-constrained\">\n<figure class=\"wp-block-image size-full is-resized\"><img loading=\"lazy\" decoding=\"async\" width=\"484\" height=\"484\" src=\"https:\/\/opensource.microsoft.com\/blog\/wp-content\/uploads\/2023\/11\/Image-2-OS.jpg\" alt=\"a man posing for the camera\" class=\"wp-image-95011\" style=\"width:300px\" \/><figcaption class=\"wp-element-caption\">Figure 2b<\/figcaption><\/figure>\n<\/div>\n\n\n\n<div class=\"wp-block-group is-layout-constrained wp-block-group-is-layout-constrained\">\n<figure class=\"wp-block-image size-full is-resized\"><img loading=\"lazy\" decoding=\"async\" width=\"1024\" height=\"1024\" src=\"https:\/\/opensource.microsoft.com\/blog\/wp-content\/uploads\/2023\/11\/image5.gif\" alt=\"a man wearing a suit and tie looking at the camera\" class=\"wp-image-95013\" style=\"width:300px\" srcset=\"https:\/\/opensource.microsoft.com\/blog\/wp-content\/uploads\/2023\/11\/image5.gif 1024w, https:\/\/opensource.microsoft.com\/blog\/wp-content\/uploads\/2023\/11\/image5-450x450.gif 450w, https:\/\/opensource.microsoft.com\/blog\/wp-content\/uploads\/2023\/11\/image5-650x650.gif 650w\" sizes=\"auto, (max-width: 1024px) 100vw, 1024px\" 
\/><figcaption class=\"wp-element-caption\">Figure 2c<\/figcaption><\/figure>\n<\/div>\n\n\n\n<p><em>Figures 2a, 2b, 2c\u2014Example of DragGAN in action. Top: original image with pairs of handle points. Middle: result of the DragGAN optimization process. Bottom: animation of the intermediate frames during the optimization.<\/em><\/p>\n\n\n\n<p>The following code describes the main optimization loop of the DragGAN algorithm:<\/p>\n\n\n<figure class=\"wp-block-image aligncenter size-full\"><img decoding=\"async\" src=\"https:\/\/opensource.microsoft.com\/blog\/wp-content\/uploads\/2023\/11\/OS-Image-3-2023-11-21-145416.webp\" alt=\"text\" class=\"wp-image-94992 webp-format\" srcset=\"\" data-orig-src=\"https:\/\/opensource.microsoft.com\/blog\/wp-content\/uploads\/2023\/11\/OS-Image-3-2023-11-21-145416.webp\"><figcaption class=\"wp-element-caption\"><em>Figure 3 Skeleton code of the DragGAN optimization loop.&nbsp;<\/em><\/figcaption><\/figure>\n\n\n\n<p>In this blog we skip many technical details of both StyleGAN and DragGAN; the interested reader is encouraged to review the original papers for a more in-depth discussion.<\/p>\n\n\n\n<h2 class=\"wp-block-heading\" id=\"onnx-runtime-ort-and-the-training-apis\">ONNX Runtime (ORT) and the training APIs<\/h2>\n\n\n\n<p>We aimed to showcase the seamless integration of an interactive image manipulation tool into Windows or Linux applications. We started from a PyTorch implementation but also wanted cross-platform compatibility through ONNX Runtime. With its support for various platforms and programming language APIs, ONNX Runtime provides a means to execute our tool as a native application on diverse devices without external dependencies.<\/p>\n\n\n\n<p>Exporting the StyleGAN mapper and decoder models to ONNX is straightforward using the torch.onnx.export method. 
However, the optimizer in DragGAN uses gradient information calculated during the StyleGAN forward pass to minimize a loss function based on the distance between the user\u2019s handle points and their target locations.&nbsp;<\/p>\n\n\n\n<p>To do that, we need to calculate and use the gradient information from StyleGAN, and that is where the training APIs provided through ORT On-Device Training come in.<\/p>\n\n\n\n<h3 class=\"wp-block-heading\" id=\"formulation-as-a-training-problem\">Formulation as a training problem<\/h3>\n\n\n\n<p>ORT On-Device Training enables training models on edge devices without the data ever leaving the device, which is great for personalizing experiences without compromising privacy. So, if training is possible, gradient information must be involved. To use this capability, we need to formulate our algorithm as a learning problem, where the weights we are learning are the part of the latent vector we want to optimize.&nbsp;<\/p>\n\n\n\n<p>This idea leads to the following PyTorch module:<\/p>\n\n\n<figure class=\"wp-block-image aligncenter size-full\"><img decoding=\"async\" src=\"https:\/\/opensource.microsoft.com\/blog\/wp-content\/uploads\/2023\/11\/OS-Image-4-2023-11-21-152406.webp\" alt=\"text\" class=\"wp-image-94993 webp-format\" srcset=\"\" data-orig-src=\"https:\/\/opensource.microsoft.com\/blog\/wp-content\/uploads\/2023\/11\/OS-Image-4-2023-11-21-152406.webp\"><\/figure>\n\n\n\n<p>We use the gradients calculated during the StyleGAN generator pass in the motion supervision part. 
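The shape of this formulation can be sketched as below; the feature extractor and loss are toy stand-ins for the real generator features and motion-supervision loss, and only the structure (a trainable latent slice, with the loss returned as the first output) mirrors the module described here:

```python
import torch

class DragStep(torch.nn.Module):
    """The 'weights' being learned are the slice of the latent we optimize;
    the first output plays the role of the loss."""

    def __init__(self, w_init, num_trainable=6):
        super().__init__()
        # Optimize only the first few style vectors; freeze the rest.
        self.w_opt = torch.nn.Parameter(w_init[:num_trainable].clone())
        self.register_buffer("w_fixed", w_init[num_trainable:].clone())

    def forward(self, target):
        ws = torch.cat([self.w_opt, self.w_fixed], dim=0)
        feat = torch.tanh(ws.mean(dim=0))     # toy stand-in for generator features
        loss = ((feat - target) ** 2).mean()  # toy stand-in for motion supervision
        return loss, feat

step = DragStep(torch.zeros(14, 8))
target = torch.full((8,), 0.5)
opt = torch.optim.Adam(step.parameters(), lr=0.1)
for _ in range(100):          # each forward pass is one DragGAN-style iteration
    opt.zero_grad()
    loss, feat = step(target)
    loss.backward()
    opt.step()
```

Because each forward pass corresponds to one optimization iteration, the intermediate features (and image) can be surfaced to the user after every step.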
The point tracking step can be left outside of this module and implemented in the chosen language as a loop over the feature map (see the draggan_demo.py).<\/p>\n\n\n\n<p>One pass of this model is equivalent to one iteration in the original optimization loop, allowing us to get the intermediate images and give the user visual feedback.<\/p>\n\n\n\n<h3 class=\"wp-block-heading\" id=\"exporting-the-onnx-model-and-training-artifacts\">Exporting the ONNX Model and training artifacts<\/h3>\n\n\n\n<p>Exporting the main module to ONNX is done using the torch.onnx.export method. The additional artifacts that allow the ORT Framework to perform the optimization are exported using the following code (see the <a href=\"https:\/\/github.com\/microsoft\/onnxruntime-training-examples\/blob\/master\/DragGAN\/draggan_demo.py\" target=\"_blank\" rel=\"noreferrer noopener\">draggan_demo.py<\/a>):<\/p>\n\n\n<figure class=\"wp-block-image aligncenter size-full\"><img decoding=\"async\" src=\"https:\/\/opensource.microsoft.com\/blog\/wp-content\/uploads\/2023\/11\/OS-Image-5-2023-11-21-152717.webp\" alt=\"text\" class=\"wp-image-94994 webp-format\" srcset=\"\" data-orig-src=\"https:\/\/opensource.microsoft.com\/blog\/wp-content\/uploads\/2023\/11\/OS-Image-5-2023-11-21-152717.webp\"><figcaption class=\"wp-element-caption\"><em>Figure 5 Example code to export the ONNX Training Artifacts (checkpoint data and eval, optimizer and training models)&nbsp;<\/em><\/figcaption><\/figure>\n\n\n\n<p>Above we intentionally refrain from including a loss argument when calling the artifacts.generate_artifacts method. In such cases, ORT assumes that the ONNX model&#8217;s first output serves as the loss.<\/p>\n\n\n\n<p>The shifted patch loss is defined with two tensors, one of which is detached from the computation graph. 
This detachment signifies the need to eliminate the gradient subgraph originating from the loss gradient with respect to the detached input (see draggan_demo.py for the motion_supervision implementation).&nbsp;<\/p>\n\n\n\n<p>With the creation of the four essential training artifacts\u2014the training model, eval model, optimizer model, and the checkpoint file\u2014we can start the training process with the chosen language binding on the device.<\/p>\n\n\n\n<h2 class=\"wp-block-heading\" id=\"draggan-optimization-with-onnx-runtime\">DragGAN optimization with ONNX Runtime<\/h2>\n\n\n\n<p>For full details of the optimization process, please see draggan_onnx_demo.py, specifically the optimize method. The following code snippet gives the main idea:<\/p>\n\n\n<figure class=\"wp-block-image aligncenter size-full is-resized\"><img decoding=\"async\" src=\"https:\/\/opensource.microsoft.com\/blog\/wp-content\/uploads\/2023\/11\/OS-Image-6-2023-11-21-153107.webp\" alt=\"text\" class=\"wp-image-94995 webp-format\" style=\"aspect-ratio:1.6547406082289804;width:800px;height:auto\" srcset=\"\" data-orig-src=\"https:\/\/opensource.microsoft.com\/blog\/wp-content\/uploads\/2023\/11\/OS-Image-6-2023-11-21-153107.webp\"><figcaption class=\"wp-element-caption\"><em>Figure 6 Skeleton code of the DragGAN optimization loop using the ONNX Runtime artifacts.<\/em><\/figcaption><\/figure>\n\n\n\n<h2 class=\"wp-block-heading\" id=\"updating-the-onnx-graph\">Updating the ONNX graph<\/h2>\n\n\n\n<p>The DragGAN model that was exported to ONNX contains the latent vector as part of its structure. 
To allow us to use different latent vectors that are generated from different seeds, we update the ONNX model with parameters before running the optimization (see draggan_onnx_demo.py):<\/p>\n\n\n<figure class=\"wp-block-image aligncenter size-full\"><img decoding=\"async\" src=\"https:\/\/opensource.microsoft.com\/blog\/wp-content\/uploads\/2023\/11\/OS-Image-7-2023-11-21-153548.webp\" alt=\"text\" class=\"wp-image-94997 webp-format\" srcset=\"\" data-orig-src=\"https:\/\/opensource.microsoft.com\/blog\/wp-content\/uploads\/2023\/11\/OS-Image-7-2023-11-21-153548.webp\"><figcaption class=\"wp-element-caption\"><em>Figure 7 Illustration of the update_model method, demonstrating the dynamic adjustment of the ONNX graph using distinct latent vectors. Model constraints, defined as parameters, interact with the latent vector within the optimization graph.<\/em><\/figcaption><\/figure>\n\n\n\n<h2 class=\"wp-block-heading\" id=\"interactive-examples\">Interactive examples<\/h2>\n\n\n\n<p>The examples below demonstrate how images can be modified by specifying the source and target points to get the desired results.<\/p>\n\n\n\n<figure class=\"wp-block-image aligncenter size-full is-resized\"><img loading=\"lazy\" decoding=\"async\" width=\"560\" height=\"240\" src=\"https:\/\/opensource.microsoft.com\/blog\/wp-content\/uploads\/2023\/11\/image6.gif\" alt=\"a group of people posing for a photo\" class=\"wp-image-95005\" style=\"width:500px\" \/><\/figure>\n\n\n\n<figure class=\"wp-block-image aligncenter size-full is-resized\"><img loading=\"lazy\" decoding=\"async\" width=\"426\" height=\"240\" src=\"https:\/\/opensource.microsoft.com\/blog\/wp-content\/uploads\/2023\/11\/image7.gif\" alt=\"a cat that is looking at the camera\" class=\"wp-image-95006\" style=\"width:500px\" \/><figcaption class=\"wp-element-caption\"><em>Figure 8 Snapshots from an interactive editing session.<\/em><\/figcaption><\/figure>\n\n\n\n<h2 class=\"wp-block-heading\" id=\"getting-started-with-draggan-using-onnx-runtime\">Getting started with DragGAN using ONNX Runtime<\/h2>\n\n\n\n<p>We have published the <a href=\"https:\/\/github.com\/microsoft\/onnxruntime-training-examples\/tree\/master\/DragGAN\" target=\"_blank\" rel=\"noreferrer noopener\">code repository<\/a> for the DragGAN demonstration so you can experiment with the model on your own data. The sample demonstrates the ease of use of such models and the benefits of running them on multiple platforms using ORT. We have included additional resources below for a deeper look into on-device training with ONNX Runtime. These resources should help with implementing similar GAN-based models for image-related scenarios using the ORT on-device solution.<\/p>\n\n\n\n<hr class=\"wp-block-separator has-alpha-channel-opacity is-style-wide\" \/>\n\n\n\n<p><strong>References:<\/strong><\/p>\n\n\n\n<ol start=\"1\" class=\"wp-block-list\">\n<li><a href=\"https:\/\/github.com\/NVlabs\/stylegan2-ada-pytorch\" target=\"_blank\" rel=\"noreferrer noopener\">Training Generative Adversarial Networks with Limited Data<\/a>, Paper and Source Code, NVIDIA, 2020&nbsp;<\/li>\n<\/ol>\n\n\n\n<ol start=\"2\" class=\"wp-block-list\">\n<li><a href=\"https:\/\/vcai.mpi-inf.mpg.de\/projects\/DragGAN\/\" target=\"_blank\" rel=\"noreferrer noopener\">Drag Your GAN: Interactive Point-based Manipulation on the Generative Image Manifold<\/a>, Pan et al., SIGGRAPH 2023&nbsp;<\/li>\n<\/ol>\n\n\n\n<ol start=\"3\" class=\"wp-block-list\">\n<li><a href=\"https:\/\/cloudblogs.microsoft.com\/opensource\/2023\/05\/31\/on-device-training-efficient-training-on-the-edge-with-onnx-runtime\/\" target=\"_blank\" rel=\"noreferrer noopener\">On-Device Training: Efficient training on the edge with ONNX Runtime &#8211; Microsoft <\/a><a 
href=\"https:\/\/cloudblogs.microsoft.com\/opensource\/2023\/05\/31\/on-device-training-efficient-training-on-the-edge-with-onnx-runtime\/\" target=\"_blank\" rel=\"noreferrer noopener\">Open Source<\/a><a href=\"https:\/\/cloudblogs.microsoft.com\/opensource\/2023\/05\/31\/on-device-training-efficient-training-on-the-edge-with-onnx-runtime\/\" target=\"_blank\" rel=\"noreferrer noopener\"> Blog<\/a>&nbsp;<\/li>\n<\/ol>\n\n\n\n<ol start=\"4\" class=\"wp-block-list\">\n<li><a href=\"https:\/\/cloudblogs.microsoft.com\/opensource\/2023\/07\/05\/on-device-training-with-onnx-runtime-a-deep-dive\/\" target=\"_blank\" rel=\"noreferrer noopener\">ORT On-Device Training Deep Dive blog<\/a>&nbsp;<\/li>\n<\/ol>\n\n\n\n<ol start=\"5\" class=\"wp-block-list\">\n<li><a href=\"https:\/\/onnxruntime.ai\/docs\/get-started\/training-on-device.html\">Getting Started<\/a><\/li>\n\n\n\n<li><a href=\"https:\/\/github.com\/microsoft\/onnxruntime-training-examples\/tree\/master\/on_device_training\">Examples repo<\/a><\/li>\n<\/ol>\n\n\n\n<p><\/p>\n","protected":false},"excerpt":{"rendered":"<p>Get a technical overview of the Microsoft implementation of the DragGAN2 algorithm using ONNX 
Runtime.<\/p>\n","protected":false},"author":6220,"featured_media":95489,"comment_status":"closed","ping_status":"closed","sticky":false,"template":"","format":"standard","meta":{"msxcm_post_with_no_image":false,"ep_exclude_from_search":false,"_classifai_error":"","_classifai_text_to_speech_error":"","footnotes":""},"post_tag":[2271],"content-type":[],"topic":[2238],"programming-languages":[],"coauthors":[2029,2099],"class_list":["post-94987","post","type-post","status-publish","format-standard","has-post-thumbnail","hentry","tag-community-partners","topic-ai-machine-learning","review-flag-1593580419-521","review-flag-1-1593580432-963","review-flag-2-1593580437-411","review-flag-3-1593580442-169","review-flag-5-1593580453-725","review-flag-6-1593580457-852","review-flag-7-1593580463-151","review-flag-8-1593580468-572","review-flag-lever-1593580265-989"],"yoast_head":"<!-- This site is optimized with the Yoast SEO plugin v27.2 - https:\/\/yoast.com\/product\/yoast-seo-wordpress\/ -->\n<title>Exploring DragGAN implementation using ONNX Runtime | Microsoft Open Source Blog<\/title>\n<meta name=\"description\" content=\"Learn more about the ease of implementing GAN based models using the ONNX Runtime platform.\" \/>\n<meta name=\"robots\" content=\"index, follow, max-snippet:-1, max-image-preview:large, max-video-preview:-1\" \/>\n<link rel=\"canonical\" href=\"https:\/\/opensource.microsoft.com\/blog\/2023\/11\/29\/exploring-draggan-implementation-using-onnx-runtime\/\" \/>\n<meta property=\"og:locale\" content=\"en_US\" \/>\n<meta property=\"og:type\" content=\"article\" \/>\n<meta property=\"og:title\" content=\"Exploring DragGAN implementation using ONNX Runtime | Microsoft Open Source Blog\" \/>\n<meta property=\"og:description\" content=\"Learn more about the ease of implementing GAN based models using the ONNX Runtime platform.\" \/>\n<meta property=\"og:url\" 
content=\"https:\/\/opensource.microsoft.com\/blog\/2023\/11\/29\/exploring-draggan-implementation-using-onnx-runtime\/\" \/>\n<meta property=\"og:site_name\" content=\"Microsoft Open Source Blog\" \/>\n<meta property=\"article:published_time\" content=\"2023-11-29T16:00:00+00:00\" \/>\n<meta property=\"article:modified_time\" content=\"2025-05-28T02:49:01+00:00\" \/>\n<meta property=\"og:image\" content=\"https:\/\/opensource.microsoft.com\/blog\/wp-content\/uploads\/2024\/06\/STB13_Michelle_03.png\" \/>\n\t<meta property=\"og:image:width\" content=\"1170\" \/>\n\t<meta property=\"og:image:height\" content=\"640\" \/>\n\t<meta property=\"og:image:type\" content=\"image\/png\" \/>\n<meta name=\"author\" content=\"Baiju Meswani, Ran Gal\" \/>\n<meta name=\"twitter:card\" content=\"summary_large_image\" \/>\n<meta name=\"twitter:creator\" content=\"@OpenAtMicrosoft\" \/>\n<meta name=\"twitter:site\" content=\"@OpenAtMicrosoft\" \/>\n<meta name=\"twitter:label1\" content=\"Written by\" \/>\n\t<meta name=\"twitter:data1\" content=\"Baiju Meswani, Ran Gal\" \/>\n\t<meta name=\"twitter:label2\" content=\"Est. 
reading time\" \/>\n\t<meta name=\"twitter:data2\" content=\"5 min read\" \/>\n<script type=\"application\/ld+json\" class=\"yoast-schema-graph\">{\"@context\":\"https:\/\/schema.org\",\"@graph\":[{\"@type\":\"Article\",\"@id\":\"https:\/\/opensource.microsoft.com\/blog\/2023\/11\/29\/exploring-draggan-implementation-using-onnx-runtime\/#article\",\"isPartOf\":{\"@id\":\"https:\/\/opensource.microsoft.com\/blog\/2023\/11\/29\/exploring-draggan-implementation-using-onnx-runtime\/\"},\"author\":[{\"@id\":\"https:\/\/opensource.microsoft.com\/blog\/author\/baiju-meswani\/\",\"@type\":\"Person\",\"@name\":\"Baiju Meswani\"},{\"@id\":\"https:\/\/opensource.microsoft.com\/blog\/author\/ran-gal\/\",\"@type\":\"Person\",\"@name\":\"Ran Gal\"}],\"headline\":\"Exploring DragGAN implementation using ONNX Runtime\",\"datePublished\":\"2023-11-29T16:00:00+00:00\",\"dateModified\":\"2025-05-28T02:49:01+00:00\",\"mainEntityOfPage\":{\"@id\":\"https:\/\/opensource.microsoft.com\/blog\/2023\/11\/29\/exploring-draggan-implementation-using-onnx-runtime\/\"},\"wordCount\":1297,\"publisher\":{\"@id\":\"https:\/\/opensource.microsoft.com\/blog\/#organization\"},\"image\":{\"@id\":\"https:\/\/opensource.microsoft.com\/blog\/2023\/11\/29\/exploring-draggan-implementation-using-onnx-runtime\/#primaryimage\"},\"thumbnailUrl\":\"https:\/\/opensource.microsoft.com\/blog\/wp-content\/uploads\/2024\/06\/STB13_Michelle_03.webp\",\"keywords\":[\"Community\/partners\"],\"inLanguage\":\"en-US\"},{\"@type\":\"WebPage\",\"@id\":\"https:\/\/opensource.microsoft.com\/blog\/2023\/11\/29\/exploring-draggan-implementation-using-onnx-runtime\/\",\"url\":\"https:\/\/opensource.microsoft.com\/blog\/2023\/11\/29\/exploring-draggan-implementation-using-onnx-runtime\/\",\"name\":\"Exploring DragGAN implementation using ONNX Runtime | Microsoft Open Source 
Blog\",\"isPartOf\":{\"@id\":\"https:\/\/opensource.microsoft.com\/blog\/#website\"},\"primaryImageOfPage\":{\"@id\":\"https:\/\/opensource.microsoft.com\/blog\/2023\/11\/29\/exploring-draggan-implementation-using-onnx-runtime\/#primaryimage\"},\"image\":{\"@id\":\"https:\/\/opensource.microsoft.com\/blog\/2023\/11\/29\/exploring-draggan-implementation-using-onnx-runtime\/#primaryimage\"},\"thumbnailUrl\":\"https:\/\/opensource.microsoft.com\/blog\/wp-content\/uploads\/2024\/06\/STB13_Michelle_03.webp\",\"datePublished\":\"2023-11-29T16:00:00+00:00\",\"dateModified\":\"2025-05-28T02:49:01+00:00\",\"description\":\"Learn more about the ease of implementing GAN based models using the ONNX Runtime platform.\",\"breadcrumb\":{\"@id\":\"https:\/\/opensource.microsoft.com\/blog\/2023\/11\/29\/exploring-draggan-implementation-using-onnx-runtime\/#breadcrumb\"},\"inLanguage\":\"en-US\",\"potentialAction\":[{\"@type\":\"ReadAction\",\"target\":[\"https:\/\/opensource.microsoft.com\/blog\/2023\/11\/29\/exploring-draggan-implementation-using-onnx-runtime\/\"]}]},{\"@type\":\"ImageObject\",\"inLanguage\":\"en-US\",\"@id\":\"https:\/\/opensource.microsoft.com\/blog\/2023\/11\/29\/exploring-draggan-implementation-using-onnx-runtime\/#primaryimage\",\"url\":\"https:\/\/opensource.microsoft.com\/blog\/wp-content\/uploads\/2024\/06\/STB13_Michelle_03.webp\",\"contentUrl\":\"https:\/\/opensource.microsoft.com\/blog\/wp-content\/uploads\/2024\/06\/STB13_Michelle_03.webp\",\"width\":1170,\"height\":640},{\"@type\":\"BreadcrumbList\",\"@id\":\"https:\/\/opensource.microsoft.com\/blog\/2023\/11\/29\/exploring-draggan-implementation-using-onnx-runtime\/#breadcrumb\",\"itemListElement\":[{\"@type\":\"ListItem\",\"position\":1,\"name\":\"Home\",\"item\":\"https:\/\/opensource.microsoft.com\/blog\/\"},{\"@type\":\"ListItem\",\"position\":2,\"name\":\"Exploring DragGAN implementation using ONNX 
Runtime\"}]},{\"@type\":\"WebSite\",\"@id\":\"https:\/\/opensource.microsoft.com\/blog\/#website\",\"url\":\"https:\/\/opensource.microsoft.com\/blog\/\",\"name\":\"Microsoft Open Source Blog\",\"description\":\"Open dialogue about openness at Microsoft \u2013 open source, standards, interoperability\",\"publisher\":{\"@id\":\"https:\/\/opensource.microsoft.com\/blog\/#organization\"},\"potentialAction\":[{\"@type\":\"SearchAction\",\"target\":{\"@type\":\"EntryPoint\",\"urlTemplate\":\"https:\/\/opensource.microsoft.com\/blog\/?s={search_term_string}\"},\"query-input\":{\"@type\":\"PropertyValueSpecification\",\"valueRequired\":true,\"valueName\":\"search_term_string\"}}],\"inLanguage\":\"en-US\"},{\"@type\":\"Organization\",\"@id\":\"https:\/\/opensource.microsoft.com\/blog\/#organization\",\"name\":\"Microsoft Open Source Blog\",\"url\":\"https:\/\/opensource.microsoft.com\/blog\/\",\"logo\":{\"@type\":\"ImageObject\",\"inLanguage\":\"en-US\",\"@id\":\"https:\/\/opensource.microsoft.com\/blog\/#\/schema\/logo\/image\/\",\"url\":\"https:\/\/opensource.microsoft.com\/blog\/wp-content\/uploads\/2019\/08\/Microsoft-Logo.png\",\"contentUrl\":\"https:\/\/opensource.microsoft.com\/blog\/wp-content\/uploads\/2019\/08\/Microsoft-Logo.png\",\"width\":259,\"height\":194,\"caption\":\"Microsoft Open Source Blog\"},\"image\":{\"@id\":\"https:\/\/opensource.microsoft.com\/blog\/#\/schema\/logo\/image\/\"},\"sameAs\":[\"https:\/\/x.com\/OpenAtMicrosoft\"]}]}<\/script>\n<!-- \/ Yoast SEO plugin. 
-->","yoast_head_json":{"title":"Exploring DragGAN implementation using ONNX Runtime | Microsoft Open Source Blog","description":"Learn more about the ease of implementing GAN based models using the ONNX Runtime platform.","robots":{"index":"index","follow":"follow","max-snippet":"max-snippet:-1","max-image-preview":"max-image-preview:large","max-video-preview":"max-video-preview:-1"},"canonical":"https:\/\/opensource.microsoft.com\/blog\/2023\/11\/29\/exploring-draggan-implementation-using-onnx-runtime\/","og_locale":"en_US","og_type":"article","og_title":"Exploring DragGAN implementation using ONNX Runtime | Microsoft Open Source Blog","og_description":"Learn more about the ease of implementing GAN based models using the ONNX Runtime platform.","og_url":"https:\/\/opensource.microsoft.com\/blog\/2023\/11\/29\/exploring-draggan-implementation-using-onnx-runtime\/","og_site_name":"Microsoft Open Source Blog","article_published_time":"2023-11-29T16:00:00+00:00","article_modified_time":"2025-05-28T02:49:01+00:00","og_image":[{"width":1170,"height":640,"url":"https:\/\/opensource.microsoft.com\/blog\/wp-content\/uploads\/2024\/06\/STB13_Michelle_03.png","type":"image\/png"}],"author":"Baiju Meswani, Ran Gal","twitter_card":"summary_large_image","twitter_creator":"@OpenAtMicrosoft","twitter_site":"@OpenAtMicrosoft","twitter_misc":{"Written by":"Baiju Meswani, Ran Gal","Est. 
reading time":"5 min read"},"schema":{"@context":"https:\/\/schema.org","@graph":[{"@type":"Article","@id":"https:\/\/opensource.microsoft.com\/blog\/2023\/11\/29\/exploring-draggan-implementation-using-onnx-runtime\/#article","isPartOf":{"@id":"https:\/\/opensource.microsoft.com\/blog\/2023\/11\/29\/exploring-draggan-implementation-using-onnx-runtime\/"},"author":[{"@id":"https:\/\/opensource.microsoft.com\/blog\/author\/baiju-meswani\/","@type":"Person","@name":"Baiju Meswani"},{"@id":"https:\/\/opensource.microsoft.com\/blog\/author\/ran-gal\/","@type":"Person","@name":"Ran Gal"}],"headline":"Exploring DragGAN implementation using ONNX Runtime","datePublished":"2023-11-29T16:00:00+00:00","dateModified":"2025-05-28T02:49:01+00:00","mainEntityOfPage":{"@id":"https:\/\/opensource.microsoft.com\/blog\/2023\/11\/29\/exploring-draggan-implementation-using-onnx-runtime\/"},"wordCount":1297,"publisher":{"@id":"https:\/\/opensource.microsoft.com\/blog\/#organization"},"image":{"@id":"https:\/\/opensource.microsoft.com\/blog\/2023\/11\/29\/exploring-draggan-implementation-using-onnx-runtime\/#primaryimage"},"thumbnailUrl":"https:\/\/opensource.microsoft.com\/blog\/wp-content\/uploads\/2024\/06\/STB13_Michelle_03.webp","keywords":["Community\/partners"],"inLanguage":"en-US"},{"@type":"WebPage","@id":"https:\/\/opensource.microsoft.com\/blog\/2023\/11\/29\/exploring-draggan-implementation-using-onnx-runtime\/","url":"https:\/\/opensource.microsoft.com\/blog\/2023\/11\/29\/exploring-draggan-implementation-using-onnx-runtime\/","name":"Exploring DragGAN implementation using ONNX Runtime | Microsoft Open Source 
Blog","isPartOf":{"@id":"https:\/\/opensource.microsoft.com\/blog\/#website"},"primaryImageOfPage":{"@id":"https:\/\/opensource.microsoft.com\/blog\/2023\/11\/29\/exploring-draggan-implementation-using-onnx-runtime\/#primaryimage"},"image":{"@id":"https:\/\/opensource.microsoft.com\/blog\/2023\/11\/29\/exploring-draggan-implementation-using-onnx-runtime\/#primaryimage"},"thumbnailUrl":"https:\/\/opensource.microsoft.com\/blog\/wp-content\/uploads\/2024\/06\/STB13_Michelle_03.webp","datePublished":"2023-11-29T16:00:00+00:00","dateModified":"2025-05-28T02:49:01+00:00","description":"Learn more about the ease of implementing GAN based models using the ONNX Runtime platform.","breadcrumb":{"@id":"https:\/\/opensource.microsoft.com\/blog\/2023\/11\/29\/exploring-draggan-implementation-using-onnx-runtime\/#breadcrumb"},"inLanguage":"en-US","potentialAction":[{"@type":"ReadAction","target":["https:\/\/opensource.microsoft.com\/blog\/2023\/11\/29\/exploring-draggan-implementation-using-onnx-runtime\/"]}]},{"@type":"ImageObject","inLanguage":"en-US","@id":"https:\/\/opensource.microsoft.com\/blog\/2023\/11\/29\/exploring-draggan-implementation-using-onnx-runtime\/#primaryimage","url":"https:\/\/opensource.microsoft.com\/blog\/wp-content\/uploads\/2024\/06\/STB13_Michelle_03.webp","contentUrl":"https:\/\/opensource.microsoft.com\/blog\/wp-content\/uploads\/2024\/06\/STB13_Michelle_03.webp","width":1170,"height":640},{"@type":"BreadcrumbList","@id":"https:\/\/opensource.microsoft.com\/blog\/2023\/11\/29\/exploring-draggan-implementation-using-onnx-runtime\/#breadcrumb","itemListElement":[{"@type":"ListItem","position":1,"name":"Home","item":"https:\/\/opensource.microsoft.com\/blog\/"},{"@type":"ListItem","position":2,"name":"Exploring DragGAN implementation using ONNX Runtime"}]},{"@type":"WebSite","@id":"https:\/\/opensource.microsoft.com\/blog\/#website","url":"https:\/\/opensource.microsoft.com\/blog\/","name":"Microsoft Open Source Blog","description":"Open dialogue 
about openness at Microsoft \u2013 open source, standards, interoperability","publisher":{"@id":"https:\/\/opensource.microsoft.com\/blog\/#organization"},"potentialAction":[{"@type":"SearchAction","target":{"@type":"EntryPoint","urlTemplate":"https:\/\/opensource.microsoft.com\/blog\/?s={search_term_string}"},"query-input":{"@type":"PropertyValueSpecification","valueRequired":true,"valueName":"search_term_string"}}],"inLanguage":"en-US"},{"@type":"Organization","@id":"https:\/\/opensource.microsoft.com\/blog\/#organization","name":"Microsoft Open Source Blog","url":"https:\/\/opensource.microsoft.com\/blog\/","logo":{"@type":"ImageObject","inLanguage":"en-US","@id":"https:\/\/opensource.microsoft.com\/blog\/#\/schema\/logo\/image\/","url":"https:\/\/opensource.microsoft.com\/blog\/wp-content\/uploads\/2019\/08\/Microsoft-Logo.png","contentUrl":"https:\/\/opensource.microsoft.com\/blog\/wp-content\/uploads\/2019\/08\/Microsoft-Logo.png","width":259,"height":194,"caption":"Microsoft Open Source Blog"},"image":{"@id":"https:\/\/opensource.microsoft.com\/blog\/#\/schema\/logo\/image\/"},"sameAs":["https:\/\/x.com\/OpenAtMicrosoft"]}]}},"msxcm_display_generated_audio":false,"msxcm_animated_featured_image":null,"distributor_meta":false,"distributor_terms":false,"distributor_media":false,"distributor_original_site_name":"Microsoft Open Source 
Blog","distributor_original_site_url":"https:\/\/opensource.microsoft.com\/blog","push-errors":false,"_links":{"self":[{"href":"https:\/\/opensource.microsoft.com\/blog\/wp-json\/wp\/v2\/posts\/94987","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/opensource.microsoft.com\/blog\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/opensource.microsoft.com\/blog\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/opensource.microsoft.com\/blog\/wp-json\/wp\/v2\/users\/6220"}],"replies":[{"embeddable":true,"href":"https:\/\/opensource.microsoft.com\/blog\/wp-json\/wp\/v2\/comments?post=94987"}],"version-history":[{"count":24,"href":"https:\/\/opensource.microsoft.com\/blog\/wp-json\/wp\/v2\/posts\/94987\/revisions"}],"predecessor-version":[{"id":97475,"href":"https:\/\/opensource.microsoft.com\/blog\/wp-json\/wp\/v2\/posts\/94987\/revisions\/97475"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/opensource.microsoft.com\/blog\/wp-json\/wp\/v2\/media\/95489"}],"wp:attachment":[{"href":"https:\/\/opensource.microsoft.com\/blog\/wp-json\/wp\/v2\/media?parent=94987"}],"wp:term":[{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/opensource.microsoft.com\/blog\/wp-json\/wp\/v2\/post_tag?post=94987"},{"taxonomy":"content-type","embeddable":true,"href":"https:\/\/opensource.microsoft.com\/blog\/wp-json\/wp\/v2\/content-type?post=94987"},{"taxonomy":"topic","embeddable":true,"href":"https:\/\/opensource.microsoft.com\/blog\/wp-json\/wp\/v2\/topic?post=94987"},{"taxonomy":"programming-languages","embeddable":true,"href":"https:\/\/opensource.microsoft.com\/blog\/wp-json\/wp\/v2\/programming-languages?post=94987"},{"taxonomy":"author","embeddable":true,"href":"https:\/\/opensource.microsoft.com\/blog\/wp-json\/wp\/v2\/coauthors?post=94987"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}