{"id":84863,"date":"2021-02-18T09:00:03","date_gmt":"2021-02-18T17:00:03","guid":{"rendered":"https:\/\/cloudblogs.microsoft.com\/opensource\/?p=84863"},"modified":"2025-06-23T10:51:30","modified_gmt":"2025-06-23T17:51:30","slug":"create-privacy-preserving-synthetic-data-for-machine-learning-with-smartnoise","status":"publish","type":"post","link":"https:\/\/opensource.microsoft.com\/blog\/2021\/02\/18\/create-privacy-preserving-synthetic-data-for-machine-learning-with-smartnoise\/","title":{"rendered":"Create privacy-preserving synthetic data for machine learning with SmartNoise"},"content":{"rendered":"\n<p><a href=\"https:\/\/aiplus.odsc.com\/courses\/privacy-preserving-analytics-and-machine-learning-with-differential-privacy\" target=\"_blank\" rel=\"noopener\"><span data-contrast=\"none\">Watch<\/span><span data-contrast=\"none\">&nbsp;our webinar<\/span><span data-contrast=\"none\">&nbsp;on&nbsp;<\/span><span data-contrast=\"none\">Open Data Science Conference<\/span><\/a><span data-ccp-props=\"{&quot;201341983&quot;:0,&quot;335559739&quot;:160,&quot;335559740&quot;:259}\">&nbsp;<\/span><\/p>\n\n\n\n<p><a href=\"https:\/\/azure.microsoft.com\/en-us\/resources\/microsoft-smartnoisedifferential-privacy-machine-learning-case-studies\" target=\"_blank\" rel=\"noopener\"><span data-contrast=\"none\">Read&nbsp;<\/span><span data-contrast=\"none\">the white paper on SmartNoise Differential Privacy<\/span><span data-contrast=\"none\">&nbsp;machine learning case studies<\/span><\/a><\/p>\n\n\n\n<p>The COVID-19 pandemic demonstrates the tremendous importance of sufficient and relevant data for research, causal analysis, government action, and medical progress. However, for understandable data protection considerations, individuals and decision-makers are often very reluctant to share personal or sensitive data. 
To ensure sustainable progress, we need new practices that enable insights from personal data while reliably protecting individuals&#8217; privacy.<\/p>\n\n\n\n<p>Pioneered by Microsoft Research and their collaborators, differential privacy is the gold standard for protecting data in applications that prepare and publish statistical analyses. Differential privacy provides a mathematically measurable privacy guarantee to individuals by adding a carefully tuned amount of statistical noise to sensitive data or computations. It offers significantly higher privacy protection levels than commonly used disclosure limitation practices like data anonymization. The latter increasingly shows vulnerability to re-identification attacks\u2014especially as more data about individuals become publicly available.<\/p>\n\n\n\n<p>SmartNoise is jointly developed by Microsoft and Harvard&#8217;s Institute for Quantitative Social Science (IQSS) and the School of Engineering and Applied Sciences (SEAS) as part of the <a href=\"https:\/\/projects.iq.harvard.edu\/opendp\" target=\"_blank\" rel=\"noopener\">Open Differential Privacy (OpenDP) initiative<\/a>. The <a href=\"https:\/\/cloudblogs.microsoft.com\/opensource\/2020\/05\/19\/new-differential-privacy-platform-microsoft-harvard-opendp\/\" target=\"_blank\" rel=\"noopener\">platform&#8217;s initial version was released in May 2020<\/a> and comprises mechanisms for providing differentially private results to users of analytical queries to protect the underlying dataset. The SmartNoise system includes differentially private algorithms, techniques for managing privacy budgets for subsequent queries, and other capabilities.<\/p>\n\n\n\n<figure class=\"wp-block-image aligncenter\"><img decoding=\"async\" src=\"https:\/\/opensource.microsoft.com\/blog\/wp-content\/uploads\/2021\/02\/DP-Diagram-1024x418.webp\" alt=\"Workflow of a user submitting a query to a database that is protected by the SmartNoise system. 
After the query is processed by the privacy module and the budget store, the user receives differentially private results (e.g. counts, averages).\" \/><\/figure>\n\n\n\n<p><\/p>\n\n\n\n<p>Check out the <a href=\"https:\/\/smartnoise.org\/\" target=\"_blank\" rel=\"noopener\">SmartNoise website<\/a> to learn more. The code for the updated open source differential privacy platform is <a href=\"https:\/\/github.com\/opendifferentialprivacy\" target=\"_blank\" rel=\"noopener\">available on GitHub<\/a>.<\/p>\n\n\n\n<h2 class=\"wp-block-heading\" id=\"privacy-preserving-synthetic-data\">Privacy-preserving synthetic data<\/h2>\n\n\n\n<p>With the new release of SmartNoise, we are adding several synthesizers that allow creating differentially private datasets derived from unprotected data.<\/p>\n\n\n\n<p>A differentially private synthetic dataset is generated from a statistical model based on the original dataset. The synthetic dataset represents a &#8220;fake&#8221; sample derived from the original data while retaining as many statistical characteristics as possible. The essential advantage of the synthesizer approach is that the differentially private dataset can be analyzed any number of times without increasing the privacy risk. Therefore, it enables collaboration between multiple parties, the democratization of knowledge, and open dataset initiatives.<\/p>\n\n\n\n<p>While the synthetic dataset embodies the original data&#8217;s essential properties, it is mathematically impossible to preserve the full data value and guarantee record-level privacy at the same time. Usually, we can&#8217;t perform arbitrary statistical analysis and machine learning tasks on the synthesized dataset to the same extent as is possible with the original data. 
Therefore, the type of downstream job should be considered before the data is synthesized.<\/p>\n\n\n\n<p>For instance, the workflow for generating a synthetic dataset for supervised machine learning with SmartNoise looks as follows:<\/p>\n\n\n\n<figure class=\"wp-block-image aligncenter\"><img decoding=\"async\" src=\"https:\/\/opensource.microsoft.com\/blog\/wp-content\/uploads\/2021\/02\/Picture14.png\" alt=\"High-level workflow of how a dataset is synthesized for a machine learning task with SmartNoise: The original tabular dataset consists of features and labels. The QUAIL method combines a synthesizer and a differentially private classifier to generate a new differentially private dataset that contains the statistical properties of the original data.\" \/><\/figure>\n\n\n\n<p><\/p>\n\n\n\n<p>Various techniques exist to generate differentially private synthetic data, including approaches based on deep neural networks, auto-encoders, and generative adversarial models.<\/p>\n\n\n\n<p>The new release of SmartNoise includes the following data synthesizers:<\/p>\n\n\n\n<figure class=\"wp-block-table\"><table><tbody><tr><th><span style=\"color: #ffffff\">Supported Synthesizer&nbsp;<\/span><\/th><th><span style=\"color: #ffffff\">Overview&nbsp;<\/span><\/th><\/tr><tr><td><span data-contrast=\"auto\">Multiplicative&nbsp;Weights Exponential Mechanism (MWEM)<\/span><span data-ccp-props=\"{&quot;201341983&quot;:0,&quot;335559740&quot;:259}\">&nbsp;<\/span><\/td><td>\n<ul>\n<li data-leveltext=\"\uf0b7\" data-font=\"Symbol\" data-listid=\"2\" data-aria-posinset=\"1\" data-aria-level=\"1\"><span data-contrast=\"auto\">Achieves Differential Privacy by combining Multiplicative Weights and Exponential Mechanism techniques<\/span><span data-ccp-props=\"{&quot;201341983&quot;:0,&quot;335551550&quot;:1,&quot;335551620&quot;:1,&quot;335559739&quot;:120,&quot;335559740&quot;:254}\">&nbsp;<\/span><\/li>\n<li data-leveltext=\"\uf0b7\" data-font=\"Symbol\" data-listid=\"2\" 
data-aria-posinset=\"2\" data-aria-level=\"1\"><span data-contrast=\"auto\">A relatively simple but effective approach<\/span><span data-ccp-props=\"{&quot;201341983&quot;:0,&quot;335551550&quot;:1,&quot;335551620&quot;:1,&quot;335559739&quot;:120,&quot;335559740&quot;:254}\">&nbsp;<\/span><\/li>\n<li data-leveltext=\"\uf0b7\" data-font=\"Symbol\" data-listid=\"2\" data-aria-posinset=\"3\" data-aria-level=\"1\"><span data-contrast=\"auto\">Requires fewer computational resources, shorter runtime<\/span><span data-ccp-props=\"{&quot;201341983&quot;:0,&quot;335551550&quot;:1,&quot;335551620&quot;:1,&quot;335559739&quot;:120,&quot;335559740&quot;:254}\">&nbsp;<\/span><\/li>\n<\/ul>\n<\/td><\/tr><tr><td><span data-contrast=\"auto\">Differentially Private Generative Adversarial Network (DPGAN)<\/span><span data-ccp-props=\"{&quot;201341983&quot;:0,&quot;335559740&quot;:259}\">&nbsp;<\/span><\/td><td>\n<ul>\n<li data-leveltext=\"\uf0b7\" data-font=\"Symbol\" data-listid=\"2\" data-aria-posinset=\"1\" data-aria-level=\"1\"><span data-contrast=\"auto\">Adds noise to the discriminator of the GAN to enforce Differential Privacy<\/span><span data-ccp-props=\"{&quot;201341983&quot;:0,&quot;335551550&quot;:1,&quot;335551620&quot;:1,&quot;335559739&quot;:120,&quot;335559740&quot;:254}\">&nbsp;<\/span><\/li>\n<li data-leveltext=\"\uf0b7\" data-font=\"Symbol\" data-listid=\"2\" data-aria-posinset=\"2\" data-aria-level=\"1\"><span data-contrast=\"auto\">Has been used with image data and electronic health records (EHR)<\/span><span data-ccp-props=\"{&quot;201341983&quot;:0,&quot;335551550&quot;:1,&quot;335551620&quot;:1,&quot;335559739&quot;:120,&quot;335559740&quot;:254}\">&nbsp;<\/span><\/li>\n<\/ul>\n<\/td><\/tr><tr><td><span data-contrast=\"auto\">Private Aggregation of Teacher Ensembles Generative Adversarial Network (PATEGAN)<\/span><span data-ccp-props=\"{&quot;201341983&quot;:0,&quot;335559740&quot;:259}\">&nbsp;<\/span><\/td><td>\n<ul>\n<li data-leveltext=\"\uf0b7\" 
data-font=\"Symbol\" data-listid=\"2\" data-aria-posinset=\"1\" data-aria-level=\"1\"><span data-contrast=\"auto\">A modification of the PATE framework that is applied to GANs to preserve Differential Privacy of synthetic data<\/span><span data-ccp-props=\"{&quot;201341983&quot;:0,&quot;335551550&quot;:1,&quot;335551620&quot;:1,&quot;335559739&quot;:120,&quot;335559740&quot;:254}\">&nbsp;<\/span><\/li>\n<li data-leveltext=\"\uf0b7\" data-font=\"Symbol\" data-listid=\"2\" data-aria-posinset=\"2\" data-aria-level=\"1\"><span data-contrast=\"auto\">Improvement of DPGAN, especially for classification tasks<\/span><span data-ccp-props=\"{&quot;201341983&quot;:0,&quot;335551550&quot;:1,&quot;335551620&quot;:1,&quot;335559739&quot;:120,&quot;335559740&quot;:254}\">&nbsp;<\/span><\/li>\n<\/ul>\n<\/td><\/tr><tr><td><span data-contrast=\"auto\">DP-CTGAN<\/span><span data-ccp-props=\"{&quot;201341983&quot;:0,&quot;335559740&quot;:259}\">&nbsp;<\/span><\/td><td>\n<ul>\n<li data-leveltext=\"\uf0b7\" data-font=\"Symbol\" data-listid=\"2\" data-aria-posinset=\"1\" data-aria-level=\"1\"><span data-contrast=\"auto\">Takes the state-of-the-art CTGAN for synthesizing tabular data and applies DPSGD (the same method for ensuring Differential Privacy that DPGAN uses)<\/span><span data-ccp-props=\"{&quot;201341983&quot;:0,&quot;335551550&quot;:1,&quot;335551620&quot;:1,&quot;335559739&quot;:120,&quot;335559740&quot;:254}\">&nbsp;<\/span><\/li>\n<li data-leveltext=\"\uf0b7\" data-font=\"Symbol\" data-listid=\"2\" data-aria-posinset=\"2\" data-aria-level=\"1\"><span data-contrast=\"auto\">S<\/span><span data-contrast=\"auto\">uited for tabular data, avoids issues with mode collapse<\/span><span data-ccp-props=\"{&quot;201341983&quot;:0,&quot;335551550&quot;:1,&quot;335551620&quot;:1,&quot;335559739&quot;:120,&quot;335559740&quot;:254}\">&nbsp;<\/span><\/li>\n<li data-leveltext=\"\uf0b7\" data-font=\"Symbol\" data-listid=\"2\" data-aria-posinset=\"3\" data-aria-level=\"1\"><span 
data-contrast=\"auto\">Can lead to e<\/span><span data-contrast=\"auto\">xtensive training times<\/span><span data-ccp-props=\"{&quot;201341983&quot;:0,&quot;335551550&quot;:1,&quot;335551620&quot;:1,&quot;335559739&quot;:120,&quot;335559740&quot;:254}\">&nbsp;<\/span><\/li>\n<\/ul>\n<\/td><\/tr><tr><td><span data-contrast=\"auto\">PATE-CTGAN<\/span><span data-ccp-props=\"{&quot;201341983&quot;:0,&quot;335559740&quot;:259}\">&nbsp;<\/span><\/td><td>\n<ul>\n<li data-leveltext=\"\uf0b7\" data-font=\"Symbol\" data-listid=\"2\" data-aria-posinset=\"1\" data-aria-level=\"1\"><span data-contrast=\"auto\">Takes the state-of-the-art CTGAN for synthesizing tabular data and applies PATE (the same method for ensuring Differential Privacy that PATEGAN uses)<\/span><span data-ccp-props=\"{&quot;201341983&quot;:0,&quot;335551550&quot;:1,&quot;335551620&quot;:1,&quot;335559739&quot;:120,&quot;335559740&quot;:254}\">&nbsp;<\/span><\/li>\n<li data-leveltext=\"\uf0b7\" data-font=\"Symbol\" data-listid=\"2\" data-aria-posinset=\"2\" data-aria-level=\"1\"><span data-contrast=\"auto\">S<\/span><span data-contrast=\"auto\">uited for tabular data, avoids issues with mode collapse<\/span><span data-ccp-props=\"{&quot;201341983&quot;:0,&quot;335551550&quot;:1,&quot;335551620&quot;:1,&quot;335559739&quot;:120,&quot;335559740&quot;:254}\">&nbsp;<\/span><\/li>\n<\/ul>\n<\/td><\/tr><tr><td><span data-contrast=\"auto\">Qualified<\/span><span data-contrast=\"auto\">&nbsp;Architecture to Improve Learning (QUAIL)<\/span><span data-ccp-props=\"{&quot;201341983&quot;:0,&quot;335559740&quot;:259}\">&nbsp;<\/span><\/td><td>\n<ul>\n<li data-leveltext=\"\uf0b7\" data-font=\"Symbol\" data-listid=\"2\" data-aria-posinset=\"1\" data-aria-level=\"1\"><span data-contrast=\"auto\">Ensemble method to improve the utility of synthetic differentially private datasets for machine learning tasks<\/span><span 
data-ccp-props=\"{&quot;201341983&quot;:0,&quot;335551550&quot;:1,&quot;335551620&quot;:1,&quot;335559739&quot;:120,&quot;335559740&quot;:254}\">&nbsp;<\/span><\/li>\n<li data-leveltext=\"\uf0b7\" data-font=\"Symbol\" data-listid=\"2\" data-aria-posinset=\"2\" data-aria-level=\"1\"><span data-contrast=\"auto\">Combines a differentially private synthesizer and an embedded differentially private supervised learning model to produce a flexible synthetic data set with high machine learning utility<\/span><span data-ccp-props=\"{&quot;201341983&quot;:0,&quot;335551550&quot;:1,&quot;335551620&quot;:1,&quot;335559739&quot;:120,&quot;335559740&quot;:254}\">&nbsp;<\/span><\/li>\n<\/ul>\n<\/td><\/tr><\/tbody><\/table><\/figure>\n\n\n\n<p>Check out our <a href=\"https:\/\/arxiv.org\/abs\/2011.05537\" target=\"_blank\" rel=\"noopener\">research paper<\/a> to learn more about synthesizers and their performance in machine learning scenarios.<\/p>\n\n\n\n<h2 class=\"wp-block-heading\" id=\"learn-more-about-differential-privacy\">Learn more about differential privacy<\/h2>\n\n\n\n<p>Data protection in companies, government authorities, research institutions, and other organizations is a joint effort that involves various roles, including analysts, data scientists, data privacy officers, decision-makers, regulators, and lawyers.<\/p>\n\n\n\n<p>To make the highly effective but not always intuitive concept of differential privacy accessible to a broad audience, we have released a <a href=\"https:\/\/azure.microsoft.com\/en-us\/resources\/microsoft-smartnoisedifferential-privacy-machine-learning-case-studies\" target=\"_blank\" rel=\"noopener\">comprehensive whitepaper<\/a> about the technique and its practical applications. In the paper, you can learn about the underestimated risks of common data anonymization practices, the idea behind differential privacy, and how to use SmartNoise in practice. 
Furthermore, we assess different levels of privacy protection and their impact on statistical results.<\/p>\n\n\n\n<p>The following example compares the distribution of 50,000 income data points to differentially private histograms of the same data, each generated at different privacy budgets (controlled by the parameter epsilon).<\/p>\n\n\n\n<figure class=\"wp-block-image\"><img decoding=\"async\" src=\"https:\/\/opensource.microsoft.com\/blog\/wp-content\/uploads\/2021\/02\/Picture15.png\" alt=\"Comparison of histograms for California income distribution. Original (unprotected) histogram plus three differentially private versions, each with a different privacy parameter. Overall, the histograms are very consistent. Minor deviations are visible when the level of protection is the highest (high amount of random noise).\" \/><\/figure>\n\n\n\n<p><\/p>\n\n\n\n<p>Lower epsilon values lead to a higher degree of protection and are therefore also associated with more intense statistical noise. Even in the aggressive privacy setting with an epsilon value of 0.05, the differentially private distribution reflects the original histogram quite well. It turns out that the error can be reduced further by increasing the amount of data.<\/p>\n\n\n\n<p>Accompanying the whitepaper, several <a href=\"https:\/\/github.com\/opendifferentialprivacy\/smartnoise-samples\/tree\/master\/whitepaper-demos\" target=\"_blank\" rel=\"noopener\">Jupyter notebooks<\/a> are available for you to experience SmartNoise and other differential privacy technologies in practice and adapt them to your use cases. 
The demo scenarios range from protecting personal data against privacy attacks and providing basic statistics, to advanced machine learning and deep learning applications.<\/p>\n\n\n<div class=\"wp-block-media-text is-stacked-on-mobile\" style=\"grid-template-columns:25% auto\"><figure class=\"wp-block-media-text__media\"><img decoding=\"async\" src=\"https:\/\/opensource.microsoft.com\/blog\/wp-content\/uploads\/2021\/02\/OSS.webp\" alt=\"Protecting Statistics Against Reconstruction Attacks.\" class=\"wp-image-85214 size-full webp-format\" srcset=\"\" data-orig-src=\"https:\/\/opensource.microsoft.com\/blog\/wp-content\/uploads\/2021\/02\/OSS.jpg\"><\/figure><div class=\"wp-block-media-text__content\">\n<p><\/p><p style=\"text-align: left\"><b><span data-contrast=\"auto\">Protecting Statistics Against Reconstruction Attacks<\/span><\/b><span data-ccp-props='{\"201341983\":0,\"335559740\":259}'>&nbsp;<\/span><\/p>\n\n\n\n<p><\/p><p><span data-contrast=\"auto\">Learn how attackers might reconstruct sensitive income information based on released summary statistics. 
SmartNoise can help you protect statistics against such reconstruction attacks.<\/span><\/p>\n<\/div><\/div>\n\n\n<div class=\"wp-block-media-text is-stacked-on-mobile\" style=\"grid-template-columns:25% auto\"><figure class=\"wp-block-media-text__media\"><img decoding=\"async\" src=\"https:\/\/opensource.microsoft.com\/blog\/wp-content\/uploads\/2021\/02\/OSS-2.webp\" alt=\"Protecting Sensitive Data Against Re-Identification Attacks.\" class=\"wp-image-85217 size-full webp-format\" srcset=\"\" data-orig-src=\"https:\/\/opensource.microsoft.com\/blog\/wp-content\/uploads\/2021\/02\/OSS-2.jpg\"><\/figure><div class=\"wp-block-media-text__content\">\n<p><\/p><p style=\"text-align: left\"><span data-contrast=\"auto\"><b><span data-contrast=\"auto\">Protecting Sensitive Data Against Re-Identification Attacks<\/span><\/b><\/span><\/p>\n\n\n\n<p><\/p><p><span data-contrast=\"auto\"><p><span data-contrast=\"auto\">Learn how attackers might combine an anonymized medical dataset with other available data to identify patients. See how to use SmartNoise to protect personal data against re-identification attacks.&nbsp;<\/span><\/p><\/span><\/p>\n<\/div><\/div>\n\n\n<div class=\"wp-block-media-text is-stacked-on-mobile\" style=\"grid-template-columns:25% auto\"><figure class=\"wp-block-media-text__media\"><img decoding=\"async\" src=\"https:\/\/opensource.microsoft.com\/blog\/wp-content\/uploads\/2021\/02\/OSS-3.webp\" alt=\"Privacy-Preserving Statistical Analysis.&nbsp;\" class=\"wp-image-85220 size-full webp-format\" srcset=\"\" data-orig-src=\"https:\/\/opensource.microsoft.com\/blog\/wp-content\/uploads\/2021\/02\/OSS-3.jpg\"><\/figure><div class=\"wp-block-media-text__content\">\n<p><\/p><p style=\"text-align: left\"><span data-contrast=\"auto\"><b><span data-contrast=\"auto\"><b><span data-contrast=\"auto\">Privacy-Preserving Statistical Analysis<\/span><\/b><\/span><\/b><\/span><\/p>\n\n\n\n<p><\/p><p><span data-contrast=\"auto\"><p><span data-contrast=\"auto\"><p><span data-contrast=\"auto\">Learn how to use 
SmartNoise to publish statistical reports protected by differential privacy.&nbsp;<\/span><\/p><\/span><\/p><\/span><\/p>\n\n\n\n<p> <span style=\"font-size: revert\">Understand how different levels of privacy guarantees and data set sizes impact statistical accuracy.&nbsp;<\/span><\/p>\n<\/div><\/div>\n\n\n<div class=\"wp-block-media-text is-stacked-on-mobile\" style=\"grid-template-columns:25% auto\"><figure class=\"wp-block-media-text__media\"><img decoding=\"async\" src=\"https:\/\/opensource.microsoft.com\/blog\/wp-content\/uploads\/2021\/02\/OSS-4.webp\" alt=\"Machine Learning Using a Differentially Private Classifier.&nbsp;\" class=\"wp-image-85223 size-full webp-format\" srcset=\"\" data-orig-src=\"https:\/\/opensource.microsoft.com\/blog\/wp-content\/uploads\/2021\/02\/OSS-4.jpg\"><\/figure><div class=\"wp-block-media-text__content\">\n<p><\/p><p style=\"text-align: left\"><span data-contrast=\"auto\"><b><span data-contrast=\"auto\"><b><span data-contrast=\"auto\"><b><span data-contrast=\"auto\">Machine Learning Using a Differentially Private Classifier<\/span><\/b><\/span><\/b><\/span><\/b><\/span><\/p>\n\n\n\n<p><\/p><p><span data-contrast=\"auto\"><p><span data-contrast=\"auto\"><p><span data-contrast=\"auto\"><p><span data-contrast=\"auto\">Check out different options to perform differentially private machine learning for a classification task. 
Experience how different levels of privacy guarantees and data set sizes affect model quality.&nbsp;<\/span><\/p><\/span><\/p><\/span><\/p><\/span><\/p>\n<\/div><\/div>\n\n\n<div class=\"wp-block-media-text is-stacked-on-mobile\" style=\"grid-template-columns:25% auto\"><figure class=\"wp-block-media-text__media\"><img decoding=\"async\" src=\"https:\/\/opensource.microsoft.com\/blog\/wp-content\/uploads\/2021\/02\/OSS-5.webp\" alt=\"Generating a Synthetic Dataset for Privacy-Preserving Machine Learning.&nbsp;\" class=\"wp-image-85226 size-full webp-format\" srcset=\"\" data-orig-src=\"https:\/\/opensource.microsoft.com\/blog\/wp-content\/uploads\/2021\/02\/OSS-5.jpg\"><\/figure><div class=\"wp-block-media-text__content\">\n<p><\/p><p style=\"text-align: left\"><span data-contrast=\"auto\"><b><span data-contrast=\"auto\"><b><span data-contrast=\"auto\"><b><span data-contrast=\"auto\"><b><span data-contrast=\"auto\">Generating a Synthetic Dataset for Privacy-Preserving Machine Learning<\/span><\/b><\/span><\/b><\/span><\/b><\/span><\/b><\/span><\/p>\n\n\n\n<p><\/p><p><span data-contrast=\"auto\"><p><span data-contrast=\"auto\"><p><span data-contrast=\"auto\"><p><span data-contrast=\"auto\"><p><span data-contrast=\"auto\">See how SmartNoise can be used to generate a differentially private dataset that can be disclosed without privacy concerns. 
Check out how the synthetic dataset can be used for machine learning.&nbsp;<\/span><\/p><\/span><\/p><\/span><\/p><\/span><\/p><\/span><\/p>\n<\/div><\/div>\n\n\n<div class=\"wp-block-media-text is-stacked-on-mobile\" style=\"grid-template-columns:25% auto\"><figure class=\"wp-block-media-text__media\"><img decoding=\"async\" src=\"https:\/\/opensource.microsoft.com\/blog\/wp-content\/uploads\/2021\/02\/OSS-6.webp\" alt=\"Detect Pneumonia in X-Ray Images while Protecting Patients' Privacy.&nbsp;\" class=\"wp-image-85229 size-full webp-format\" srcset=\"\" data-orig-src=\"https:\/\/opensource.microsoft.com\/blog\/wp-content\/uploads\/2021\/02\/OSS-6.jpg\"><\/figure><div class=\"wp-block-media-text__content\">\n<p><\/p><p style=\"text-align: left\"><span data-contrast=\"auto\"><b><span data-contrast=\"auto\"><b><span data-contrast=\"auto\"><b><span data-contrast=\"auto\"><b><span data-contrast=\"auto\"><b><span data-contrast=\"auto\">Detect Pneumonia in X-Ray Images while Protecting Patients&#8217; Privacy<\/span><\/b><span data-ccp-props='{\"201341983\":0,\"335559740\":259}'>&nbsp;<\/span><\/span><\/b><\/span><\/b><\/span><\/b><\/span><\/b><\/span><\/p>\n\n\n\n<p><\/p><p><span data-contrast=\"auto\"><p><span data-contrast=\"auto\"><p><span data-contrast=\"auto\"><p><span data-contrast=\"auto\"><p><span data-contrast=\"auto\"><p><span data-contrast=\"auto\">Discover how to perform differentially private deep learning by analyzing medical images.&nbsp;<\/span><\/p><\/span><\/p><\/span><\/p><\/span><\/p><\/span><\/p><\/span><\/p>\n<\/div><\/div>\n\n\n\n<p><\/p>\n\n\n\n<p>To make the differential privacy concept generally understandable, we refrain from discussing the underlying mathematical concepts. Rather, we seek to keep the technical descriptions at a high level. 
Nonetheless, we recommend that readers have a basic understanding of machine learning concepts.<\/p>\n\n\n\n<h2 class=\"wp-block-heading\" id=\"join-the-smartnoise-early-adopter-acceleration-program\">Join the SmartNoise Early Adopter Acceleration&nbsp;Program<\/h2>\n\n\n\n<p>We have introduced the&nbsp;SmartNoise Early Adopter Acceleration Program to support the usage and adoption of SmartNoise and OpenDP. This collaborative program with the SmartNoise team aims to accelerate the adoption of differential privacy in solutions today that will open data and offer insights to benefit society.<\/p>\n\n\n\n<p>If you have a project that would benefit from using differential privacy, we invite you to&nbsp;apply.<\/p>\n","protected":false},"excerpt":{"rendered":"<p>Watch&nbsp;our webinar&nbsp;on&nbsp;Open Data Science Conference&nbsp; Read&nbsp;the white paper on SmartNoise Differential Privacy&nbsp;machine learning case studies The COVID-19 pandemic demonstrates the tremendous importance of sufficient and relevant data for research, causal analysis, government action, and medical progress. 
However, for understandable data protection considerations, individuals and decision-makers are often very reluctant to share personal or sensitive data.<\/p>\n","protected":false},"author":5562,"featured_media":95481,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"msxcm_post_with_no_image":false,"ep_exclude_from_search":false,"_classifai_error":"","_classifai_text_to_speech_error":"","footnotes":""},"post_tag":[],"content-type":[],"topic":[2238],"programming-languages":[],"coauthors":[1760],"class_list":["post-84863","post","type-post","status-publish","format-standard","has-post-thumbnail","hentry","topic-ai-machine-learning","review-flag-alway-1593580310-39","review-flag-machi-1680214156-53","review-flag-new-1593580248-669"],"yoast_head":"<!-- This site is optimized with the Yoast SEO plugin v27.2 - https:\/\/yoast.com\/product\/yoast-seo-wordpress\/ -->\n<title>Create privacy-preserving synthetic data for machine learning with SmartNoise | Microsoft Open Source Blog<\/title>\n<meta name=\"robots\" content=\"index, follow, max-snippet:-1, max-image-preview:large, max-video-preview:-1\" \/>\n<link rel=\"canonical\" href=\"https:\/\/opensource.microsoft.com\/blog\/2021\/02\/18\/create-privacy-preserving-synthetic-data-for-machine-learning-with-smartnoise\/\" \/>\n<meta property=\"og:locale\" content=\"en_US\" \/>\n<meta property=\"og:type\" content=\"article\" \/>\n<meta property=\"og:title\" content=\"Create privacy-preserving synthetic data for machine learning with SmartNoise | Microsoft Open Source Blog\" \/>\n<meta property=\"og:description\" content=\"Watch&nbsp;our webinar&nbsp;on&nbsp;Open Data Science Conference&nbsp; Read&nbsp;the white paper on SmartNoise Differential Privacy&nbsp;machine learning case studies The COVID-19 pandemic demonstrates the tremendous importance of sufficient and relevant data for research, causal analysis, government action, and medical progress. 
However, for understandable data protection considerations, individuals and decision-makers are often very reluctant to share personal or sensitive data.\" \/>\n<meta property=\"og:url\" content=\"https:\/\/opensource.microsoft.com\/blog\/2021\/02\/18\/create-privacy-preserving-synthetic-data-for-machine-learning-with-smartnoise\/\" \/>\n<meta property=\"og:site_name\" content=\"Microsoft Open Source Blog\" \/>\n<meta property=\"article:published_time\" content=\"2021-02-18T17:00:03+00:00\" \/>\n<meta property=\"article:modified_time\" content=\"2025-06-23T17:51:30+00:00\" \/>\n<meta property=\"og:image\" content=\"https:\/\/opensource.microsoft.com\/blog\/wp-content\/uploads\/2024\/06\/MSC24-ASEAN-developer-Getty-1451309464-rgb.png\" \/>\n\t<meta property=\"og:image:width\" content=\"1170\" \/>\n\t<meta property=\"og:image:height\" content=\"640\" \/>\n\t<meta property=\"og:image:type\" content=\"image\/png\" \/>\n<meta name=\"author\" content=\"Andreas Kopp\" \/>\n<meta name=\"twitter:card\" content=\"summary_large_image\" \/>\n<meta name=\"twitter:creator\" content=\"@OpenAtMicrosoft\" \/>\n<meta name=\"twitter:site\" content=\"@OpenAtMicrosoft\" \/>\n<meta name=\"twitter:label1\" content=\"Written by\" \/>\n\t<meta name=\"twitter:data1\" content=\"Andreas Kopp\" \/>\n\t<meta name=\"twitter:label2\" content=\"Est. 
reading time\" \/>\n\t<meta name=\"twitter:data2\" content=\"5 min read\" \/>\n<script type=\"application\/ld+json\" class=\"yoast-schema-graph\">{\"@context\":\"https:\/\/schema.org\",\"@graph\":[{\"@type\":\"Article\",\"@id\":\"https:\/\/opensource.microsoft.com\/blog\/2021\/02\/18\/create-privacy-preserving-synthetic-data-for-machine-learning-with-smartnoise\/#article\",\"isPartOf\":{\"@id\":\"https:\/\/opensource.microsoft.com\/blog\/2021\/02\/18\/create-privacy-preserving-synthetic-data-for-machine-learning-with-smartnoise\/\"},\"author\":[{\"@id\":\"https:\/\/opensource.microsoft.com\/blog\/author\/andreas-kopp\/\",\"@type\":\"Person\",\"@name\":\"Andreas Kopp\"}],\"headline\":\"Create privacy-preserving synthetic data for machine learning with SmartNoise\",\"datePublished\":\"2021-02-18T17:00:03+00:00\",\"dateModified\":\"2025-06-23T17:51:30+00:00\",\"mainEntityOfPage\":{\"@id\":\"https:\/\/opensource.microsoft.com\/blog\/2021\/02\/18\/create-privacy-preserving-synthetic-data-for-machine-learning-with-smartnoise\/\"},\"wordCount\":1287,\"commentCount\":0,\"publisher\":{\"@id\":\"https:\/\/opensource.microsoft.com\/blog\/#organization\"},\"image\":{\"@id\":\"https:\/\/opensource.microsoft.com\/blog\/2021\/02\/18\/create-privacy-preserving-synthetic-data-for-machine-learning-with-smartnoise\/#primaryimage\"},\"thumbnailUrl\":\"https:\/\/opensource.microsoft.com\/blog\/wp-content\/uploads\/2024\/06\/MSC24-ASEAN-developer-Getty-1451309464-rgb.webp\",\"inLanguage\":\"en-US\",\"potentialAction\":[{\"@type\":\"CommentAction\",\"name\":\"Comment\",\"target\":[\"https:\/\/opensource.microsoft.com\/blog\/2021\/02\/18\/create-privacy-preserving-synthetic-data-for-machine-learning-with-smartnoise\/#respond\"]}]},{\"@type\":\"WebPage\",\"@id\":\"https:\/\/opensource.microsoft.com\/blog\/2021\/02\/18\/create-privacy-preserving-synthetic-data-for-machine-learning-with-smartnoise\/\",\"url\":\"https:\/\/opensource.microsoft.com\/blog\/2021\/02\/18\/create-privacy-preserving-