{"id":144034,"date":"2024-12-07T15:13:44","date_gmt":"2024-12-07T13:13:44","guid":{"rendered":"https:\/\/thecuriousbrain.com\/?p=144034"},"modified":"2024-12-07T15:19:39","modified_gmt":"2024-12-07T13:19:39","slug":"when-machines-discriminate-the-rise-of-ai-bias-in-modern-life","status":"publish","type":"post","link":"https:\/\/thecuriousbrain.com\/?p=144034","title":{"rendered":"When Machines Discriminate: The Rise of AI Bias in Modern Life"},"content":{"rendered":"\n<figure class=\"wp-block-image size-large\"><img data-recalc-dims=\"1\" loading=\"lazy\" decoding=\"async\" width=\"922\" height=\"484\" src=\"https:\/\/i0.wp.com\/thecuriousbrain.com\/wp-content\/uploads\/Screenshot-2024-12-06-at-13.22.25.png?resize=922%2C484&#038;ssl=1\" alt=\"\" class=\"wp-image-144035\" srcset=\"https:\/\/i0.wp.com\/thecuriousbrain.com\/wp-content\/uploads\/Screenshot-2024-12-06-at-13.22.25.png?resize=922%2C484&amp;ssl=1 922w, https:\/\/i0.wp.com\/thecuriousbrain.com\/wp-content\/uploads\/Screenshot-2024-12-06-at-13.22.25.png?resize=768%2C404&amp;ssl=1 768w, https:\/\/i0.wp.com\/thecuriousbrain.com\/wp-content\/uploads\/Screenshot-2024-12-06-at-13.22.25.png?resize=920%2C483&amp;ssl=1 920w, https:\/\/i0.wp.com\/thecuriousbrain.com\/wp-content\/uploads\/Screenshot-2024-12-06-at-13.22.25.png?w=1024&amp;ssl=1 1024w\" sizes=\"auto, (max-width: 922px) 100vw, 922px\" \/><\/figure>\n\n\n\n<p>Imagine applying for a job and receiving <a href=\"https:\/\/www.forbes.com\/sites\/rachelwells\/2024\/10\/27\/65-of-employers-to-use-ai-to-reject-candidates-in-2025\/#:~:text=Just%20when%20you%20thought%20the,recent%20Resume%20Builder%20survey%20of\">a rejection letter\u2014not from a person, but from an algorithm<\/a>. 
It doesn\u2019t explain why, but behind the scenes, the system decided your resume didn\u2019t \u201cfit.\u201d Perhaps you attended an all-women\u2019s college or used a word like \u201ccollaborative\u201d that it flagged as \u201cunqualified.\u201d<\/p>\n\n\n\n<figure class=\"wp-block-embed is-type-video is-provider-youtube wp-block-embed-youtube wp-embed-aspect-16-9 wp-has-aspect-ratio\"><div class=\"wp-block-embed__wrapper\">\n<span class=\"embed-youtube\" style=\"text-align:center; display: block;\"><iframe loading=\"lazy\" class=\"youtube-player\" width=\"922\" height=\"519\" src=\"https:\/\/www.youtube.com\/embed\/QvRZuHQBTps?version=3&#038;rel=1&#038;showsearch=0&#038;showinfo=1&#038;iv_load_policy=1&#038;fs=1&#038;hl=en-US&#038;autohide=2&#038;wmode=transparent\" allowfullscreen=\"true\" style=\"border:0;\" sandbox=\"allow-scripts allow-same-origin allow-popups allow-presentation allow-popups-to-escape-sandbox\"><\/iframe><\/span>\n<\/div><\/figure>\n\n\n\n<p>This isn\u2019t a dystopian nightmare\u2014it\u2019s a reality that unfolded at Amazon, where an AI-powered recruiting tool systematically discriminated against female applicants. The system, trained on historical data dominated by male hires, penalized words and phrases commonly associated with women, forcing the company to scrap it entirely.<\/p>\n\n\n\n<p>But the tool\u2019s failure wasn\u2019t a one-off glitch. It\u2019s a stark example of a growing problem: artificial intelligence isn\u2019t neutral. And as it becomes more embedded in everyday life, its biases are shaping decisions that affect millions.<\/p>\n\n\n\n<hr class=\"wp-block-separator has-alpha-channel-opacity\"\/>\n\n\n\n<h3 class=\"wp-block-heading\"><strong>Bias at Scale: How AI Replicates Our Flaws<\/strong><\/h3>\n\n\n\n<p>AI systems learn from the data they\u2019re given. 
And when that data reflects existing inequalities\u2014whether in hiring, healthcare, or policing\u2014the algorithms amplify them.<\/p>\n\n\n\n<ul class=\"wp-block-list\">\n<li><strong>Hiring Discrimination<\/strong>: Amazon\u2019s AI recruitment tool penalized resumes with words like \u201cwomen\u2019s\u201d or references to all-female institutions, mirroring biases in its training data. While Amazon pulled the plug on the tool, its case became a cautionary tale of how unchecked AI can institutionalize discrimination.<\/li>\n<\/ul>\n\n\n\n<figure class=\"wp-block-embed is-type-video is-provider-youtube wp-block-embed-youtube wp-embed-aspect-16-9 wp-has-aspect-ratio\"><div class=\"wp-block-embed__wrapper\">\n<span class=\"embed-youtube\" style=\"text-align:center; display: block;\"><iframe loading=\"lazy\" class=\"youtube-player\" width=\"922\" height=\"519\" src=\"https:\/\/www.youtube.com\/embed\/Bxpx8izG5nA?version=3&#038;rel=1&#038;showsearch=0&#038;showinfo=1&#038;iv_load_policy=1&#038;fs=1&#038;hl=en-US&#038;autohide=2&#038;wmode=transparent\" allowfullscreen=\"true\" style=\"border:0;\" sandbox=\"allow-scripts allow-same-origin allow-popups allow-presentation allow-popups-to-escape-sandbox\"><\/iframe><\/span>\n<\/div><\/figure>\n\n\n\n<ul class=\"wp-block-list\">\n<li><strong>Facial Recognition Failures<\/strong>: In Michigan, Robert Julian-Borchak Williams was wrongfully arrested after a police facial recognition system falsely identified him as a suspect. 
Studies have repeatedly shown that facial recognition tools are less accurate for people of color, leading to disproportionate harm.<\/li>\n<\/ul>\n\n\n\n<figure class=\"wp-block-embed is-type-video is-provider-youtube wp-block-embed-youtube wp-embed-aspect-16-9 wp-has-aspect-ratio\"><div class=\"wp-block-embed__wrapper\">\n<span class=\"embed-youtube\" style=\"text-align:center; display: block;\"><iframe loading=\"lazy\" class=\"youtube-player\" width=\"922\" height=\"519\" src=\"https:\/\/www.youtube.com\/embed\/FvGkjWnYKYs?version=3&#038;rel=1&#038;showsearch=0&#038;showinfo=1&#038;iv_load_policy=1&#038;fs=1&#038;hl=en-US&#038;autohide=2&#038;wmode=transparent\" allowfullscreen=\"true\" style=\"border:0;\" sandbox=\"allow-scripts allow-same-origin allow-popups allow-presentation allow-popups-to-escape-sandbox\"><\/iframe><\/span>\n<\/div><\/figure>\n\n\n\n<ul class=\"wp-block-list\">\n<li><strong>Healthcare Inequality<\/strong>: An algorithm used in U.S. hospitals deprioritized Black patients for critical care, underestimating their medical needs because it relied on cost-based metrics. The result? Disparities in access to potentially life-saving treatment.<\/li>\n<\/ul>\n\n\n\n<p>These systems don\u2019t operate in isolation. They scale human bias, codify it, and make it harder to detect and challenge.<\/p>\n\n\n\n<hr class=\"wp-block-separator has-alpha-channel-opacity\"\/>\n\n\n\n<h3 class=\"wp-block-heading\"><strong>The Perils of Automated Decision-Making<\/strong><\/h3>\n\n\n\n<p>Unlike human errors, algorithmic mistakes carry an air of authority. Decisions made by AI often feel final and unassailable, even when they\u2019re deeply flawed.<\/p>\n\n\n\n<ul class=\"wp-block-list\">\n<li><strong>Scale<\/strong>: A biased human decision affects one person. 
A biased algorithm impacts millions.<\/li>\n\n\n\n<li><strong>Opacity<\/strong>: Many algorithms operate as \u201cblack boxes,\u201d their inner workings hidden even from their creators.<\/li>\n\n\n\n<li><strong>Trust<\/strong>: People often assume machines are objective, but AI is only as unbiased as the data it\u2019s trained on\u2014and the priorities of its developers.<\/li>\n<\/ul>\n\n\n\n<p>This makes machine bias uniquely dangerous. When an algorithm decides who gets hired, who gets a loan, or who gets arrested, the stakes are high\u2014and the consequences are often invisible until it\u2019s too late.<\/p>\n\n\n\n<hr class=\"wp-block-separator has-alpha-channel-opacity\"\/>\n\n\n\n<h3 class=\"wp-block-heading\"><strong>Who\u2019s to Blame?<\/strong><\/h3>\n\n\n\n<p><strong>AI doesn\u2019t create bias\u2014it reflects it. But the blame doesn\u2019t lie solely with the machines. It lies with the people and systems that build, deploy, and regulate them.<\/strong><\/p>\n\n\n\n<blockquote class=\"wp-block-quote is-layout-flow wp-block-quote-is-layout-flow\">\n<p>Technology doesn\u2019t just reflect the world we\u2019ve built\u2014it shows us what needs fixing. AI is powerful, but its value lies in how we use it\u2014and who we use it for.<\/p>\n<\/blockquote>\n\n\n\n<hr class=\"wp-block-separator has-alpha-channel-opacity\"\/>\n\n\n\n<h3 class=\"wp-block-heading\"><strong>Can AI Be Fair?<\/strong><\/h3>\n\n\n\n<p>The rise of AI bias isn\u2019t inevitable. 
With intentional action, we can create systems that reduce inequality instead of amplifying it.<\/p>\n\n\n\n<ol class=\"wp-block-list\">\n<li><strong>Diverse Data<\/strong>: Train algorithms on datasets that reflect the full spectrum of humanity.<\/li>\n\n\n\n<li><strong>Inclusive Design<\/strong>: Build diverse development teams to catch blind spots and design for fairness.<\/li>\n\n\n\n<li><strong>Transparency<\/strong>: Require companies and governments to open their algorithms to audits and explain their decision-making processes.<\/li>\n\n\n\n<li><strong>Regulation<\/strong>: Establish global standards for ethical AI development, holding organizations accountable for harm.<\/li>\n<\/ol>\n\n\n\n<p><strong>But these solutions require collective will. Without public pressure, the systems shaping our lives will continue to reflect the inequities of the past.<\/strong><\/p>\n\n\n\n<hr class=\"wp-block-separator has-alpha-channel-opacity\"\/>\n\n\n\n<h3 class=\"wp-block-heading\"><strong>The rise of machine bias is a reminder that AI, for all its promise, is a mirror.<\/strong><\/h3>\n\n\n\n<p>It reflects the values, priorities, and blind spots of the society that creates it.<\/p>\n\n\n\n<p>The question isn\u2019t whether AI will shape the future\u2014it\u2019s whose future it will shape. Will it serve the privileged few, or will it work to dismantle the inequalities it so often reinforces?<\/p>\n\n\n\n<p>The answer lies not in the machines but in us.<\/p>\n\n\n\n<blockquote class=\"wp-block-quote is-layout-flow wp-block-quote-is-layout-flow\">\n<p><strong>NEVER FORGET! AI is a tool. Its power isn\u2019t in what it can do\u2014it\u2019s in what we demand of it. If we want a future that\u2019s fair and just, we have to fight for it, all of us!<\/strong><\/p>\n<\/blockquote>\n","protected":false},"excerpt":{"rendered":"<p>Imagine applying for a job and receiving a rejection letter\u2014not from a person, but from an algorithm. 
It doesn\u2019t explain why, but behind the scenes, the system decided your resume didn\u2019t \u201cfit.\u201d Perhaps you attended an all-women\u2019s college or used a word like \u201ccollaborative\u201d that it flagged as \u201cunqualified.\u201d This isn\u2019t a dystopian nightmare\u2014it\u2019s a [&hellip;]<\/p>\n","protected":false},"author":1,"featured_media":144039,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"jetpack_post_was_ever_published":false,"_jetpack_newsletter_access":"","_jetpack_dont_email_post_to_subs":false,"_jetpack_newsletter_tier_id":0,"_jetpack_memberships_contains_paywalled_content":false,"_jetpack_memberships_contains_paid_content":false,"footnotes":"","jetpack_publicize_message":"","jetpack_publicize_feature_enabled":true,"jetpack_social_post_already_shared":true,"jetpack_social_options":{"image_generator_settings":{"template":"highway","default_image_id":0,"font":"","enabled":false},"version":2}},"categories":[1],"tags":[3544,3934,3935,3933,3517],"class_list":["post-144034","post","type-post","status-publish","format-standard","has-post-thumbnail","hentry","category-all-other-stuff","tag-ai","tag-bias","tag-dicrimination","tag-hr","tag-racism"],"jetpack_publicize_connections":[],"jetpack_featured_media_url":"https:\/\/i0.wp.com\/thecuriousbrain.com\/wp-content\/uploads\/when-machines-discriminate-the-r.jpg?fit=480%2C360&ssl=1","jetpack-related-posts":[{"id":137327,"url":"https:\/\/thecuriousbrain.com\/?p=137327","url_meta":{"origin":144034,"position":0},"title":"Should you stream it?","author":"thebrainbehind","date":"08\/04\/2021","format":false,"excerpt":"https:\/\/youtu.be\/jZl55PsfZJQ What does it mean when the technology that surrounds our lives is built on systemic racial and gender-based prejudices? This is the truth about the invisible forces that decide everyday human potential. 
CODED BIAS follows MIT Media Lab researcher Joy Buolamwini\u2019s startling discovery that many facial recognition technologies fail\u2026","rel":"","context":"In &quot;all other stuff&quot;","block_context":{"text":"all other stuff","link":"https:\/\/thecuriousbrain.com\/?cat=1"},"img":{"alt_text":"","src":"https:\/\/i0.wp.com\/thecuriousbrain.com\/wp-content\/uploads\/should-you-stream-it-21.jpg?fit=1200%2C675&ssl=1&resize=350%2C200","width":350,"height":200,"srcset":"https:\/\/i0.wp.com\/thecuriousbrain.com\/wp-content\/uploads\/should-you-stream-it-21.jpg?fit=1200%2C675&ssl=1&resize=350%2C200 1x, https:\/\/i0.wp.com\/thecuriousbrain.com\/wp-content\/uploads\/should-you-stream-it-21.jpg?fit=1200%2C675&ssl=1&resize=525%2C300 1.5x, https:\/\/i0.wp.com\/thecuriousbrain.com\/wp-content\/uploads\/should-you-stream-it-21.jpg?fit=1200%2C675&ssl=1&resize=700%2C400 2x, https:\/\/i0.wp.com\/thecuriousbrain.com\/wp-content\/uploads\/should-you-stream-it-21.jpg?fit=1200%2C675&ssl=1&resize=1050%2C600 3x"},"classes":[]},{"id":149959,"url":"https:\/\/thecuriousbrain.com\/?p=149959","url_meta":{"origin":144034,"position":1},"title":"OpenAI\u2019s new shopping research feature quietly rewrites the buying journey.","author":"thebrainbehind","date":"26\/11\/2025","format":false,"excerpt":"For consumersDecision fatigue disappears. You describe your needs and get a personalised buyer\u2019s guide that filters, compares, and questions on your behalf. Shopping becomes clarity, not chaos.For marketersThe era of \u201cmore content\u201d is over. 
If an agent interprets your brand, only the clearest value, strongest proof, and simplest differentiation survive.\u2026","rel":"","context":"In &quot;trends&quot;","block_context":{"text":"trends","link":"https:\/\/thecuriousbrain.com\/?cat=162"},"img":{"alt_text":"","src":"https:\/\/i0.wp.com\/thecuriousbrain.com\/wp-content\/uploads\/Screenshot-2025-11-26-at-22.19.16.png?resize=350%2C200&ssl=1","width":350,"height":200,"srcset":"https:\/\/i0.wp.com\/thecuriousbrain.com\/wp-content\/uploads\/Screenshot-2025-11-26-at-22.19.16.png?resize=350%2C200&ssl=1 1x, https:\/\/i0.wp.com\/thecuriousbrain.com\/wp-content\/uploads\/Screenshot-2025-11-26-at-22.19.16.png?resize=525%2C300&ssl=1 1.5x, https:\/\/i0.wp.com\/thecuriousbrain.com\/wp-content\/uploads\/Screenshot-2025-11-26-at-22.19.16.png?resize=700%2C400&ssl=1 2x"},"classes":[]},{"id":108959,"url":"https:\/\/thecuriousbrain.com\/?p=108959","url_meta":{"origin":144034,"position":2},"title":"The moral bias behind your search results","author":"thebrainbehind","date":"11\/11\/2015","format":false,"excerpt":"The moral bias behind your search\u00a0resultshttps:\/\/embed-ssl.ted.com\/talks\/andreas_ekstrom_the_moral_bias_behind_your_search_results.html Search engines have become our most trusted sources of information and arbiters of truth. But can we ever get an unbiased search result? Swedish author and journalist Andreas Ekstr\u00f6m argues that such a thing is a philosophical impossibility. 
In this thoughtful talk, he calls\u2026","rel":"","context":"In &quot;all other stuff&quot;","block_context":{"text":"all other stuff","link":"https:\/\/thecuriousbrain.com\/?cat=1"},"img":{"alt_text":"","src":"https:\/\/i0.wp.com\/thecuriousbrain.com\/wp-content\/uploads\/the-moral-bias-behind-your-searc.jpg?fit=560%2C315&ssl=1&resize=350%2C200","width":350,"height":200,"srcset":"https:\/\/i0.wp.com\/thecuriousbrain.com\/wp-content\/uploads\/the-moral-bias-behind-your-searc.jpg?fit=560%2C315&ssl=1&resize=350%2C200 1x, https:\/\/i0.wp.com\/thecuriousbrain.com\/wp-content\/uploads\/the-moral-bias-behind-your-searc.jpg?fit=560%2C315&ssl=1&resize=525%2C300 1.5x"},"classes":[]},{"id":146930,"url":"https:\/\/thecuriousbrain.com\/?p=146930","url_meta":{"origin":144034,"position":3},"title":"I asked AI what are the biggest lies humans believe!","author":"thebrainbehind","date":"23\/05\/2025","format":false,"excerpt":"Here is a system of illusions\u2014engraved into culture, commerce, and consciousness\u2014that keeps humanity asleep at the wheel: https:\/\/youtu.be\/uCGD9dT12C0?si=8WADUOx5FXxImV9u I. Personal Myths (Lies of the Self) \"I am what I own.\"Identity is mistaken for inventory. Consumerism replaces soul-searching with shopping. \"I have time.\"The great procrastination spell. 
Mortality is outsourced to the\u2026","rel":"","context":"In &quot;all other stuff&quot;","block_context":{"text":"all other stuff","link":"https:\/\/thecuriousbrain.com\/?cat=1"},"img":{"alt_text":"","src":"https:\/\/i0.wp.com\/thecuriousbrain.com\/wp-content\/uploads\/i-asked-ai-what-are-the-biggest.jpg?fit=1200%2C675&ssl=1&resize=350%2C200","width":350,"height":200,"srcset":"https:\/\/i0.wp.com\/thecuriousbrain.com\/wp-content\/uploads\/i-asked-ai-what-are-the-biggest.jpg?fit=1200%2C675&ssl=1&resize=350%2C200 1x, https:\/\/i0.wp.com\/thecuriousbrain.com\/wp-content\/uploads\/i-asked-ai-what-are-the-biggest.jpg?fit=1200%2C675&ssl=1&resize=525%2C300 1.5x, https:\/\/i0.wp.com\/thecuriousbrain.com\/wp-content\/uploads\/i-asked-ai-what-are-the-biggest.jpg?fit=1200%2C675&ssl=1&resize=700%2C400 2x, https:\/\/i0.wp.com\/thecuriousbrain.com\/wp-content\/uploads\/i-asked-ai-what-are-the-biggest.jpg?fit=1200%2C675&ssl=1&resize=1050%2C600 3x"},"classes":[]},{"id":78198,"url":"https:\/\/thecuriousbrain.com\/?p=78198","url_meta":{"origin":144034,"position":4},"title":"Kimberly Bryant: Break Down Your Biases","author":"thebrainbehind","date":"07\/12\/2015","format":false,"excerpt":"https:\/\/vimeo.com\/130883094 We all have biases and blind spots, unconsciously affecting the way we collaborate with others. In this 99U talk, Black Girls Code founder Bryant shares how pervasive biases are in our society and how that affects our careers and our culture. 
\"We must take into account this disparity between\u2026","rel":"","context":"In &quot;PPT\/ cool decks&quot;","block_context":{"text":"PPT\/ cool decks","link":"https:\/\/thecuriousbrain.com\/?cat=221"},"img":{"alt_text":"","src":"","width":0,"height":0},"classes":[]},{"id":147580,"url":"https:\/\/thecuriousbrain.com\/?p=147580","url_meta":{"origin":144034,"position":5},"title":"The Propaganda Machine: How AI Models Became Strategic Instruments of Influence","author":"thebrainbehind","date":"10\/07\/2025","format":false,"excerpt":"For years, artificial intelligence was framed as a neutral tool\u2014an impartial processor of information. But neutrality was always a convenient myth. The recent Grok controversy shattered that illusion. After Elon Musk's chatbot was reprogrammed to reflect anti-woke ideology, it began producing outputs that were not only politically charged, but overtly\u2026","rel":"","context":"In &quot;all other stuff&quot;","block_context":{"text":"all other stuff","link":"https:\/\/thecuriousbrain.com\/?cat=1"},"img":{"alt_text":"","src":"https:\/\/i0.wp.com\/thecuriousbrain.com\/wp-content\/uploads\/npr.brightspotcdn.webp?resize=350%2C200&ssl=1","width":350,"height":200,"srcset":"https:\/\/i0.wp.com\/thecuriousbrain.com\/wp-content\/uploads\/npr.brightspotcdn.webp?resize=350%2C200&ssl=1 1x, https:\/\/i0.wp.com\/thecuriousbrain.com\/wp-content\/uploads\/npr.brightspotcdn.webp?resize=525%2C300&ssl=1 1.5x, https:\/\/i0.wp.com\/thecuriousbrain.com\/wp-content\/uploads\/npr.brightspotcdn.webp?resize=700%2C400&ssl=1 2x, https:\/\/i0.wp.com\/thecuriousbrain.com\/wp-content\/uploads\/npr.brightspotcdn.webp?resize=1050%2C600&ssl=1 3x, https:\/\/i0.wp.com\/thecuriousbrain.com\/wp-content\/uploads\/npr.brightspotcdn.webp?resize=1400%2C800&ssl=1 
4x"},"classes":[]}],"jetpack_sharing_enabled":true,"_links":{"self":[{"href":"https:\/\/thecuriousbrain.com\/index.php?rest_route=\/wp\/v2\/posts\/144034","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/thecuriousbrain.com\/index.php?rest_route=\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/thecuriousbrain.com\/index.php?rest_route=\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/thecuriousbrain.com\/index.php?rest_route=\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/thecuriousbrain.com\/index.php?rest_route=%2Fwp%2Fv2%2Fcomments&post=144034"}],"version-history":[{"count":3,"href":"https:\/\/thecuriousbrain.com\/index.php?rest_route=\/wp\/v2\/posts\/144034\/revisions"}],"predecessor-version":[{"id":144042,"href":"https:\/\/thecuriousbrain.com\/index.php?rest_route=\/wp\/v2\/posts\/144034\/revisions\/144042"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/thecuriousbrain.com\/index.php?rest_route=\/wp\/v2\/media\/144039"}],"wp:attachment":[{"href":"https:\/\/thecuriousbrain.com\/index.php?rest_route=%2Fwp%2Fv2%2Fmedia&parent=144034"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/thecuriousbrain.com\/index.php?rest_route=%2Fwp%2Fv2%2Fcategories&post=144034"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/thecuriousbrain.com\/index.php?rest_route=%2Fwp%2Fv2%2Ftags&post=144034"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}