
          Beware of health-tech firms' snake oil

          By Leeza Osipenko | China Daily | Updated: 2020-08-08 08:34

          In an interview with The Wall Street Journal earlier this year, David Feinberg, the head of Google Health and a self-professed astrology buff, said: "If you believe me that all we are doing is organizing information to make it easier for your doctor, I'm going to get a little paternalistic here: I'm never going to let that get opted out." In other words, patients will soon have no choice but to receive personalized clinical horoscopes based on their own medical histories and inferences drawn from a growing pool of patient records.

          But even if we want such a world, we should take a hard look at what today's health-tech proponents are really selling.

How true is the promise to cut medical costs?

In recent years, most of the US Big Tech companies, along with many start-ups, the big pharmaceutical companies and others, have entered the health-tech sector. With big data analytics, artificial intelligence (AI) and other novel methods, they promise to cut costs for struggling healthcare systems, revolutionize how doctors make medical decisions, and save us from ourselves.

What could possibly go wrong? Quite a lot, it turns out. In Weapons of Math Destruction, data scientist Cathy O'Neil lists many examples of how algorithms and data can fail us in unexpected ways. When transparent data-feedback algorithms were applied to baseball, they worked better than expected; but when similar models are used in finance, insurance, law enforcement and education, they can be highly discriminatory and destructive.

          Healthcare is no exception. Individuals' medical data are susceptible to subjective clinical decision-making, medical errors and evolving practices, and the quality of larger data sets is often diminished by missing records, measurement errors, and a lack of structure and standardization.

          Nonetheless, the big data revolution in healthcare is being sold as if these troubling limitations did not exist. Worse, many medical decision-makers are falling for the hype.

          No infrastructure to gather evidence

          One could argue that as long as new solutions offer some benefits, they are worth it. But we cannot really know whether data analytics and AI actually do improve on the status quo without large, well-designed empirical studies.

          Not only is such evidence lacking; there is no infrastructure or regulatory framework in place to generate it. Big-data applications are simply being introduced into healthcare settings as if they were harmless or unquestionably beneficial.

          Consider Project Nightingale, a private data-sharing arrangement between Google Health and Ascension, a massive nonprofit health system in the United States. When The Wall Street Journal first reported on this secret relationship last November, it triggered a scandal over concerns about patient data and privacy. Worse, as Feinberg openly admitted to the same newspaper just two months later, "We didn't know what we were doing."

          Given that the Big Tech companies have no experience in healthcare, such admissions should come as no surprise, despite the attempts to reassure us otherwise. Worse, at a time when individual privacy is becoming more of a luxury than a right, the algorithms that are increasingly ruling our lives are becoming inaccessible black boxes, shielded from public or regulatory scrutiny to protect corporate interests. And in the case of healthcare, algorithmic diagnostic and decision models sometimes return results that doctors themselves do not understand.

          Unethical and poorly informed

Although many of those pouring into the health-tech arena are well intentioned, the industry's current approach is fundamentally unethical and poorly informed. No one objects to improving healthcare with technology. But before rushing into partnerships with tech companies, healthcare executives and providers need to improve their understanding of the health-tech field.

          For starters, it is critical to remember that big data inferences are gleaned through statistics and mathematics, which demand their own form of literacy. When an algorithm detects "causality" or some other association signal, that information can be valuable for conducting further hypothesis-driven investigations. But when it comes to actual decision-making, mathematically driven predictive models are only as reliable as the data being fed into them. And since their fundamental assumptions are based on what is already known, they offer a view of the past and the present, not the future. Such applications have far-reaching potential to improve healthcare and cut costs; but those gains are not guaranteed.
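To make the data-quality point concrete, here is a minimal sketch in Python. The data are simulated and entirely hypothetical, and the use of scikit-learn's logistic regression is an assumption made only for illustration, not a description of any real health-tech product: the same algorithm, trained once on clean records and once on noisy, partly missing ones, delivers very different levels of reliability.

```python
# A minimal, hypothetical sketch: simulated patient records only.
# The same model, fed clean versus noisy and incomplete data, is only
# as reliable as what it was given.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(42)
n = 5000

# One informative clinical measurement driving a binary outcome.
measurement = rng.normal(size=(n, 1))
outcome = rng.random(n) < 1 / (1 + np.exp(-2 * measurement[:, 0]))

# The same records with heavy measurement error and a block of
# missing values crudely imputed with the mean.
noisy = measurement + rng.normal(scale=2.0, size=(n, 1))
missing = rng.random(n) < 0.3
noisy[missing] = noisy[~missing].mean()

half = n // 2
for label, x in [("clean", measurement), ("noisy", noisy)]:
    model = LogisticRegression().fit(x[:half], outcome[:half])
    auc = roc_auc_score(outcome[half:], model.predict_proba(x[half:])[:, 1])
    print(f"{label} records -> AUC {auc:.2f}")
# The clean-data model scores markedly higher; nothing about the
# algorithm changed, only the quality of the records it was fed.
```

The model also never sees anything that is not already in those records, which is why its output describes the past and the present rather than the future.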

          Another critical area is AI, which requires both its own architecture-that is, the rules and basic logic that determine how the system operates-and access to massive amounts of potentially sensitive data. The goal is to position the system so that it can "teach" itself how to deliver optimal solutions to stated problems. But, here, one must remember that the creators of the architecture-the people writing the rules and articulating the problems-are as biased as anyone else, whether they mean to be or not.

          Moreover, as with data analytics, AI systems are guided by data from the current healthcare system, making them prone to replicating their own failures and successes.
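Both points can be illustrated with another short, hypothetical Python sketch (simulated data again; scikit-learn is assumed purely for illustration). Here the architecture decision that matters is the choice of training target: the system is asked to predict who was referred for extra care in the past, and because one patient group was historically referred less often at the same level of clinical need, the trained model learns to repeat exactly that pattern.

```python
# A minimal, hypothetical sketch: the label is "what the system did",
# so the model reproduces the system's historical behaviour, including
# its blind spot for one patient group.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(7)
n = 20000

group = rng.integers(0, 2, n)      # two patient groups, 0 and 1
need = rng.normal(size=n)          # underlying clinical need

# Historical practice: at the same need, group 1 was referred far less often.
referred = rng.random(n) < 1 / (1 + np.exp(-(need - 1.5 * group)))

# Features: a noisy measurement of need, plus group membership.
features = np.column_stack([need + rng.normal(scale=0.5, size=n), group])
model = LogisticRegression().fit(features, referred)

# At an identical measured level of need, the trained model recommends
# group 1 for extra care far less often than group 0.
for g in (0, 1):
    prob = model.predict_proba([[1.0, g]])[0, 1]
    print(f"group {g}: predicted referral probability {prob:.2f}")
```

Nothing in the code is malicious; the disparity enters through the framing of the problem and the history already encoded in the records.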

          Algorithms should be shared with regulators

At the end of the day, improving healthcare through big data and AI will likely take much more trial and error than techno-optimists realize. If conducted transparently and publicly, big-data projects can teach us how to create high-quality data sets prospectively, thereby increasing algorithmic solutions' chances of success. By the same token, the algorithms themselves should be made available at least to regulators and the organizations subscribing to the service, if not to the public.

          Above all, healthcare providers and governments should remove their rose-tinted glasses and think critically about the implications of largely untested new applications in healthcare. Rather than simply giving away patient records and other data, hospitals and regulators should shadow the tech sector developers who are designing the architecture and deploying experimental new systems. More people need to be offering feedback and questioning the assumptions underlying initial prototypes, and this must be followed by controlled experiments to assess these technologies' real-world performance.

          Having been massively over-hyped, big data healthcare solutions are being pushed into the market without meaningful regulation, transparency, standardization, accountability or robust validation practices.

          Patients deserve health systems and providers that will protect them, rather than using them as mere sources of data for profit-driven experiments.

          The author is a senior lecturer in practice in the Department of Health Policy at the London School of Economics and Political Science. The views don't necessarily represent those of China Daily.
          Project Syndicate
