Any fair-minded assessment of the danger


Question Any fair-minded assessment of the dangers of the deal between Britain's National Health Service (NHS) and DeepMind must start by acknowledging that both sides mean well. DeepMind is one of the leading artificial intelligence (AI) companies in the world. The potential of this work applied to healthcare is very great, but it could also lead to further concentration of power in the tech giants. It is against that background that the information commissioner, Elizabeth Denham, has issued her damning verdict against the Royal Free hospital trust under the NHS, which handed over to DeepMind the records of 1.6 million patients in 2015 on the basis of a vague agreement which took far too little account of the patients' rights and their expectations of privacy.

DeepMind has almost apologized. The NHS trust has mended its ways. Further arrangements, and there may be many, between the NHS and DeepMind will be carefully scrutinised to ensure that all necessary permissions have been asked of patients and all unnecessary data has been cleaned. There are lessons about informed patient consent to learn. But privacy is not the only angle in this case and not even the most important. Ms Denham chose to concentrate the blame on the NHS trust, since under existing law it "controlled" the data and DeepMind merely "processed" it. But this distinction misses the point that it is processing and aggregation, not the mere possession of bits, that gives the data value.

The great question is who should benefit from the analysis of all the data that our lives now generate. Privacy law builds on the concept of damage to an individual from identifiable knowledge about them. That misses the way the surveillance economy works. The data of an individual there gains its value only when it is compared with the data of countless millions more.

The use of privacy law to curb the tech giants in this instance feels slightly maladapted. This practice does not address the real worry. It is not enough to say that the algorithms DeepMind develops will benefit patients and save lives. What matters is that they will belong to a private monopoly which developed them using public resources. If software promises to save lives on the scale that drugs now can, big data may be expected to behave as big pharma has done. We are still at the beginning of this revolution and small choices now may turn out to have gigantic consequences later. A long struggle will be needed to avoid a future of digital feudalism. Ms Denham's report is a welcome start.

The author argues in Paragraph 2 that _____.

Options A. privacy protection must be secured at all costs.
B. leaking patients’ data is worse than selling it.
C. making profits from patients’ data is illegal.
D. the value of data comes from the processing of it.

Answer D

Explanation In the latter half of Paragraph 2, the author identifies where the value of data lies: it is the processing and aggregation of the data (the work handed to DeepMind), not the mere possession of it, that gives the data its value. Hence the correct answer is D.
