Fixing the internet requires a culture change, says Fran Berman – Harvard Gazette

In this series, the Gazette asks Harvard experts for concrete solutions to complex problems. Francine Berman, Edward P. Hamilton Distinguished Professor of Computer Science at Rensselaer Polytechnic Institute, is an associate at the Berkman Klein Center for Internet & Society. Berman’s current work focuses on the social and environmental impacts of information technology, in particular the Internet of Things – a deeply interconnected ecosystem of billions of everyday objects linked through the web.

GAZETTE: Do you think the internet has been a force for good in the world?

BERMAN: Yes and no. What the Internet and information technology have given us is enormous power. Technology has become an essential infrastructure for modern life. It saved our lives during the pandemic, providing the only way for many to go to school, work, or see family and friends. It also allowed for the manipulation of elections, the rapid spread of disinformation and the growth of radicalism.

Are digital technologies good or bad? The same Internet supports both Pornhub and CDC.gov, Goodreads and Parler.com. The digital world we live in is a fusion of technological innovation and social controls. For cyberspace to be a force for good, it will take a societal shift in the way we develop, use, and oversee technology – a reprioritization of the public interest over private profit.

Fundamentally, it is the responsibility of the public sector to create the social controls that encourage the use of technology for good rather than for exploitation, manipulation, misinformation, and worse. This is extremely complex, and it requires a shift in the broader culture from technological opportunism to technology in the public interest.

GAZETTE: How do we change the culture of technological opportunism?

BERMAN: There are no quick fixes for creating this culture change – no single law, federal agency, institutional policy, or set of practices will do it, even though all of those are needed. It is a long and difficult task. Shifting from a culture of technological opportunism to a culture of technology in the public interest will require many sustained efforts on a number of fronts, just as we are experiencing now as we work hard to move from a culture of discrimination to a culture of inclusion.

That being said, we need to create the building blocks of culture change now – proactive short-term solutions, foundational long-term solutions, and serious efforts to strategize for challenges we don’t yet know how to tackle.

In the short term, the government must take the lead. There are plenty of horror stories – false arrests based on poor facial recognition, lists of rape victims traded by data brokers, intruders yelling at babies through connected baby monitors – but there is surprisingly little consensus on the digital protections – specific expectations for privacy, security, safety, and so on – that U.S. citizens should have.

We have to sort it out. The European General Data Protection Regulation (GDPR) is based on a well-articulated set of digital rights for citizens of the European Union. In the United States, we have some specific digital rights – the privacy of health and financial data, the privacy of children’s online data – but these rights are largely fragmentary. What are consumers’ digital privacy rights? What are the security and safety expectations for digital devices and systems used as critical infrastructure?

Specificity is important here because, to be effective, social protections must be integrated into technical architectures. If a federal law were passed tomorrow requiring consumers to opt in to the collection of their personal data by consumer digital services, Google and Netflix would have to change their systems (and business models) to give users that kind of discretion. There would be trade-offs for consumers who didn’t opt in: Google’s search results would be more generic, and Netflix’s recommendations wouldn’t be as well tuned to their interests. But there would also be advantages: Opt-in rules put consumers in the driver’s seat and give them greater control over the privacy of their information.

Once a basic set of digital rights for citizens is specified, a federal agency should be created with the regulatory and enforcement power to protect those rights. The FDA was created to promote the safety of our food and medicines. OSHA was created to promote the safety of our workplaces. Today the public pays more attention to the safety of the lettuce you buy at the grocery store than to the safety of the software you download from the internet. Bills in Congress that call for a data protection agency, similar to the data protection authorities required by the GDPR, could create much-needed oversight and enforcement of digital protections in cyberspace.

Additional legislation that penalizes businesses, rather than consumers, for failing to protect consumers’ digital rights could also do more to incentivize the private sector to promote the public interest. If your credit card is stolen, the business, not the cardholder, largely pays the price. Penalizing companies with hefty fines and holding company personnel legally accountable – especially those in the C-suite – would be a strong incentive for companies to step up consumer protection. Refocusing corporate priorities in this way would make a positive contribution to moving us from a culture of technological opportunism to a culture of technology in the public interest.

GAZETTE: Is specific legislation needed to address some of today’s toughest challenges – misinformation and fake news on social media, for example?

BERMAN: It’s hard to solve problems online that you haven’t solved in the real world. In addition, legislation is not helpful if the solution is not clear. At the root of our misinformation and fake news problems online lies the enormous challenge of automating trust, truth, and ethics.

Social media largely strips information of its context and, with it, many of the cues that allow us to evaluate what we hear. Online, we often don’t know who we’re talking to or where they got their information. There is a lot of piling on. In real life, we have ways to verify information, assess credentials and context, and use the dynamics of conversation to evaluate what we are hearing. Few of these cues are present on social media.

