Are we using technology or are we being used? Q&A with UCLA’s Ramesh Srinivasan

Ramesh Srinivasan studies the relationship between technology, politics and societies. As a professor of information studies and design media arts at UCLA, Srinivasan is an expert on social media, including the more extreme and encrypted platforms like Parler and Telegram. He is also the founder of the University of California’s system-wide Digital Cultures Lab.

Srinivasan has worked with governments, businesses, activists and civil society organizations to advise on technology and the future. His research investigates technology’s relationship to democracy and politics, public health, policy, social change, economic development, distance learning, migration studies and cultural heritage. He has worked in more than 70 countries studying internet/social media technologies, artificial intelligence, big data’s impacts on political life, economic concerns, and cultural and global effects.

His most recent book, “Beyond the Valley” (MIT Press), is the publisher’s top-selling book and illustrates the potential for a digital world of the future that supports the interests of environmental sustainability, democracy, workers, cultural diversity, and businesses. The book was named a top 10 book in tech by Forbes.

Srinivasan’s answers have been edited for brevity.

Technology companies waited until the halls of government were breached, people were killed, and the integrity of our governmental process was interrupted before banning the president’s presence on social media. What should government do, or what body should regulate these companies in the future, so that extremist efforts cannot make it this far?

Technology companies have business models, but those cannot come at the cost of everything else — we need to create a balance between business interests and the public’s interest. We need third-party accountability that lies outside these companies’ internal employees. We’ve learned more and more how difficult it is for tech companies to discern in house what disinformation might be emerging. Whether it’s a question of capacity or expertise, what’s critical is that government steps in to establish a body of credible experts and journalists from across the political spectrum to help us vet and rate different sources. That’s critical because that rating system can influence the internal workings of the systems the platforms rely upon, powered by algorithms. As of this point, what the algorithms choose to make go viral is determined by maximum user engagement, rather than the credibility or rating of the source. Currently, algorithms will choose the more radicalized version of something to promote, rather than representing a diversity of viewpoints and perspectives that might provide balance.

We need to figure out ways for algorithmic systems to have more collaborative design, so they don’t come out racist, misogynistic and discriminatory. Is it the intention of a tech company to be racist? Of course not, but this is what happens when you think you can do everything in house, when executives and engineers do not represent the demographics of society.

Government paid for the development of the Internet, and now a small number of companies reap its profits. What does a “people’s internet” look like?

A people’s internet is responsive to the values of people, where we the users of technology, as workers, as citizens, as members of various communities, have significant voice in impacting the digital experiences we have. A people’s internet is where we are thoughtful and careful of the widening and massive economic gulfs we see all around us. A digital world moving forward can be transformed to:

  • bring value to all;
  • engage us in the image of a true grassroots public sphere, where we can have differences of opinion in an open way, avoiding the echo chambers and filter bubbles of social media that are shaped by confirmation bias and by sensationalist, attention-manipulating algorithms;
  • embody a democratic vision, where technologies are designed to secure people’s economic interests, not just those of the companies.

What does a technology company of the future — one that represents people — look like?

We need to think about economic proposals independent of technology that can ensure that the revolution we are witnessing can make everyone economically prosper... so that overall everyone’s standard of living increases. Right now, the younger generations are making less than their parents, and life expectancy is decreasing in this country. Both of these must inspire us to consider proposals like universal basic income, or ways the gig economy can be restructured to create benefits for workers.

Companies have relied on public funding, so perhaps the public should have a percentage of equity, of the profitability of these companies. For example, what would happen if Uber drivers had 10% of equity in that company? Or if the entire platform was owned by its workers, like a cooperative? These are all ideas to play with.

What is a “digital bill of rights”? Is it realistic that a Biden administration and a Congress controlled by the Democrats could pass such a bill?

The future of technology is a bipartisan concern. The current administration has an investigation under way with the Department of Justice and the Federal Trade Commission, and 48 out of 50 attorneys general are looking at antitrust — there is large bipartisan support. The philosophy of “move fast and break things” is a playful model in Silicon Valley that works in a constricted engineering mindset. But we are talking about the minds, lives, psychological experiences, political behaviors and economic opportunities of the entire world. If you engineer for society without understanding society, you end up practicing social engineering. And in that scenario — like during the pandemic, where we are more technologically dependent than ever before — the vulnerable and marginalized are likely to be the most harmed.

We can embark on this path by asking some simple but powerful questions, for example:

  • When we use technology, are we users, or are we being used?
  • Are we Googling, or are we being Googled?
  • Are we socializing, or are we being socialized?

We have an opportunity to take proactive steps to have inclusive benefits and to protect our rights as workers, citizens and people. I truly believe we are at an inflection point where we can take these steps.
