Thursday, January 31, 2019

Zucked -- Roger McNamee's Wake Up Call ...And Beyond

Zucked: Waking Up to the Facebook Catastrophe is an authoritative and frightening call to arms -- but I was disappointed that author Roger McNamee did not address some of the suggestions for remedies that I shared with him last June (posted as An Open Letter to Influencers Concerned About Facebook and Other Platforms).

Here are brief comments on this excellent book, and highlights of what I would add. Many recognize the problem with the advertising-based business model, but few seem to be serious about finding creative ways to solve it. It is not yet proven that my suggestions will work quite as I envision, but the deeper need is to get people thinking about finding and testing more win-win solutions. His book makes a powerful case for why this is urgently needed.

McNamee's urgent call to action

McNamee offers the perspective of a powerful Facebook and industry insider. A prominent tech VC, he was an early investor and mentor to Zuckerberg -- the advisor who suggested that he not sell to Yahoo, and who introduced him to Sandberg. He became alarmed in early-to-mid 2016 by early evidence of manipulation affecting the UK and US elections, but found that Zuckerberg and Sandberg were unwilling to recognize and act on his concerns. As his concern grew, he joined with others to raise awareness of the issue and work to bring about needed change.

He provides a rich summary of how we got here, most of the issues we now face, and the many prominent voices for remedial action. He addresses the business issues and the broader questions of governance, democracy, and public policy. He tells us: “A dystopian technology future overran our lives before we were ready.” (As also quoted in the strongly favorable NY Times review.)

It's the business model, stupid!

McNamee adds his authoritative voice to the many observers who have concluded that the business model -- in which advertisers pay so that consumers get "free" services -- distorts incentives, causing businesses to optimize for advertisers, not for users:
"Without a change in incentives, we should expect the platforms to introduce new technologies that enhance their already-pervasive surveillance capabilities...the financial incentives of advertising business models guarantee that persuasion will always be the default goal of every design."
He goes on to suggest:
The most effective path would be for users to force change. Users have leverage...
The second path is government intervention. Normally I would approach regulation with extreme reluctance, but the ongoing damage to democracy, public health, privacy, and competition justifies extraordinary measures. The first step would be to address the design and business model failures that make internet platforms vulnerable to exploitation. ...Facebook and Google have failed at self-regulation.
My suggestions on the business model, and related regulatory action

This is where I have novel suggestions -- outlined on my FairPayZone blog, and communicated to McNamee last June -- that have not gotten wide attention, and are ignored in Zucked. These are at two levels.

The auto emissions regulatory strategy. This is a simple, proven regulatory approach for forcing Facebook (and similar platforms) to shift from advertising-based revenue to user-based revenue. That would fundamentally shift incentives from user manipulation to user value.

If Facebook and other consumer platforms fail to do that voluntarily, this simple regulatory strategy could force it -- in a market-driven way. The government could simply mandate that X% of their revenue must come from their users -- with a timetable for gradually increasing X. This is how auto emissions mandates work: don't mandate how to fix things, just mandate a measurable result, and let the business figure out how best to achieve it. Since reverse-metered ads (with a specific credit against user fees) would count as a form of user revenue, that would provide an immediate incentive for Facebook to provide such compensation -- and to begin developing other forms of user revenue. This strategy is outlined in Privacy AND Innovation ...NOT Oligopoly -- A Market Solution to a Market Problem.
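To make the mechanics concrete, here is a minimal sketch of how such an outcome-based mandate might be checked. The timetable, revenue figures, and function names are purely illustrative assumptions, not part of any actual proposal; the point is simply that the regulator sets a measurable target (as with emissions standards) and that reverse-metered ad credits count toward the user-revenue share:

```python
# Hypothetical sketch of an outcome-based "X% user revenue" mandate,
# analogous to auto emissions standards: the regulator fixes a target
# and a timetable, not the mechanism. All figures are illustrative.

# Timetable: minimum share of total revenue that must come from users.
MANDATE_SCHEDULE = {2020: 0.10, 2022: 0.25, 2025: 0.50}

def required_user_share(year: int) -> float:
    """Return the minimum user-revenue share in force for a given year."""
    applicable = [y for y in MANDATE_SCHEDULE if y <= year]
    return MANDATE_SCHEDULE[max(applicable)] if applicable else 0.0

def is_compliant(year: int, ad_revenue: float, direct_user_fees: float,
                 reverse_meter_credits: float) -> bool:
    """Check the mandate. Reverse-metered ad credits (ad/data value credited
    against user fees at agreed rates) count as user revenue, giving the
    platform an immediate incentive to offer them."""
    user_revenue = direct_user_fees + reverse_meter_credits
    total_revenue = ad_revenue + user_revenue
    return user_revenue / total_revenue >= required_user_share(year)

# Example: $50B in ads, $5B in direct fees, $10B credited via reverse metering.
print(is_compliant(2022, ad_revenue=50e9, direct_user_fees=5e9,
                   reverse_meter_credits=10e9))  # False: 15/65 is about 23%, below 25%
```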

The deeper shift to user revenue models. Creative strategies can enable Facebook (and other businesses) to shift from advertising revenue to become substantially user-funded. Zuckerberg has thrown up his hands at finding a better way: "I don’t think the ad model is going to go away, because I think fundamentally, it’s important to have a service like this that everyone in the world can use, and the only way to do that is to have it be very cheap or free."

Who Should Pay the Piper for Facebook? (& the rest) explains this new business model architecture -- with a focus on how it can let Facebook be "cheap or free" for those who get limited value and have limited ability to pay, while still being paid, at fair levels, by those who get more value and are willing and able to pay for it. This architecture, called FairPay, has gained recognition for operationalizing a solution that businesses can begin to apply now.
  • A reverse meter for ads and data. This FairPay architecture still allows advertising to defray the cost of service, but on a more selective, opt-in basis -- by applying a "reverse meter" that credits the value of user attention and data against each user's service fees, at agreed-upon terms and rates. That shifts the game from the advertiser being the customer of the platform to the advertiser being the customer of the user (facilitated by the platform). In that way, advertising is carried only if done in a manner that is acceptable to the user. That aligns the incentives of the user, the advertiser, and the platform. Others have proposed similar directions, but I take it farther, in ways that Facebook could act on now. (A minimal sketch of such a reverse meter appears after this list.)
  • A consumer-value-first model for user revenue. Reverse metering is a good starting place for re-aligning incentives, but Facebook can go much deeper, to transform how its business operates. The simplest introduction to the transformative twist of the FairPay strategy is in my Techonomy article, Information Wants to be Free; Consumers May Want to Pay. (It has also been outlined in Harvard Business Review, and more recently in the Journal of Revenue and Pricing Management.) The details will depend on context, and will need testing to fully develop and refine over time, but the principles are clear and well supported.

    This involves ways to mass-customize the pricing of Facebook, to be "cheap or free" where appropriate, and to set customized fair prices for each user who obtains real value and can be enticed to pay for it. That is adaptive to individual usage and value -- and eliminates the risk of having to pay when the value actually obtained did not warrant it. That aligns incentives for transparency, trust, and co-creation of real value for each user. Behavioral economics has shown that people are willing to pay -- even voluntarily -- when they see good reason to help sustain the creation of value that they actually want and receive. We just need business models that understand and build on that.
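As promised above, here is a minimal sketch of the reverse-meter idea: the advertiser becomes, in effect, the customer of the user, and the agreed value of attention and data is credited against that user's service fee. The fee, credit rates, and field names are illustrative assumptions only, not a specification of FairPay:

```python
# Minimal sketch of a "reverse meter": ad/data value is credited against
# the user's service fee at terms the user has opted into.
# All rates, fields, and the fee itself are illustrative assumptions.

from dataclasses import dataclass

@dataclass
class OptInTerms:
    base_monthly_fee: float      # user's agreed service fee
    credit_per_ad_view: float    # agreed credit per ad actually shown
    credit_per_data_use: float   # agreed credit per permitted data use

def monthly_bill(terms: OptInTerms, ad_views: int, data_uses: int) -> float:
    """Net fee after crediting the value of attention and data the user
    chose to share; never below zero (any surplus could roll over)."""
    credits = (ad_views * terms.credit_per_ad_view
               + data_uses * terms.credit_per_data_use)
    return round(max(0.0, terms.base_monthly_fee - credits), 2)

# Example: a $6/month user who accepts 100 ad views and 20 data uses.
terms = OptInTerms(base_monthly_fee=6.00, credit_per_ad_view=0.03,
                   credit_per_data_use=0.10)
print(monthly_bill(terms, ad_views=100, data_uses=20))  # 1.0 (6.00 less 3.00 + 2.00 in credits)
```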
Bottom line. Whatever the details, unless Facebook shifts direction on its own to aggressively move toward user payments -- which now seems unlikely -- regulatory pressure will be needed to force that (just as with auto emissions). A user revolt might force similar changes as well, but the problem is far too urgent to wait and see.

The broader call -- augmenting the wisdom of crowds

Shifting to a user-revenue-based business model will change incentives and drive significant progress to remedy many of the problems that McNamee and many others have raised. McNamee provides a wide-ranging overview of many of those problems and most of the initiatives that promise to help resolve them, but there, too, I offer suggestions that have not gained attention.

Most fundamental is the power of social media platforms to shape collective intelligence. Many have come to see that, while technology has great power to augment human intelligence, applied badly, it can have the opposite effect of making us more stupid. We need to steer hard for a more positive direction, now that we see how dangerous it is to take good results for granted, and how easily things can go bad. McNamee observes that "We...need to address these problems the old fashioned way, by talking to one another and finding common ground." Effective social media design can help us do that.

Another body of my work relates to how to design social media feeds and filtering algorithms to do just that, as explained in The Augmented Wisdom of Crowds: Rate the Raters and Weight the Ratings:
  • The core issue is one of trust and authority -- it is hard to get consistent agreement in any broad population on who should be trusted or taken as an authority, no matter what their established credentials or reputation. Who decides what is fake news? What I suggested is that this is the same problem that has been made manageable by getting smarter about the wisdom of crowds -- much as Google's PageRank algorithm beat out Yahoo and AltaVista at making search engines effective at finding content that is relevant and useful.

    As explained further in that post, the essence of the method is to "rate the raters" -- and to weight those ratings accordingly. At Web scale, no single rater's authority can be relied on without drawing on the judgment of the crowd. Furthermore, simple equal voting does not fully reflect the wisdom of the crowd -- there is deeper wisdom about those votes to be drawn from the crowd.

    Some of the crowd are more equal than others. Deciding who is more equal, and whose vote should be weighted more heavily, can be determined by how people rate the raters -- and how those raters are rated -- and so on. Those ratings are not universal, but depend on the context: the domain, the community, and the current intent or task of the user. Each of us wants to see what is most relevant, useful, appealing, or eye-opening -- for us -- and perhaps with different balances at different times. Computer intelligence can distill those recursive, context-dependent ratings to augment human wisdom. (A toy sketch of this recursive weighting appears after this list.)
  • A major complicating issue is that of biased assimilation. The perverse truth seems to be that "balanced information may actually inflame extreme views." This is all too clear in the mirror worlds of pro-Trump and anti-Trump factions and their media favorites like Fox, CNN, and MSNBC. Each side thinks the other is unhinged or even evil, and layers a vicious cycle of distrust around anything it says. One of the few promising counters to this vicious cycle seems to be what Cass Sunstein referred to as surprising validators: people one usually gives credence to, but who suggest one's view on a particular issue might be wrong. An example of a surprising validator was the "Confession of an Anti-GMO Activist." That item is readily identifiable as a "turncoat" opinion that might be influential for many, but smart algorithms can find similar items that are more subtle, and tied to less prominent people who may be known and respected by a particular user. There is an opportunity for electronic media services to exploit this insight that "what matters most may be not what is said, but who, exactly, is saying it."
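Here is the toy sketch promised above: a minimal, PageRank-style iteration in which each rater's weight is the weighted average of how other raters rate them, carried to a fixed point. It is an illustration under simplifying assumptions (a single context, made-up ratings, hypothetical names), not the design proposed in my post, which would be context-dependent by domain, community, and task:

```python
# Toy sketch of "rate the raters and weight the ratings": a rating counts
# more when it comes from a highly weighted rater, iterated to a fixed
# point in the spirit of PageRank. A real system would be context-dependent;
# this illustration is not. All data here is made up.

def rater_weights(ratings: dict[str, dict[str, float]],
                  iterations: int = 50, damping: float = 0.85) -> dict[str, float]:
    """ratings[a][b] = how highly rater a rates rater b, in [0, 1]."""
    raters = list(ratings)
    weight = {r: 1.0 / len(raters) for r in raters}      # start with equal votes
    for _ in range(iterations):
        new = {}
        for target in raters:
            # Weight each incoming rating by the current weight of its source.
            score = sum(weight[src] * ratings[src].get(target, 0.0)
                        for src in raters if src != target)
            new[target] = (1 - damping) / len(raters) + damping * score
        total = sum(new.values())
        weight = {r: v / total for r, v in new.items()}   # renormalize to sum to 1
    return weight

# Example: C is rated highly by both A and B, so C's own ratings end up
# counting for more than simple one-person-one-vote would allow.
ratings = {"A": {"B": 0.2, "C": 0.9},
           "B": {"A": 0.3, "C": 0.8},
           "C": {"A": 0.6, "B": 0.5}}
print(rater_weights(ratings))
```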
If and when Facebook and other platforms really care about delivering value to their users (and our larger society), they will develop this kind of ability to augment the wisdom of the crowd. (Similar large-scale ranking technology is already proven in uses for advertising and Google search.) Our enlightened, democratic civilization will disintegrate or thrive, depending on whether they do that.

The facts of the facts. One important point that I think McNamee (like many) misunderstands is his critique that "To Facebook, facts are not absolute; they are a choice to be left initially to users and their friends but then magnified by algorithms to promote engagement." Yes, the problem is that the drive for engagement distorts our drive for the facts -- but the problem is not that "To Facebook, facts are not absolute." As I explain in The Tao of Fake News, facts are not absolute -- we cannot rely on expert authorities to define absolute truth. Human knowledge emerges from an adaptive process of collective truth-seeking, by successive approximation and the application of collective wisdom; it is always contingent on that process, never absolute. That is how scholarship, science, and democratic government work, that is what the psychology of cognition and knowledge demonstrates, and that is what effective social media can help all of us do better.

Other monopoly platform excesses -- openness and interoperability

McNamee provides a good survey of many of the problems of monopoly (or oligopoly) power in the platforms, and of some of the regulatory and antitrust remedies needed to restore the transparency, openness, flexibility, and market-driven incentives required for healthy innovation. These include user ownership of their data and metadata, portability of users' social graphs to promote competition, and audits and transparency of algorithms.

I have addressed similar issues, and I go beyond McNamee's suggestions to emphasize the need for openness and interoperability of competing and complementary services -- see Architecting Our Platforms to Better Serve Us -- Augmenting and Modularizing the Algorithm. This draws on my early career experience watching antitrust regulatory actions relating to AT&T (in the Bell System days), IBM (in the mainframe era), and Microsoft (in the early Internet browser wars).

The wake up call

There are many prominent voices shouting wake-up calls. See the partial list at the bottom of An Open Letter to Influencers Concerned About Facebook and Other Platforms, and McNamee's Bibliographic Essay at the end of Zucked (excellent, except for the omission that I address here).

All are pointing in much the same direction. We all need to do what we can to focus the powers that be -- and the general population -- to understand and address this problem. The time to turn this rudderless ship around is dangerously short, and effective action to set a better direction and steer for it has barely begun. We have already sailed blithely into killer icebergs, and many more are ahead.

---
This is cross-posted from both of my blogs, FairPayZone.com and Reisman on User-Centered Media, which delve further into these issues.


------------------------
More about FairPay

For a full introduction to FairPay see the Overview and the sidebar on How FairPay Works (just to the right, if reading this at FairPayZone.com). There is also a guide to More Details (including links to a video). 

My article in the Journal of Revenue and Pricing Management, "A Novel Architecture to Monetize Digital Offerings" also provides an overview of FairPay (summarized more briefly in the ESADE Knowledge article "Three building blocks to monetize a digital business," and previously in Harvard Business Review, "When Selling Digital Content, Let the Customer Set the Price.").

Even better, read my highly praised book: FairPay: Adaptively Win-Win Customer Relationships.

(FairPay is an open architecture, in the public domain. My work on FairPay is pro-bono. I offer free consultation to those interested in applying FairPay, and welcome questions.)

Thursday, January 3, 2019

2019 New Year's Resolution: Let's Work Together to Invent a Better 2020!

My forecast for 2019: The best way to predict the future is to invent it -- let's work together on inventing a better 2020!

We face two over-arching and related challenges, one in the world of technology, and one in the larger world of enlightened democratic society.

At the broadest level, 2019 promises to be perhaps the worst and most traumatic year in recent American history. My point is not one of politics or policy (I bite my tongue), but of our basic processes of democratic society -- how we all work together to understand the world and make decisions. We now see all too well how much harm technology has done to that -- not by itself, but as an amplifier of the worst in us.

Within that world of technology, many have come to realize that we have taken a wrong turn in building vast and deeply influential infrastructures that are sustained by advertising. That perverts the profit incentive from creating value for we the people, to exploiting us to profit advertisers. The resulting drive for engagement and targeting inherently conflicts with the creation of real value for users and society. We seem not even to be looking very hard for any solution beyond band-aids that barely alter 1) the perverse incentives of advertising, and 2) the failing zero-sum economics of artificial scarcity.

We seem to be at a loss for how to solve these problems at either level. I suggest that is simply a failure of will, imagination, and experimentation that we can all help rectify. Many prominent thought leaders have said much the same. I list some of them, and offer creative suggestions in An Open Letter to Influencers Concerned About Facebook and Other Platforms. I hope you will read it, as well as the related material it links to.

My suggestions are more specific, actionable, and practical. That letter summarizes and links to ideas I have been developing for many years, but which have taken on new urgency. They are well supported, but as yet unproven in their full form. I can't be sure that my solutions will work, but there seems to be growing consensus that the problems are real, even if few have suggested any actionable path to solving them. (I have been a successful inventor and futurist for many decades. I have often been wrong about details of my answers, but I have rarely been far wrong about problems and issues. Very smart and well-informed people think I am on the right track here.)

But whether or not I am right about the solutions, we all have to make it a priority to try to find, test, and refine the best solutions we can to confront these critical problems.

Still, few in technology, business, or government have turned from business as usual to rise to the urgent challenges we now face -- and even those who alert us to these problems seem to have few concrete strategies for effective action.

Please consider the urgency and importance of these issues at both levels, see if my suggestions or those of others resonate -- and add your voice in those directions -- or work to suggest better directions.

If we do not begin to make real progress in 2019, we may face a very dark 2020 and beyond.

If we do begin to turn this ship around, we can recharge the great promise of technology to augment our intellect and our society, and to create a new economics of abundance.

---
This is cross-posted from both of my blogs, FairPayZone.com and Reisman on User-Centered Media, which delve further into these issues.