By Tyler Jang. November 18, 2018.
Trust is rapidly becoming one of the most valuable assets in today's digital world. It shapes financial transactions, underpinning the ebbs and flows of both the stock market and Amazon customer reviews; it impacts what route you will take to get to work, incorporating traffic reports and user data; and it dictates what videos you are willing to watch on YouTube. Last Monday, Jeffrey Ritter, an external lecturer at the University of Oxford and a Duke Law alumnus, spoke to the Duke University Program in American Grand Strategy and outlined his model for trust and its implications for cyber. At the core of Ritter's analysis are several fundamental principles. He outlines them as follows:
“Trust is not an emotion—trust decisions are rules-based.” Ritter has spearheaded the recent movement to redefine trust as a calculation, rather than a feeling. We evaluate data, whether sensory or secondary, against sets of rules to determine whether we will trust information. This model seeks to calculate a value for trust, and it coincides with researchers’ work to quantify trust between different systems, with applications such as coordinating self-driving cars operating independently on the same road.
“Trust decisions are fueled by trusted information.” Whether from our own sensory input or from a secondhand source, we take in data to develop new trust evaluations. The proportion of these data that are digital has been steadily increasing, and as systems make more decisions automatically, it is essential that we evaluate data inputs accordingly.
“To trust digital information, you must first trust its source.” The acquisition of credible sources is critical to the integrity of academic research and everyday internet use alike. In this manner, trust develops a recursive quality, reinforcing other sources of information. When incorrect data is trusted and incorporated into a model, the error propagates. To maintain a responsible and rigorous evaluation of trust, one must place greater weight on carefully vetted sources.
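The idea that trust is a rules-based calculation weighted toward vetted sources can be sketched as a toy computation. This is my own illustration, not Ritter's formalism: the report structure, weights, and threshold rule here are all hypothetical.

```python
# Toy illustration of a rules-based trust calculation (an assumption for
# illustration, not Ritter's actual model). Each report about a claim is a
# (value, source_weight) pair: value in [0, 1] indicates support for the
# claim, and source_weight reflects how rigorously vetted the source is.

def trust_score(reports):
    """Weighted average of report values, weighted by source credibility."""
    total_weight = sum(w for _, w in reports)
    if total_weight == 0:
        return 0.0
    return sum(v * w for v, w in reports) / total_weight

def decide(reports, threshold=0.7):
    """Rule: trust the claim only if the weighted score clears a threshold."""
    return trust_score(reports) >= threshold

# A rigorously vetted source (weight 0.9) supports the claim;
# an unvetted source (weight 0.2) contradicts it.
reports = [(1.0, 0.9), (0.0, 0.2)]
print(round(trust_score(reports), 2))  # 0.82
print(decide(reports))                 # True
```

The point of the sketch is simply that the decision follows mechanically from rules and source weights, with no "feeling" involved: changing the weights or the threshold changes the outcome in a predictable, auditable way.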
“Trust is a rules-based calculation, fueled by information acquired from trusted sources.” This statement encapsulates the entirety of Ritter’s argument and its widespread implications.
Lastly, Ritter posits his Velocity Principle: “The velocity of information is proportional to the transparency of its governance.” When the path and manipulation of digital information are made clear to the user, trust is calculated much more efficiently, allowing for the further spread of the relevant information. The converse is also true in systems where information is neither freely available nor easily scrutinized: the flow of information grinds to a halt, and the public is left uninformed. At this juncture, the age-old debate resumes, as competing desires for transparency and privacy dominate both the market and public discourse.
Today, it is common to operate with an assumption of trust in digital systems. Ritter urges us to consider the implications of this assumption in the case of a hypothetical crime. Consider a car accident, for example, in which a collision is witnessed by two pedestrians as well as by two CCTV cameras at the intersection. In a court of law, the cameras and the cars’ black boxes take precedence over the witnesses’ accounts. This is understandable, but the digital accounts are not infallible. Manipulated videos and deepfakes are becoming frighteningly convincing and prevalent. It is essential, in this world of manipulation and untrustworthy information, that we reevaluate our trust in digital systems and the data they produce, rather than blindly accepting their integrity.
Fortunately, Ritter notes, recent media coverage of fake news scandals, manipulated video, and even potentially discriminatory algorithms has brought digital trust into the public spotlight, forcing us to question our faith in digital information. Our reliance on digital media will only continue to grow, even as its fidelity remains uncertain. Thus, Ritter concludes, in order to take a more active role in our continual consumption of information, it is imperative that we incorporate a more rigorous and intentional evaluation of trust into our digital lives.
Tyler Jang is a first-year majoring in electrical and computer engineering and a member of the Duke Cyber Team.