The New Investor Cliffhanger

There is much to like about Professor Tom Lin’s article The New Investor,1 so I was delighted to accept the UCLA Law Review’s request that I comment thereon.  Lin’s article provides a comprehensive survey of how technology is changing the capital markets and thus, inevitably, presenting new challenges for securities regulation.  Lin writes well and clearly, even about complex legal and technological issues.  The article is exhaustively researched, reflecting a command of literatures from a number of disciplines.  And as a lifelong tech geek and science fiction fan, I was tickled by the article’s many explicit and implicit references to classic science fiction tropes.

Ultimately, however, I came away from the article feeling frustrated.  As I thought about why I felt that way, it occurred to me that I felt the same way when I finished George R.R. Martin’s book A Dance With Dragons.  When a 1040-page book that is the fifth in a series ends on a cliffhanger, one is—I think—entitled to be a tad peeved.  Although Professor Lin’s article is not quite so lengthy as A Dance With Dragons, by my count it leaves the reader with not just one but eleven cliffhangers.

Lin’s article thus called to mind my former University of Illinois College of Law colleague John Nowak’s advice that the way to become a successful professor is to “[t]ake an obscure little problem that no one has thought much about, blow it out of all proportion, and solve it, preferably several times, in prestigious law reviews.”2  Unfortunately, other than publishing in a prestigious law review, that is not advice Lin chose to follow in The New Investor.

Let us start with Nowak’s recommendation that one select “an obscure little problem.”  Lin violated all three of the precepts contained therein.  Instead of one problem, he chose many.  Instead of an obscure problem, he chose very important ones.  Instead of a little problem, he chose very big ones.

The issues that Professor Lin raises in his article include:

(1)   We are all cyborgs now.3  This well-established social trope strikes me as the wrong metaphor.  When we think of cyborgs, we think of organisms that are part man and part machine, such as the Borg from Star Trek or Darth Vader from Star Wars.4  The claim that we are all cyborgs seems demonstrably false when applied to investing.  As of 2009, retail investors directly owned only 38 percent of U.S. equities.5  Increasingly, ordinary investors participate in the capital markets indirectly through pension and mutual funds.6  They simply do not avail themselves of the sort of high technology trading with which Lin is concerned.

(2)   The human–supercomputer relationship is fraught with hazards.7  This is true but hardly novel.  To the contrary, this trope has been a staple of science fiction for decades.8  Before there was The Matrix9 there was Skynet.10  Before there was Skynet there was Colossus.11  And so on back to The Machines.12

(3)   Instead of the dystopian future associated by many with the rise of the machines, everything in fact will turn out okay.13  This is a classic inverted trope,14 which flips around the more common “bad future” trope.15

(4)   Computers are really important.  We all know law review articles are too long.  As Judge Posner put it, “[t]he result of the system of scholarly publication in law is that too many articles are too long, too dull, and too heavily annotated.”16  Mea culpa.  But I am afraid that Professor Lin has fallen into the same trap.  Consider, for example, Part I.A of the article, in which Lin traces the rise of the machines in society.17  These four pages could easily have been boiled down to the single sentence with which this numbered paragraph began.

(5)   Computers are really, really important.  In addition to being too long, law review articles often suffer from grandiose ambition, which detracts from their utility for end users such as practitioners, judges, legislators, and regulators.18  Part V of Lin’s article is a case in point.  Lin takes on the question of whether technological advances “necessitate the fall of humans in society and finance,” concluding that they do not.19  This is the question asked in the countless works exploring the “computer is your friend” trope.20  I have no doubt that it is also a question worthy of serious philosophical musing, but I am not convinced that an article on regulating the impact of technology on capital markets is the right venue for it.

(6)   Algorithmic and high-frequency trading have the potential to make markets more efficient, but they also can produce market failures, such as the infamous flash crash of May 2010.21  Of course, this is a huge problem urgently needing attention.22

(7)   The computerization of finance requires us to rethink the legal concept of the reasonable investor.23

(8)   Capital markets are subject to the same sort of cyber crimes, warfare, and sabotage as other aspects of our networked world.24

(9)   Hyperspeed trading in dark pools and other electronic markets occurs too fast for regulators to intervene to protect markets and investors.25  Again, this is a huge problem urgently needing attention.26

(10)   The computerization of capital markets has been accompanied by the development of vast interlocked networks, which Lin asserts are “too linked to fail.”27

(11)   Regulation is unable to keep up with technological change.28  This is a specific application of a well-understood problem.  In the context of climate change, for example, Jonathan Adler observes that “[e]ven if regulators were able to identify a proper target initially, the regulatory process changes so slowly that regulatory standards would be unlikely to keep up with technological change or account for new information.”29  In the case of securities regulation, Robert Ahdieh has noted that, “in a time of rapid technological change and transition more generally,” one should not place too much confidence in “any SEC prediction of optimally efficient market structures.”30  Indeed, Jon Macey has gone so far as to argue that “exogenous technological changes . . . have obviated any public interest justification for the SEC that may have existed.”31

In sum, Professor Lin flags a substantial number of significant problems caused by the wide-ranging impact of technology on capital markets and securities regulation.  Unfortunately, he has not yet solved any of them.

Consider, for example, Lin’s treatment of flash crashes.  His discussion ends by noting:

While no other major crash has occurred since the Flash Crash, experts and regulators fear that it is only a matter of time before the “Big One.”  And in the interim, smaller market disruptions have grown and will likely continue to grow more prevalent as cy-fi advances and proliferates.32

Yet, there is no discussion of how regulators might respond.

The discussion of how cyber crimes threaten capital markets concludes with the observation that “[c]ybersecurity prevention and protection efforts are undoubtedly difficult, but they must also be sensible, thoughtful, and not obstruct the promise and progress of cyborg finance.”33  Lin therefore urges that the endeavor to balance these competing concerns “must be pursued vigorously because, ultimately, technological advances in finance may hold more promise than threat in the future.”34  The reader is left wondering: (1) Assuming all turns out well, how do we effect such a balance, and (2) what happens if the future of computerized finance looks more like Skynet than HOLMES IV?35

The section on how computer networks have created the linked problems that Lin calls “too fast to save” and “too linked to fail” ends by opining that “[h]arnessing the power of cy-fi’s speed and linkage while managing its risks will be a critical challenge for financial regulators in the coming years.”36  How regulators might approach—let alone resolve—those challenges is a question to which Lin provides no answer.  Likewise, the question of how the SEC and other financial regulators should deal with technological change is left unanswered.  We are told that “[l]aw needs to better situate itself at the intersection of technology and finance in order to remain relevant and effective.”37  But we are not told how law can do so.

Lin’s failure to provide solutions is particularly frustrating when one comes to his discussion of the implications technological change has for the legal definition of the “reasonable investor.”  Here was a “problem that no one has thought much about,” albeit not a little one, crying out for a solution.  I left Lin’s analysis fully persuaded that technological changes and new insights from behavioral economics require us to rethink what it means to be a reasonable investor.  But I also left Lin’s analysis with no idea of how courts might go about reframing the existing legal definition to take cognizance of those developments.  Telling the reader that “regulators need to become more mindful of the dynamism and realism of the new investor model if they hope to remain relevant” is not a solution.38

What makes Lin’s failure to offer a solution to this problem even more frustrating is that this is one context in which his interest in broader social questions may well prove relevant.  In United States v. Sayre,39 the Ninth Circuit held that a district court had not abused its discretion by refusing to instruct the jury in a securities fraud case as to the meaning of “reasonable investor.”40  The appellate court explained that “[t]he term ‘reasonable investor’ is a concept within the jury’s ordinary experience and understanding.”41  But if Lin is right about technology changing what it means to be a reasonable investor, and I am right that most investors are not cyborgs, will it remain the case that the everyday experiences of ordinary jurors will equip them to decide questions of materiality?  Put another way, will Wall Street’s Skynets be able to find a jury of their peers?

* * *

In sum, if one wants a broad survey of technology and capital markets from which one could mine multiple problems and then proceed to solve them, “preferably several times, in prestigious law reviews,” Lin’s article will serve the purpose admirably.  Indeed, in closing, I urge Lin himself to do so.  His article demonstrates a command of several relevant literatures that will give him a launching platform from which to move to detailed analyses of the problems he has identified.  Just skip the cliffhangers.



  1. Tom C.W. Lin, The New Investor, 60 UCLA L. Rev. 678 (2013).
  2. John E. Nowak, Woe Unto You, Law Reviews!, 27 Ariz. L. Rev. 317, 320 (1985) (internal quotation marks omitted).
  3. See Lin, supra note 1, at 703 (opining that “we are all cyborgs”).
  4. See Cyborg, Wikipedia, http://en.wikipedia.org/wiki/Cyborg (last visited Apr. 24, 2013) (“A cyborg, short for ‘cybernetic organism,’ is a being with both organic and cybernetic parts.”).
  5. Jill E. Fisch, Securities Intermediaries and the Separation of Ownership From Control, 33 Seattle U. L. Rev. 877, 879 (2010).
  6. See John C. Bogle, Reflections on “Toward Common Sense and Common Ground?, 33 J. Corp. L. 31, 31 (2007) (stating that financial intermediaries held approximately 74 percent of the stock of U.S. corporations).
  7. See Lin, supra note 1, at 680 (“The end is near for the human investor.  Computers have changed everything.” (footnote omitted)).
  8. See The Computer Is Your Friend, TvTropes.org, http://tvtropes.org/pmwiki/pmwiki.php/Main/TheComputerIsYourFriend (last visited Apr. 24, 2013) (discussing the classic trope in which “artificially-intelligent computer programs . . . rule mankind with an iron fist (literally)”).
  9. See The Matrix, Wikipedia, http://en.wikipedia.org/wiki/The_Matrix (last visited Apr. 24, 2013) (describing the Matrix computer simulation in the Wachowski Brothers film of the same name).
  10. See Skynet (Terminator), Wikipedia, http://en.wikipedia.org/wiki/Skynet_(Terminator) (last visited Apr. 24, 2013) (describing the fictional artificial intelligence system that is the principal villain in the Terminator film series).
  11. See Colossus: The Forbin Project, Wikipedia, http://en.wikipedia.org/wiki/Colossus:_The_Forbin_Project (last visited Apr. 24, 2013) (describing the advanced supercomputer that assumes world control in a Joseph Sargent film based on a Dennis Jones novel).
  12. See The Evitable Conflict, Wikipedia, http://en.wikipedia.org/wiki/The_Evitable_Conflict (last visited Apr. 24, 2013) (describing an Isaac Asimov short story in which powerful positronic computers attempt to take control of humanity).
  13. See Lin, supra note 1, at 732 (“Human ingenuity in persuasion, culture, spirit, and emotion—in the matters that are difficult to capture with data but nonetheless important—are all key ingredients that must be accounted for in any successful enterprise, financial or otherwise.”).
  14. See Inverted Trope, TvTropes.org, http://tvtropes.org/pmwiki/pmwiki.php/Main/InvertedTrope (last visited Apr. 24, 2013) (defining the inverted trope as “[a] particular form of creatively reusing an existing trope: The trope is turned exactly on its head”).
  15. See Bad Future, TvTropes.org, http://tvtropes.org/pmwiki/pmwiki.php/Main/BadFuture (last visited Apr. 24, 2013) (“Good Future variations are practically unknown, since knowing that the future is going to turn out okay removes any dire need for the characters to change things in the present.”).
  16. Richard A. Posner, Against the Law Reviews, Legal Aff., Nov./Dec. 2004, http://www.legalaffairs.org/issues/November-December-2004/review_posner_novdec04.html.
  17. Lin, supra note 1, at 684–87.
  18. See James W. Ely, Jr., Through a Crystal Ball: Legal Education—Its Relation to the Bench, Bar, and University Community, 21 Tulsa L.J. 650, 654 (1986) (“Law Reviews are filled with . . . grandiose reflections about political philosophy, legal history, and social order, topics with scant interest for busy practitioners.” (footnote omitted)).
  19. Lin, supra note 1, at 727.
  20. See supra note 8 (describing that trope).
  21. Lin, supra note 1, at 703–06.
  22. See, e.g., Austin J. Sandler, The Invisible Power of Machines: Revisiting the Proposed Flash Order Ban in the Wake of the Flash Crash, 2011 Duke L. & Tech. Rev. 39 (discussing how technological innovations such as high-frequency trading can make markets more efficient but also can produce serious regulatory concerns).
  23. Lin, supra note 1, at 693–703.
  24. Id. at 706–10.
  25. See id. at 711–14 (discussing what Lin calls the problem of trading that is “too fast to save”).
  26. See, e.g., Matt Prewitt, Note, High-Frequency Trading: Should Regulators Do More?, 19 Mich. Telecomm. & Tech. L. Rev. 131 (2012) (explaining how high-frequency trading presents difficult regulatory problems).
  27. Lin, supra note 1, at 714–16.
  28. Id. at 717–22.
  29. Jonathan H. Adler, Eyes on a Climate Prize: Rewarding Energy Innovation to Achieve Climate Stabilization, 42 Envtl. L. Rep. News & Analysis 10,713, 10,716 (2012).
  30. Robert B. Ahdieh, Law’s Signal: A Cueing Theory of Law in Market Transition, 77 S. Cal. L. Rev. 215, 283 (2004).
  31. Jonathan R. Macey, Administrative Agency Obsolescence and Interest Group Formation: A Case Study of the SEC at Sixty, 15 Cardozo L. Rev. 909, 949 (1994).
  32. Lin, supra note 1, at 706 (footnote omitted).
  33. Id. at 710 (footnote omitted).
  34. Id.
  35. HOLMES IV is the supercomputer that aids the lunar rebellion in Robert Heinlein’s novel The Moon Is a Harsh Mistress.  See The Moon Is a Harsh Mistress, Wikipedia, http://en.wikipedia.org/wiki/The_Moon_Is_a_Harsh_Mistress (last visited Apr. 24, 2013) (describing the novel’s plot).
  36. Lin, supra note 1, at 716.
  37. Id. at 721–22.
  38. Id. at 703.
  39. 434 F. App’x 622 (9th Cir. 2011).
  40. Id. at 624.
  41. Id.