Software Patents
Medical and Biotechnology Patents
Patent Theory
Antitrust Principles in Information Technology
Antitrust and Industrial Organization
Cyberlaw
Quantitative Legal Analysis
Administrative Law
Information Technology in the Legal System
Asian American Studies
Parallel Algorithms and Architectures
Boolean Function Complexity

Software Patents
Ghost in the New Machine: How Alice Exposed Software
Patenting's Category Mistake, 16 N.C. J. L. & Tech. 623 (2015)
The Alice Court's characterization of computer programming
has effectively repudiated, inter alia, the doctrine that
programming a general-purpose computer creates a patent-eligible
"new machine." This Article revisits In re Bernhart,
the first holding based on the "new machine" principle,
concluding that the Court of Customs and Patent Appeals committed a
category mistake in conducting its nonobviousness analysis. This
suggests that § 101 has a unique role to play in ensuring the
analytical coherence of the other tests for patentability, and that
step two of the Mayo/Alice test could helpfully enforce the
doctrinal distinction between a patent-eligible "method or
means" and an unpatentable "result or effect."
Alappat Redux: Support
for Functional Language in Software Patent Claims, 66 SMU L. Rev.
491 (2013)
The Federal Circuit has suggested in some recent cases that any
algorithm can serve as adequate structural support for a
means-plus-function element in a software patent claim under §
112(f). A recent proposal by Mark Lemley fully endorses this
proposition and seeks its broader application. The concept of an
algorithm, however, is too slippery to serve as the basis for such a
rule. In this Article, I argue that this overreliance on the
algorithm concept originated in a revisionist gloss on the Federal
Circuit's 1994 Alappat decision. Informed by a closer reading
of what Alappat actually has to say about claim construction
under § 112(f), I propose a more stable "concrete
causation" standard that is applicable to all technologies, but
would be especially well-aligned with the reforms in the software
field intended by Lemley's proposal.
Computational Complexity and the Scope of Software Patents,
39 Jurimetrics 17 (1999)
This article proposes that the reverse doctrine of equivalents
should allow as a defense to software patent infringement those
improvements in computational complexity that are superlinear in the
parameters of the problem solved by the underlying algorithm, and
presents four independent rationales for such an approach.
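To make the threshold concrete, here is a minimal sketch, not drawn from the article (which frames the test in terms of the parameters of the problem solved by the underlying algorithm), that uses SymPy to test whether replacing one running-time function with another yields an improvement factor growing superlinearly in a single parameter n.

```python
import sympy as sp

n = sp.symbols('n', positive=True)

def improvement_is_superlinear(old, new):
    """Check whether the improvement factor old/new grows faster than linearly in n."""
    return sp.limit(old / new / n, n, sp.oo) == sp.oo

print(improvement_is_superlinear(2**n, n**2))          # True: exponential -> polynomial
print(improvement_is_superlinear(n**3, n**2))          # False: factor grows only linearly
print(improvement_is_superlinear(n**2, n*sp.log(n)))   # False: factor n/log n is sublinear
```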
On Abstraction and Equivalence in Software Patent Doctrine: A
Response to Bessen, Meurer and Klemens, 16 J. Intell. Prop. L. 197
(2009)
Recent books by Professors James Bessen and Michael Meurer and by
economist Ben Klemens have argued that software warrants
technology-specific treatment in patent doctrine. This article
argues that the authors' categorical claims about software are
unsupported by computer science, and therefore cannot support their
sweeping proposals regarding software patents as a matter of law.
Such proposals remain subject to empirical examination and
critique as policy choices, and are unlikely to be achieved through
judicially developed categorical distinctions.

Medical and Biotechnology Patents
Surgically Precise But Kinematically
Abstract Patents, 55 Hous. L. Rev. 268 (2017)
This article critically examines kinematically abstract claims in the U.S. surgical robotics industry, where claims purporting to cover all mechanisms exhibiting a specific kinematic property are widespread. First, it describes the role of patents and kinematic claiming in Intuitive Surgical’s emergence as the industry’s monopolist in 2003 and in some of the subsequent challenges the company has faced from competing innovators and patent owners. Second, it draws on results from physics and geometry to explain why kinematically abstract claims logically fall under longstanding doctrinal exclusions of mathematical theorems and
abstract ideas from patent-eligible subject matter. Finally, it examines the patent-eligibility of a claimed surgical manipulator whose design incorporates kinematic data captured from procedures performed by kinesthetically skilled surgeons. From this case study, broader questions emerge about the kinds of progress and skill that fall within the patent system’s ambit, with further consequences for the political economy
of labor and downstream innovation in the age of automation.
Gene Probes as Unpatentable Printed Matter, 20 Fed. Cir. B.J. 528 (2011)
This article argues that the printed matter doctrine is applicable
to DNA oligonucleotide molecules because they are disposed to
store nucleotide sequence information in a manner analogous in all
relevant respects to other substrates that may be more intuitively
recognizable as information storage media, such as laser-printed
text on paper.
Artful Prior Art and the Quality of DNA
Patents, 57 Ala. L. Rev. 975 (2006)
This article argues that a
focus on disclosed molecular structure in evaluating DNA patent
claims has resulted in a significant discrepancy between the prior
art that is available to the patent system and the scientific
community’s understanding of the state of the art. To illustrate and
address this problem, this article presents an example of an
“artfully drafted” prior art reference: a digital document [On the Preparation and Utilization of Isolated and
Purified Oligonucleotides, CD-ROM (2002)] that discloses the
sequences of 11 million oligonucleotides (short DNA molecules) and
methods of making and using each, which was derived from the
previous scientific literature without further inventive skill, and
has now been cited in more than 30 pending prosecutions.
Research in the Shadow of DNA Patents, 87 J. Pat. & Trademark
Off. Soc'y 846 (2005)
In recent years, the Federal Circuit and the Patent Office have
characterized the legal doctrines governing the patentability of DNA
molecules as essentially settled. This Article argues that the
factual premises underlying those doctrines are increasingly being
undermined by ongoing developments in biotechnology. Specifically,
it may soon be possible to demonstrate that the patenting of DNA
molecules retards the identification and sequencing of so many other
useful DNA molecules that patent-driven DNA research is a
self-defeating enterprise. To this end, this Article provides
quantitative evidence of the preclusive effects of DNA patenting on
specific laboratory procedures in genetic research.

Patent Theory
Surgically
Precise But Kinematically Abstract Patents, 55 Hous. L. Rev. 268
(2017)
This Article critically examines kinematically abstract claims in
the U.S. surgical robotics industry, where claims purporting to
cover all mechanisms exhibiting a specific kinematic property are
widespread. First, it describes the role of patents and kinematic
claiming in Intuitive Surgical’s emergence as the industry’s
monopolist in 2003 and in some of the subsequent challenges the
company has faced from competing innovators and patent owners.
Second, it draws on results from physics and geometry to explain
why kinematically abstract claims logically fall under
longstanding doctrinal exclusions of mathematical theorems and
abstract ideas from patent-eligible subject matter. Finally, it
examines the patent-eligibility of a claimed surgical manipulator
whose design incorporates kinematic data captured from procedures
performed by kinesthetically skilled surgeons. From this case
study, broader questions emerge about the kinds of progress and
skill that fall within the patent system’s ambit, with further
consequences for the political economy of labor and downstream
innovation in the age of automation.
The
Ontological Function of the Patent Document, 74 U. Pitt. L. Rev.
263 (2012)
With the passage and impending implementation of the “first-to-file”
provisions of the America Invents Act of 2011, the U.S. patent
system must rely more than ever before on patent documents for its
own ontological commitments concerning the existence of claimed
kinds of useful objects and processes. This Article provides a
comprehensive description of the previously unrecognized function
of the patent document in incurring and securing warrants to these
ontological commitments, and the respective roles of legal
doctrines and practices in the patent system’s ontological
project. Among other contributions, the resulting metaphysical
account serves to reconcile competing interpretations of the
written description requirement that have emerged from the Federal
Circuit’s recent jurisprudence, and to explain why the patent
system is willing and able to examine, grant and enforce claims
reciting theoretical entities. While this Article is entirely
descriptive, it concludes by identifying promising normative and
prescriptive implications of this work, including the formulation
of an appropriate test for the patent-eligibility of
software-implemented inventions in the post-Bilski era.
The Elusive "Marketplace" in Post-Bilski
Jurisprudence, 34 Campbell L. Rev. 663 (2012)
The Supreme Court's 2010 decision in Bilski v. Kappos
appears to have provided inadequate guidance to the courts and the
Patent Office regarding the scope of the abstract-ideas exclusion
from patentable subject matter. Federal Circuit Chief Judge
Randall R. Rader, however, appears to have found in that decision
a clear vindication of his own view that the
machine-or-transformation test is incorrectly grounded in
"the age of iron and steel at a time of subatomic particles
and terabytes," and thus fails, for example, to accommodate
advances in "software [that] transform[] our lives without
physical anchors." Chief Judge Rader has subsequently
authored a series of opinions identifying the
"marketplace" as an operational context in which a
claimed invention is not likely to be unpatentably abstract.
This article argues that this reliance on the
"marketplace" is untenable and should form no part of
patent-eligibility doctrine.

Antitrust Principles in Information Technology
Antitrust Analysis in Software Product
Markets: A First Principles Approach, 18 Harv. J. L. & Tech. 1
(2004)
This article argues that antitrust analysis in the
software industry should proceed from the understanding that
software products confer intellectual property rights and
technological capabilities incident to a copy of the vendor’s
software code, and are not comprised of the software code itself.
The article shows how to use this more precise understanding of a
software product in defining the relevant product market, a central
inquiry in antitrust analysis. These methods are illustrated with
discussions of the Syncsort and Grokster cases.
Decoding Microsoft: A First Principles Approach, 40 Wake
Forest L. Rev. 1 (2005)
Applying the methods developed in “Antitrust Analysis in Software
Product Markets,” this article argues that the courts, the
litigating parties, and most commentators misconstrued the Microsoft
tying claim by relying on the inaccurate intuition that the
allegedly tied software products were comprised of software code.
The article reviews the complex litigation history of the tying
claim, pointing out where these errors occurred. The article then
reexamines the tying claim under each of the three proposed
alternative approaches (the Jefferson Parish, “facially plausible
benefits” and rule of reason standards), and concludes that the
factual record on remand to Judge Kollar-Kotelly would have
supported tying liability and that Microsoft now enjoys
illegitimately acquired monopoly power in the market for Web browser
software products.
Installed
Base Opportunism and the Scope of Intellectual Property Rights in
Software Products, 10 Wake Forest Intell. Prop. L.J. 323 (2010)
In contrast to the D.C. Circuit’s dismissal of Microsoft’s
copyright counterclaims, antitrust challenges to IBM’s current
mainframe licensing practices thus far have encountered broad
judicial deference to IBM’s patent rights. The purpose of this
Article is to analyze and critique these contrasting approaches
and to situate the current litigation and investigation involving
IBM in the still-unsettled doctrinal context at the intersection
of intellectual property and antitrust law.
A Case of Insecure Browsing: Exploring Missed Opportunities in
the Microsoft Antitrust Suit, Raleigh News & Observer, Sept. 30,
2004
Published as the deadline for certiorari passed, officially ending the
Microsoft litigation, this op-ed argues that the failure to redress
Microsoft's tying conduct has resulted in serious security hazards and
wasted judicial resources.

Antitrust and Industrial Organization
An
Anti-Competitive Get Together, Raleigh News & Observer, Oct. 4,
2011 (with Barak Richman)
Determining how an acquisition or a merger will affect the larger
economy is routinely complicated and often invites vigorous debate.
But the proposed AT&T-T-Mobile deal presents an easy case.
Analyzing
Mergers in Innovation Markets, 38 Jurimetrics 119 (1998)
This article presents a probabilistic framework for the analysis of
mergers in innovation markets, and shows that this technique is
preferable to the current approach in four ways. First, it
more thoroughly accounts for the uncertainty inherent in definitions
of innovation markets and in allegations of anticompetitive effects
in innovation markets. Second, it more accurately measures the
structural effects of a merger on a market that faces the prospect
of technological change. Third, it separates the fact-specific
allegation that a merger will reduce the probability of successful
innovation from the controversial general proposition that mergers
hamper innovation. Finally, it construes innovation markets in
a way that clearly falls within the cognizance of Section 7 of the
Clayton Act.
Antitrust By Chance: A Unified Theory of Horizontal Merger
Doctrine, Note, 106 Yale L.J. 1165 (1997)
This note reinterprets the federal antitrust agencies’ Horizontal
Merger Guidelines as an internally consistent system of statistical
inference that accounts for the dynamic behavior of market
structure.

Cyberlaw
Differential Privacy as a Response to
the Reidentification Threat: The Facebook Advertiser Case Study,
90 N.C. L. Rev. ___ (2012) (with Anne Klinefelter)
This article uses a reverse-engineering approach to infer that
Facebook appears to be using differential privacy-supporting
technologies in its interactive query system to report audience
reach data to prospective users of its targeted advertising
system, without apparent loss of utility. This case study
provides an opportunity to consider criteria for identifying
contexts where privacy laws might benefit from the adoption
of a differential privacy standard similar to that apparently met
by Facebook's advertising audience reach database. United
States privacy law is a collection of many different sectoral
statutes and regulations, torts, and constitutional law, and some
areas are more amenable to incorporation of the differential
privacy standard than others. This Article highlights some
opportunities for recognition of the differential privacy standard
as a best practice or a presumption of compliance for privacy,
while acknowledging certain limitations on the transferability of
the Facebook example.
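For readers unfamiliar with the standard, the following sketch shows the textbook Laplace mechanism for a counting query, the simplest way a system can satisfy ε-differential privacy. It is purely illustrative; the article infers only that Facebook's audience-reach system appears to use differential privacy-supporting technology, not that it works this way, and the example count is invented.

```python
import numpy as np

rng = np.random.default_rng()

def laplace_count(true_count, epsilon):
    """Release a count under epsilon-differential privacy by adding Laplace
    noise calibrated to the query's sensitivity (1 for a counting query)."""
    sensitivity = 1.0
    return true_count + rng.laplace(loc=0.0, scale=sensitivity / epsilon)

# A hypothetical audience-reach style query: "how many users match these
# targeting criteria?" Smaller epsilon means stronger privacy and more noise.
true_reach = 48_213
for eps in (0.1, 1.0, 10.0):
    print(f"epsilon={eps}: reported reach ~ {laplace_count(true_reach, eps):.0f}")
```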
Opening Up Facebook's
Privacy Technology, Raleigh News & Observer, May 18, 2012
If the reverse-engineering study reported in Differential
Privacy is correct about the kind of privacy technology
Facebook is using, there is no reason for Facebook to keep its
details secret.
Making the World Wide Web Safe for
Democracy: A Medium-Specific First Amendment Analysis, 19 Hastings
Comm. & Ent. L.J. 309 (1997)
This article provides theoretical
and empirical analyses of the impact of the Web’s hyperlinked
architecture on the structure of democratic discourse, and argues
that the First Amendment does not foreclose redistributive,
content-neutral regulation of media power on the Internet.

Quantitative Legal Analysis
The
Signature of Gerrymandering in Rucho v. Common Cause, 70
S.C. L. Rev. 1241 (2019) (with Jonathan Mattingly &
Gregory Herschlag)
In recent years, the U.S. mathematical community has been
directing unprecedented attention to the problem of partisan
gerrymandering, aided by computational advances and spurred by
litigation challenging the spate of extreme partisan redistricting
that followed the 2010 census. As North Carolina scholars who have
been involved in the landmark Rucho v. Common Cause
litigation, we have written this Article with the threefold aim of
explaining how the expert analysis of North Carolina's
congressional map was performed, how it was used to substantiate
the plaintiffs' claims at trial and on remand, and crucially, how
it may serve to address the justiciability concerns that have long
attended the Supreme Court's partisan gerrymandering jurisprudence
and have represented the legal context for our work.
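The core of such an analysis is a comparison between the enacted map and a large ensemble of computer-generated, nonpartisan plans. The sketch below uses made-up stand-in numbers (a binomial draw in place of the experts' MCMC-sampled plans and real vote data) solely to show that comparison step; it is not the North Carolina analysis itself.

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in ensemble: Democratic seat counts (out of 13 districts) that one
# set of historical votes would produce under each of 20,000 hypothetical
# nonpartisan plans. The real analysis samples plans with MCMC; a binomial
# draw is used here only as a placeholder.
ensemble_seats = rng.binomial(n=13, p=0.45, size=20_000)
enacted_seats = 3   # hypothetical outcome under the enacted map

# How atypical is the enacted outcome relative to the nonpartisan ensemble?
frac_as_extreme = np.mean(ensemble_seats <= enacted_seats)
print(f"share of ensemble plans yielding <= {enacted_seats} seats: "
      f"{frac_as_extreme:.4f}")
```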
The
Learned Hand Unformula for Short-Swing Liability, 91 Wash. L. Rev.
1523 (2016)
Section 16(b) of the Securities Exchange Act of 1934 allows for
the recovery of short-swing profits realized by certain insiders
from trading in a corporation’s stock within a period of less
than six months. Three generations of corporate law students have
been taught the “lowest-in, highest-out” formula that is
intended to maximize the disgorgement of short-swing profits under
section 16(b). Arnold Jacobs’s 1987 treatise presented two
hypothetical examples where the formula fell short of the intended
maximum, but courts, commentators, and practitioners have largely
ignored these theoretical challenges to the formula’s validity.
This Article identifies Gratz v. Claughton as the first
reported real-world example of the formula’s failure.
Ironically, Gratz has been taught and cited for more than
sixty years as a leading authority for the formula’s use, not
least because of its distinguished author, Judge Learned Hand.
This Article argues that Gratz has been misunderstood and
that Hand wisely adjudicated this complex case without prescribing
or endorsing the formula in any way. It also shows that the
formula has no need of Gratz’s endorsement, as long as
the formula is correctly interpreted as limited to simpler cases
where it is mathematically valid. It formalizes and extends Jacobs’s
results by showing that the formula may fall short of the maximum
by up to fifty percent when misused in more complex cases, and has
actually fallen short in another more recent case. Finally, it
provides online tools to enable practitioners and judges to
calculate short-swing liability correctly in all cases.
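A minimal numerical illustration of how the formula can fall short: in the toy model below (single-share lots, a flat 182-day window, gains only), the greedy lowest-in, highest-out pairing recovers less than a brute-force search over all permissible matchings. The dates and prices are invented and far simpler than the facts of Gratz or Jacobs's examples; the point is only that maximizing recovery is a matching problem, not a sorting rule.

```python
from datetime import date
from itertools import permutations

SIX_MONTHS = 182   # days; a rough stand-in for the statutory six-month window

# Invented single-share transactions (date, price per share).
purchases = [(date(2019, 12, 20), 10.0), (date(2020, 1, 10), 5.0)]
sales     = [(date(2020, 6, 1), 25.0),  (date(2020, 7, 1), 20.0)]

def gain(purchase, sale):
    """Recoverable profit from pairing one purchase with one sale, if any."""
    (pd, pp), (sd, sp) = purchase, sale
    if abs((sd - pd).days) <= SIX_MONTHS and sp > pp:
        return sp - pp
    return 0.0

def lowest_in_highest_out(purchases, sales):
    """The classic formula: take sales from the highest price down and match
    each with the cheapest still-unmatched purchase in its window."""
    used, total = set(), 0.0
    for sale in sorted(sales, key=lambda s: -s[1]):
        candidates = [(purchases[i][1], i) for i in range(len(purchases))
                      if i not in used and gain(purchases[i], sale) > 0]
        if candidates:
            _, i = min(candidates)
            used.add(i)
            total += gain(purchases[i], sale)
    return total

def max_recoverable(purchases, sales):
    """Brute force over all one-to-one pairings (equal-length lists only)."""
    return max(sum(gain(purchases[i], sales[j]) for i, j in enumerate(perm))
               for perm in permutations(range(len(sales))))

print(f"lowest-in, highest-out: {lowest_in_highest_out(purchases, sales):.2f}")
print(f"maximum recoverable:    {max_recoverable(purchases, sales):.2f}")
```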
Accurate Calculation of Short-Swing
Profits Under Section 16(b) of the Securities Exchange Act of 1934,
22 Del. J. Corp. L. 587 (1997)
This article provides a general
method for accurately calculating short-swing profits under § 16(b)
of the Securities Exchange Act of 1934, correcting the widely taught
but potentially erroneous “lowest-in, highest-out” algorithm.

Administrative Law
Spoiling
the Surprise: Constraints Facing Random Regulatory Inspections in
Japan and the United States, 20 Nw. J. Int'l L. & Bus. 99 (1999)
This article examines the use of random administrative
inspections in the United States and Japan in the wake of the 1998
Japanese Ministry of Finance scandal.

Information Technology in the Legal System
Search for
Tomorrow: Some Side Effects of Patent Office Automation, 87 N.C. L.
Rev. 1617 (2009)
The Patent Office’s move to a paperless search facility and the
public’s growing involvement in prior art search have recently
elevated the role of search engine technology in the patent
examination process. This Article reports on an empirical
study of how this technology has systematically changed not only how
patent references are found, but also which patents are cited as
prior art. A longitudinal analysis of an imputed data set
indicates that examiners became increasingly reliant on keyword
full-text search in the late 1990s, as the technology became
accessible from their desktop computers. This change in
examination practice appears to have had a substantive effect on the
choice of patents to be cited as prior art. Specifically,
patent citations imputed to keyword search tend to be co-classified
(according to the Patent Office classification system) more
frequently than patent citations in general and patent citations
imputed to citation tracking methods. These findings support
the concerns of some commentators about Patent Office automation and
the outsourcing of prior art search. In particular, it appears that
the Patent Office classification system is not being fully utilized
to improve the precision of search results.
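The co-classification measure itself is simple to state in code. The sketch below uses an invented data layout (the field names and class symbols are hypothetical) to show how the share of citing/cited pairs sharing at least one classification would be computed; the study's method for imputing citations to keyword search or citation tracking is not modeled.

```python
# Hypothetical citation records; each pairs the classes assigned to the
# citing patent with those assigned to the cited patent.
citations = [
    {"citing": {"706/12", "706/45"},  "cited": {"706/45"}},
    {"citing": {"600/300"},           "cited": {"705/2"}},
    {"citing": {"382/128", "706/20"}, "cited": {"382/128"}},
]

def co_classification_rate(records):
    """Fraction of citing/cited pairs that share at least one class."""
    shared = sum(1 for r in records if r["citing"] & r["cited"])
    return shared / len(records)

print(f"co-classification rate: {co_classification_rate(citations):.2f}")
```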

Asian American Studies
The 1995 National Asian American
Studies Examination in U.S. High Schools, 21 Amerasia J. 121 (1995)
Because of effective professional networks and extraordinary
individual efforts, a relatively small number of Asian American
Studies departments have had a disproportionate influence on the
formulation of Asian American political values and discourse during
the past decade. Nevertheless, Asian American perspectives are
rarely recognized in most parts of the United States, despite the
continuing growth of Asian American communities in all regions of
the country. The recent proliferation of Asian American
Studies programs beyond the leading universities of the West Coast,
Hawaii, New York and New England is therefore of vital importance.
American high schools are also beginning to provide an exposure to
Asian American perspectives as part of their required multicultural
curricula. More often than not, however, the teachers being asked to
provide these perspectives are unaware of Asian American Studies as
an academic discipline. The National Asian American Studies
Examination is a new initiative to encourage the development of
rigorous programs in multicultural education at the high school
level. As a co-curricular activity, it motivates students and
teachers to engage in a meaningful exploration of the Asian American
experience, even in the absence of administrative support.
The KKK and
Vietnamese Fishermen, in DIVERSTORY (working title) (Frank Wu, ed.,
forthcoming 2002)
This article describes the successful use of an antitrust claim in
litigation in the early 1980s to protect a community of Vietnamese
American fishermen from racial harassment and intimidation by a
private army of white supremacists.
Hate's Harms Persist 25 Years
After Raleigh Murder, Raleigh News & Observer, July 26, 2014
A racially motivated murder in Raleigh 25 years ago led to the first federal conviction of a person for hate crimes against an Asian-American.

Parallel Algorithms and Architectures
Locality-Preserving Hash Functions for General Purpose Parallel
Computation, 12 Algorithmica 170 (1994)
Consider the problem of efficiently simulating the shared-memory
parallel random access machine (PRAM) model on massively parallel
architectures with physically distributed memory. To prevent
network congestion and memory bank contention, it may be
advantageous to hash the shared memory address space. The
decision on whether or not to use hashing depends on (1) the
communication latency in the network and (2) the locality of memory
accesses in the algorithm. We relate this decision directly to
algorithmic issues by studying the complexity of hashing in the
Block PRAM model of Aggarwal, Chandra and Snir, a shared-memory
model of parallel computation which accounts for communication
locality. For this model, we exhibit a universal family of
hash functions having optimal locality. The complexity of
applying these hash functions to the shared address space of the
Block PRAM (i.e., by permuting data elements) is asymptotically
equivalent to the complexity of performing a square matrix
transpose, and this result is best possible for all pairwise
independent universal hash families. These complexity bounds
provide theoretical evidence that hashing and randomized routing
need not destroy communication locality, addressing an open question
of Valiant.
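As a point of reference, the sketch below implements a standard pairwise-independent family h(x) = ((ax + b) mod p) mod m for distributing a shared address space over memory modules. It is not the locality-optimal family constructed in the paper; it only illustrates what drawing a random member of a universal hash family and applying it to shared addresses looks like.

```python
import random

P = (1 << 61) - 1   # a Mersenne prime larger than the address space

def random_hash(m):
    """Draw one member of the family h(x) = ((a*x + b) mod P) mod m,
    mapping shared-memory addresses to m distributed memory modules."""
    a = random.randrange(1, P)
    b = random.randrange(P)
    return lambda x: ((a * x + b) % P) % m

h = random_hash(m=64)              # 64 memory modules
addresses = range(0, 1024, 8)      # a strided block of shared addresses
print([h(x) for x in addresses][:16])
```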
Virtual
Shared Memory: Algorithms and Complexity, 113 Info. & Computation
199 (1993) (with W.F. McColl)
We consider the Block PRAM model of Aggarwal et al. For a
Block PRAM model with n/log n processors and
communication latency l=O(log n), we show that
prefix sums can be performed in time O(l log n/log
l), but list ranking requires time Ω(l log n);
these bounds are tight. These results justify an intuitive
observation of Gazit et al. that algorithm designers should, when
possible, replace the list ranking procedure with the prefix sums
procedure. We demonstrate the value of this technique in
choosing between two optimal PRAM algorithms for finding the
connected components of dense graphs. We also give theoretical
improvements for integer sorting and many other algorithms based on
prefix sums, and suggest a relationship between the issue of graph
density for the connected components problem and alternative
approaches to integer sorting.
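The structural point, that prefix sums decompose into block-local work plus one small combining step, can be seen even in a sequential sketch. The code below is not a Block PRAM algorithm; it only shows the two-phase blocked structure that makes prefix sums friendlier than list ranking to models that charge for communication in blocks.

```python
from itertools import accumulate

def blocked_prefix_sums(xs, block):
    """Two-phase prefix sums: scan each fixed-size block locally, then add
    an offset computed from the running totals of the preceding blocks."""
    blocks = [xs[i:i + block] for i in range(0, len(xs), block)]
    local = [list(accumulate(b)) for b in blocks]            # phase 1: local scans
    offsets = [0] + list(accumulate(b[-1] for b in local))   # phase 2: block totals
    return [v + off for b, off in zip(local, offsets) for v in b]

xs = list(range(1, 17))
assert blocked_prefix_sums(xs, block=4) == list(accumulate(xs))
print(blocked_prefix_sums(xs, block=4))
```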
Permutations
on the Block PRAM, 45 Info. Processing Letters 69 (1993)
In present-day parallel computers, the complexity of permuting N
data items in shared memory varies, depending on whether large
blocks can be used for communication. The Block PRAM model of
Aggarwal, Chandra and Snir is unique among shared-memory models of
parallel computation in modeling this phenomenon. We
characterize the Block PRAM complexity of some useful classes of
permutations, improving known results.
Complexity Issues in
General Purpose Parallel Computing, D.Phil. thesis, University
of Oxford (1991)

Boolean Function Complexity
On the Depth
Complexity of the Counting Functions, 35 Info. Processing Letters
325 (1990)
We use Karchmer and Wigderson’s characterization of circuit depth in
terms of communication complexity to design shallow Boolean circuits
for the counting functions. We show that the MOD₃
counting function on n arguments can be computed by Boolean
networks which contain negations and binary OR- and AND-gates in
depth c log₂ n, where c ≈ 2.881. This
is an improvement over the obvious depth upper bound of 3 log₂ n.
We can also design circuits for the MOD₅ and MOD₁₁
functions having depth 3.475 log₂ n and 4.930 log₂ n,
respectively.
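For intuition about where the logarithmic depth comes from, the sketch below computes the residue of the bit-sum mod 3 with a balanced binary reduction tree, which has ⌈log₂ n⌉ combining levels. The article's contribution lies at a finer grain, turning each combining step into Boolean gates so that the overall gate depth is about 2.881 log₂ n; this sketch shows only the divide-and-conquer skeleton, not that circuit construction.

```python
from math import ceil, log2

def mod3_tree(bits):
    """Combine residues mod 3 pairwise up a balanced tree; the number of
    levels is ceil(log2(n)) for n input bits."""
    vals, levels = list(bits), 0
    while len(vals) > 1:
        vals = [(vals[i] + vals[i + 1]) % 3 if i + 1 < len(vals) else vals[i]
                for i in range(0, len(vals), 2)]
        levels += 1
    return vals[0], levels

bits = [1, 0, 1, 1, 0, 1, 1, 0]
residue, levels = mod3_tree(bits)
print(residue, levels, ceil(log2(len(bits))))   # residue 2, 3 levels, log2(8) = 3
```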