by: Brian J. Bocketti

Spring 1997

Everybody knows that corruption thrives in secret places...and we believe it a fair assumption that secrecy means impropriety.

Woodrow Wilson (1856-1924)[1]


It is a Catch-22 situation. On the one hand, the use of "robust" encryption technology by those in the world who would do wrong is a very serious matter with possibly deadly ramifications.[2] On the other hand, by restricting these technologies in the name of public safety and national security the United States government places encumbrances on the very freedoms it is pledged to defend.

For most of this decade the debate has continued, and today it is at its most intense. With landmark cases in the courts and what seem like weekly hearings in Congress, encryption policy is reaching its point of definition. For many in the public this debate may hold little apparent significance or interest, but what happens next will have a very personal impact on each of us.

Since their implementation, U.S. encryption policies have labored under the same presumptive suspicions espoused by President Wilson. This has been especially true with regard to the export of encryption technologies. The following paper endeavors to shed light on these policies, their evolution, and the present state of affairs. In so doing, it will be helpful to discuss the history of the technology and its most recent appearances in our court system. Finally, this paper will update the reader on the legislation introduced in Congress in 1997.


Encryption is a way to disguise a message intended for a particular person so that only that person can benefit from the information.[3] Specifically, it is "any procedure to convert plaintext into ciphertext".[4] Plaintext is the unmodified message, while ciphertext is the coded, secure version.[5] Using complex mathematical algorithms, these encryption techniques are formulated into "cryptosystems", systems for sending secure ciphertext.[6] Finally, the term for breaking these systems is "cryptanalysis", "cryptography" is the art of creating them, and "cryptology" is the study of both skills.[7]

Encryption has an illustrious history which, at times, has shaped the course of world events. It is widely recognized that Julius Caesar first used encryption techniques on the theory that ignorant messengers cannot divulge sensitive information. His system consisted of a simple shift of the alphabet which turned "A's" into "D's" or "B's" into "E's" and effectively turned the message into a confusing mix of letters. Although simple, Caesar's system was most effective, and what started in handwriting is now carried out in seconds by computers using a series of increasingly complex equations. In fact, the basic theory behind encryption has not changed much since Caesar's time.

For centuries encryption remained within the domain of the military, or of those individuals capable of providing it for themselves. As the twentieth century dawned, encryption began to find its way into the business sector. For many large companies it became important to secure their information and gain an advantage over their competitors. By the time World War II began, several German and Polish companies had begun using mechanical encryption machines, and these machines would have a great impact on the way both sides prosecuted the war. The machines themselves were called "rotor machines" and were built with the express purpose of automating the reorganization of the alphabet.[8] As the war began, intelligence agencies began converting these commercial machines for military use. In Germany, the armed forces created a new set of variables and began transmitting secret communiques. This system, dubbed ENIGMA, presented a serious challenge for England and the Allies, and their response is one of the remarkable stories of the century.
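Caesar's scheme is simple enough to sketch in a few lines of modern code. The following Python fragment (an illustrative sketch, not drawn from any historical source) shifts each letter three places forward, turning "A" into "D" exactly as described above; decryption is simply the reverse shift.

```python
def caesar(text, shift=3):
    """Shift each letter of the alphabet by `shift` places (the Caesar cipher)."""
    result = []
    for ch in text:
        if ch.isalpha():
            base = ord('A') if ch.isupper() else ord('a')
            # Wrap around the 26-letter alphabet with modular arithmetic.
            result.append(chr((ord(ch) - base + shift) % 26 + base))
        else:
            result.append(ch)  # leave spaces and punctuation untouched
    return ''.join(result)

ciphertext = caesar("ATTACK AT DAWN")      # -> "DWWDFN DW GDZQ"
plaintext = caesar(ciphertext, shift=-3)   # decryption reverses the shift
```

The confusion the cipher produces is evident in the output, yet because there are only 25 possible shifts, a modern computer can break it instantly by trying them all; that gap between Caesar's era and ours is precisely why the "increasingly complex equations" mentioned above became necessary.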

Led by Alan Turing, a team of specialists worked around the clock to break the German encryption system and tap into the wealth of Wehrmacht information. In 1938, Turing was an eminent logician and mathematician residing at Cambridge, where he was living off the proceeds of a King's College Fellowship.[9] Soon after England entered the war, Turing was invited to become part of a cryptanalysis team so secret that little was known about it until more than fifty years later.[10] With him Turing brought a long-nurtured concept and a belief in the viability of artificial intelligence.[11] It was logical to him that a machine could be built to complete practically any task that a human brain could complete. Furthermore, if a machine could be created to provide its operator with ciphertext, then most certainly one could be built to decipher it. Turing, working at Bletchley Park, based his work on the earlier Polish cryptanalytic machine called the BOMBE.[12] Using a complex system of algorithms, the team made their new machine nearly fifteen times as fast as its predecessor.[13] Algorithms are the combinations of mathematical equations which ultimately drive the encryption, or the decryption, as the case may be; a full technical treatment of them is beyond the scope of this paper. In any event, by late 1940 and early 1941 the team was reading Luftwaffe (German Air Command) directives with relative ease using the new Turing-Welchman Bombe.[14] This at a time when England stood alone against the full force of German air power. It is a reasonable assumption that if Germany had gained air superiority, especially over England's southern approaches, an amphibious invasion of England would have followed. With the information provided by Bletchley Park, however, the Royal Air Force (RAF) was able to defeat a numerically superior enemy in the epic "Battle of Britain".

In 1942-43, Turing faced a new challenge as the German U-Boat Command added a new set of complications to the old ENIGMA system. This new cryptosystem, called "fish", was more complex than anything encountered before, and for it Turing and his associates would need a new kind of solution. Electronics would provide the answer, as the new machine used electrical switching technology to speed the decoding process.[15] In addition, Turing was now able to call on resources from the United States intelligence community after cooperation was established between the Allies. The result, COLOSSUS, effectively stripped the "Wolfpacks" of the ability to move without detection and restored the advantage to the Allies once more.[16] The resulting intelligence made the invasion of France an option as the war entered its fourth year. Sir Harry Hinsley, who worked at Bletchley Park for the entire war, recently estimated that without it the Allied invasion of Europe could have been pushed back to 1946 or 1947.[17] The implications of an extended war could have been great, as the atomic bomb was available for use in August of 1945 and President Truman showed that he would unleash its power. Although using atomic weapons on European soil would have posed an entirely different dilemma, it would still have been a serious consideration. In the Pacific, Turing's work and his cooperation with U.S. armed forces contributed much to U.S. efforts to break the Japanese code. Using this code, Naval Intelligence had provided information which led directly to the pivotal American victory at Midway. In many ways, then, encryption technology had a profound effect on the single most important event of this century.

Perhaps the most significant effect of Turing's work in World War II was the invention of the computer. In his effort to break the German codes he had created machines that completed tasks which had once been assigned to the human intellect. Both the BOMBE and COLOSSUS were applications of Turing's concept of artificial intelligence: the "Turing Machine". After the war, Turing continued working to combine electronics and decision making, for example by attempting to teach his machines to play chess. In 1951, his career was cut short when he was exposed as a homosexual under the discriminatory policies of the English government and, as a result, lost his security clearance.[18] In 1954, Turing died of cyanide poisoning; his death was reported as a suicide, though he may well have poisoned himself by mistake.[19]

Turing's legacy is the computer, an invention which has proliferated throughout the world and reaches more and more people every day. Encryption, on the other hand, has been slow to filter down. After the war the technology remained with those elites in government and business who could afford the expense of the systems. In addition, the cryptology which had spawned the mainframe computer now benefited from its ability to mechanize various tasks and bodies of information. The result was better encryption technology each year, but with only limited opportunity for individual use. And so it remained even into the 1990's.

Today we live in the age of the personal computer. Many in our society have access to the highways of the information age and to things like e-mail and the Internet. You can pay bills, make reservations, or even send a message to the White House through your PC. These opportunities are only the beginning of what is to come, and as computer technology becomes more sophisticated it will be crucial to protect one's individual privacy. What many do not know is that big business and government have taken on the responsibility of protecting these interests in the interim. Whether in providing secure ATM's or in ensuring that private medical records stay private, the powerful are using almost unbreakable encryption. A time is coming very soon, however, when encryption will be available to all. Whereas once you needed to be a programmer, or to hire one, in order to use encryption techniques, today the tools are being created for the individual PC owner to disguise whatever he or she wants. As you will see, the prospect of democratized, computer-driven encryption has caused a restrictive knee-jerk reaction from many in the U.S. government, one which has lessened over time, although not completely. The cases that follow show how three different cryptographers have faced U.S. export policies, and how the dispositions of their cases are evidence of a changing atmosphere.


As the United States government began to realize how accessible encryption technology would become, it also realized that many of its enemies would obtain the technology, which would, in turn, weaken national security. Encryption technologies therefore became a target for stringent export restriction. Several federal agencies now take these technologies so seriously that they have been listed on the United States Munitions List (USML). This list is mandated under the International Traffic in Arms Regulations (ITAR) and is promulgated under the Arms Export Control Act of 1976 (22 U.S.C. 2778), which gives the President the power "to control the import and the export of defense articles and defense services".[20] These defense "articles" and "services" are what make up the USML, and cryptographic materials have been enumerated among these munitions. For agencies like the NSA (National Security Agency), which depend on eavesdropping on a global level, the prospect of widespread encryption would make many of their procedures obsolete. The argument is that terrorist organizations and sophisticated criminals would be presented with an unprecedented opportunity to communicate in complete secrecy. Admittedly, stripping agencies like the F.B.I. and NSA of these advantages may have serious side effects. However, as we will see, the implementation of AECA and ITAR may have risen to an unconstitutional level in recent years. The first two challenges to U.S. encryption export policy, the Karn and Bernstein cases, show two very different ideas of where the line should be drawn. The final case, involving Phillip Zimmerman, may be a hint of the future of the enforcement and implementation of encryption policies.

Karn v. Department of State

(925 F.Supp. 1 (D.D.C. 1996))

In 1994, Philip Karn, Jr. sought to publish the book Applied Cryptography by Bruce Schneier. The book was a compilation of cryptographic techniques which included, among other things, a history of cryptography and information on encryption algorithms.[21] The book also included information which could be forwarded, upon request, to the reader on a computer disk.[22] Karn knew that in order to go ahead with his plans he would have to obtain federal approval. This permitting process, overseen by the State Department's Office of Defense Trade Controls (ODTC), includes several major steps: (1) where speech is involved, ITAR requires the applicant to apply for what is called a "commodity jurisdiction" determination in order to release the restricted information;[23] (2) if the "CJ" request is rejected, the applicant must register as an arms dealer under AECA and ITAR;[24] (3) once duly registered, the applicant must obtain a license to trade the article or service in question; and finally (4) the applicant must clear the intended recipients of the information with the State Department prior to its release.[25] Accordingly, Karn filed his commodity jurisdiction request for the book in February of 1994, and the ODTC replied in March. The ODTC took the stance that it did not wish to regulate the book but that the two disks would need to be dealt with separately. Karn, seeking compliance, then filed a separate "CJ" request for the diskettes and the algorithms contained therein. This time the ODTC rebuffed him, stating that international distribution of the information could not go forward without a license because the disks had been "designated as a defense article under Category XIII(b) of the United States Munitions List".[26] This list, promulgated under AECA, includes things like bombers and rockets, but Category XIII(b) also includes encryption technologies like Schneier's.
Karn then sought to fight this designation and after exhausting his administrative appeals filed suit in federal court in September of 1995.

The complaint asked for judicial review of the ODTC determination and alleged that the permitting process violated the plaintiff's First and Fifth Amendment rights. The court never considered the designation of the encryption as a defense article, however, as Judge Charles R. Richey pointed out that AECA barred such review.[27] AECA section 2778(h) reads:

The designation by the President (or by an official to whom the President's functions under subsection (a) of this section have been duly delegated), in regulations issued under this section, of items as defense articles or services ...shall not be subject to review.

Richey and the district court refused to disturb the ODTC's determination, opining that as long as the State Department had followed the procedures laid out in AECA and ITAR, the designation of the diskettes was not open to question.

The court then dealt with Karn's constitutional claims. Karn had argued that the regulatory scheme was a restraint on free speech because it evidenced a government intent to single out cryptography for restriction. Karn also invoked the Fifth Amendment to argue that when the ODTC treated the book and disks separately, it had in fact violated his right to due process. The court did not agree and granted the defendant State Department summary judgment on both issues. It noted that the regulations passed the tests laid out by the Supreme Court for the protection of free speech and that they were content-neutral, not discriminatory. As for the Fifth Amendment claim, the court noted that the "due process provided in the Fifth Amendment, absent the assertion of a fundamental right, merely requires a reasonable fit between governmental purpose and the means chosen to advance that purpose".[28]

The Karn decision was handed down in March of 1996, and it seems indicative of the judicial response to the debate over cryptography up until that time. Judge Richey noted that the case was part of a larger trend, calling it a "classic example of how courts today, particularly the federal courts, can become needlessly invoked, whether in the national interest or not, in litigation involving policy decisions made within the power of the President".[29] What Richey and others like him overlooked, however, was that with the advent of global networking and Internet transactions, the security provided by encryption was becoming a quasi-fundamental right. Had Richey considered all of the relevant facts, he would have understood that encryption technologies will be crucial to the way we conduct our lives in the future, and that for that reason alone the Karn decision would not be the last time a federal court fielded such a challenge. Several months later, the Bernstein decision would stand in contrast to the Karn court's conservatism.

Bernstein v. U.S. Department of State

(945 F.Supp. 1279 (N.D. Cal. 1996))

In 1992, Daniel Bernstein was a graduate student in mathematics at the University of California at Berkeley.[30] Through his own efforts he had developed a useful new computer program called Snuffle 5.0, whose main function was encryption for "civil applications".[31] As any ambitious graduate student would, Bernstein wanted to share his work with those who would be interested. His plan was to publish it in hardcopy and on "electronic international networks" and, if need be, to discuss it in speaking engagements. Like Phil Karn, Bernstein realized that the federal government would want to review any such plans, and so he filed for a commodity jurisdiction determination in June of 1992.[32] After a delay which lasted a year and a half, the ODTC denied the request and informed Bernstein that he would need a license in order to release anything. This would require Bernstein, a math student, to register himself as an international arms dealer under AECA and ITAR before he did anything.[33] Bernstein was left with no reasonable option. Thus, after being backed into a corner, and with his work growing older each day, he filed suit against the State Department, four other agencies, and a host of administrators.

Bernstein sought, among other things, injunctive relief and damages in conjunction with the government's regulation of his work. The complaint alleged inherent deficiencies in the AECA/ITAR regulatory scheme and specifically sought redress of certain constitutional violations. It was alleged that these statutes, as implemented, constituted a prior restraint on free speech and that the government had denied Bernstein the right:

to speak, to publish, to assemble to receive information and to engage in academic study, inquiry and publication.[34]

In addition, it was alleged that the registration and licensing procedures were unconstitutional because they contained generalizations which neither limited the agencies' powers nor fully defined an applicant's duties and rights. Bernstein's attorney, Cindy Cohn, pointed out that both AECA and ITAR were vague and threatened prosecution even though cryptographers could not determine how to comply. Criminal and civil penalties (which included a $1,000,000 fine) had a paralyzing effect on many mathematicians and programmers. These men and women were not sure whether what they were creating would be classified as a "munition", and Cohn argued that these doubts were having a chilling effect on their entire field. If a math student looked at the statutes, should he know that he needed to register as an arms dealer? The Bernstein complaint also alleged that the AECA/ITAR system had been implemented in an overbroad manner. ITAR's jurisdiction is supposed to apply only to articles "specifically designed, developed, configured, adapted or modified for a military application".[35] Bernstein's intention was never military. As Cohn set out to defend him, she needed to show the court how government officials had been overly restrictive. It was probably true that the State Department and the NSA (the State Department's consultant on AECA/ITAR licenses) felt it better to err on the side of safety. As in World War II, the threat that encryption technologies will be adapted from commercial use is very real. Cohn argued, however, that this could not justify such a confining policy. The final counts of Bernstein's complaint alleged constitutional violations of the plaintiff's freedom of association, as well as viewpoint discrimination.

Overall, Cohn presented a realistic view of the status quo under AECA/ITAR. After all, Bernstein was an academic who posed no particular threat to national security, and yet he had suffered significantly. It would now be up to the district court to decide where the balance between individual rights and national security would be drawn.

In a lengthy opinion, Judge Patel delivered the court's ruling. Patel noted that the central issue of the case was whether or not the encryption policies were consistent with the First Amendment of the Constitution, the court thus recognizing that the case's implications went well beyond the immediate dispute. The State Department argued that the statutes were acceptable since they were content-neutral and that it should be allowed a certain amount of latitude, especially in light of the government's concern for national security. The court disagreed, however, stating that this could not justify such restrictive policies:

defendant's interests here, in being able to break foreign encryption and conduct adequate surveillance in "furtherance of world peace and the security of the United States," are clearly insufficient without more.[36]

The court then found Category XIII(b) of the USML to be a prior restraint on the freedoms guaranteed by the First Amendment.[37] The court found that in targeting cryptography the licensing process had unacceptably singled out a certain kind of speech.[38] The same court had acknowledged encryption as a form of speech in an earlier Bernstein action.[39] Category XIII(b), you will recall, was passed over entirely by the Karn court. The Bernstein result was a more pragmatic consideration of the debate, and without the stricken provision the government could not thereafter enforce it against the plaintiff. Finally, the court hinted that the government and Bernstein should reach an agreement whereby Bernstein would be allowed to teach a course in cryptography at the University of Illinois at Chicago in 1997 while agreeing not to post his studies on the Internet. The Internet was of particular concern to the government, since it accomplishes in minutes the very dissemination that AECA/ITAR seeks to prevent.

The Zimmerman Investigation

Perhaps the most interesting of the encryption cases of the past two years is the case that never went to trial. The Zimmerman investigation involved the posting of encryption technology on the Internet. Phillip Zimmerman, described as a "soft spoken data security consultant" from Boulder, Colorado, had created a software package called Pretty Good Privacy with the intent to "strengthen democracy, to ensure that Americans could continue to protect their privacy".[40] The problem was that "PGP" was more than pretty good, and the NSA was meeting with little success in decoding it. Federal authorities were considering banning the program when it suddenly appeared on the Internet. Zimmerman had released his program to friends within the industry in 1991-92, and from there they had brazenly placed it on the World Wide Web. In 1993, the United States Attorney's office served subpoenas on some of these friends, their firms, and Zimmerman.[41] For the next two years, Zimmerman withstood government probing, but in the end the case was dropped in January of 1996.[42] Although this decision was announced before the Karn and Bernstein decisions were handed down, it is symptomatic of changing attitudes towards the enforcement of AECA and ITAR. By the end of 1995, it is reasonable to assume that the U.S. Attorney for the Northern District of California, Michael Yamaguchi, could see changes coming. Despite having sufficient evidence against Zimmerman, he chose not to proceed, and soon after, the Bernstein decision confirmed his instincts. In 1997, the U.S. Congress is trying to decide the future of encryption statutes and to put to rest the debate and uncertainty which exist in the courts and in enforcement agencies.


The "liberalization plan" for encryption technologies originated in 1996 with President Clinton, and by the end of that year several drafts had been introduced in the House and Senate.[43] In 1997, these bills were re-introduced, and Congress has been considering the fate of encryption controls in the hope that a balanced policy can be agreed upon. There are currently three bills before Congress: Senate Bills 376 and 377 and House Bill 695.

Senate Bill 376, entitled the "Encrypted Communications Privacy Act of 1997", is a bill "to affirm the rights of Americans to use and sell encryption products".[44] The bill is intended to enunciate clearly the individual rights which should be protected from regulation by the federal government. In addition, the bill endorses "key recovery encryption systems" which, some argue, will strike a balance between privacy rights and national security.[45] These systems provide agencies like the NSA and F.B.I. with the keys necessary to continue surveillance while protecting the individual in his or her everyday dealings. Senate Bill 377, entitled the "Promotion of Commerce On-Line In The Digital Era (Pro-Code) Act of 1997", is intended to boost "electronic commerce" by easing restrictions on strong encryption.[46] In particular, this bill seeks to set up an exclusive jurisdiction for the control of encryption technologies intended for civil application. Under the act, the Secretary of Commerce would control these products but would be barred from "imposing Government-designed encryption standards on the private sector by restricting the export" of computer technology with "encryption capabilities".[47] There is also a provision which loosens the restrictive licensing procedures of the past, requiring only a general license for those wishing to export commercial encryption products.[48] Finally, the statute establishes the Information Security Board, which will be charged with formulating and updating a national encryption policy.
House Bill 695 is also an amendment bill, one that would re-write the United States Code (Title 18) to add language establishing personal encryption rights at law.[49] The bill is named the "Security and Freedom Through Encryption (SAFE) Act", and it would make serious advances towards free encryption.[50] Most importantly, it would codify the freedom to use and sell encryption and protect citizens from mandatory key escrow.[51] Mandatory key escrow, once again, is the surrendering of encryption keys to the government. This is very important because, without such a system, government agencies will not be able to penetrate individual privacy at will. As you can see, there is definitely a movement afoot to loosen the hold on encryption. This movement is not without its opponents, however. Since they were introduced, these bills have been referred to committee and have remained there while the House and Senate field testimony from interested parties.
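The key escrow idea at the center of these bills can be sketched abstractly. The toy Python sketch below is purely illustrative: the class names, the "warrant" flag, and the XOR "cipher" are my own assumptions for exposition, not any proposed standard or real cryptography. It shows only the structure of the arrangement: the user's key is deposited with a trusted third party, who can later release it to an authorized agency.

```python
import secrets

def xor_crypt(key, data):
    """Toy symmetric 'cipher' (byte-wise XOR); applying it twice restores the data.
    Illustrates the escrow structure only -- NOT secure encryption."""
    return bytes(k ^ d for k, d in zip(key, data))

class EscrowAgent:
    """The 'trusted third party': holds a copy of every deposited key."""
    def __init__(self):
        self._vault = {}

    def deposit(self, user, key):
        self._vault[user] = key

    def release(self, user, warrant):
        # In a real system, release would hinge on legal process.
        if warrant:
            return self._vault[user]
        raise PermissionError("no valid warrant")

agent = EscrowAgent()
key = secrets.token_bytes(16)
agent.deposit("alice", key)            # the key escrow step

message = b"meet at midnight"          # 16 bytes, matching the key length
ciphertext = xor_crypt(key, message)   # alice communicates privately

# An agency with a warrant recovers the key and reads the traffic.
recovered = xor_crypt(agent.release("alice", warrant=True), ciphertext)
assert recovered == message
```

The sketch makes the policy dispute concrete: everything turns on who controls the `EscrowAgent` and what counts as a valid warrant, which is precisely the ground the bills above contest.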

F.B.I. Director Louis J. Freeh has testified on several occasions and has pleaded for a balanced encryption policy. As he puts it, "The enactment of any one of these bills, as drafted, would have a negative effect on National Security".[52] Director Freeh also states that any balanced policy must include key recovery encryption (key escrow), what he has described as "Trusted Third Party" systems.[53] Without such systems, he predicts, the "adverse consequences... to public safety, crime prevention and effective law enforcement" will be "huge".[54] For the F.B.I. and similar agencies, world-wide recovery systems will strike the proper balance between privacy and the need for effective surveillance. While acknowledging that key recovery systems will not keep powerful encryption from all criminals, Freeh has argued that these systems are the only acceptable option discussed in the current debate.

The key recovery proposal has many critics in Washington. Jerry Berman, one of these critics, laid out the problems with such a system in testimony before the Senate Commerce Committee. Berman is the Executive Director of the Center for Democracy and Technology (CDT). Berman and others believe that key recovery or "escrowed" systems are not justified, since the threat to national security is not as grave as Freeh suggests. Moreover, the systems will not prevent sophisticated criminals from obtaining the technology but will only work to reduce privacy rights.[55] Such a system would also hamper an individual's ability to transact on a global scale, because the government has yet to show that escrowed transactions will be "globally endorsed".[56] Without such endorsement, many banks and corporations will refuse to communicate electronically, because the possibility exists that a third party is capable of observing. For opponents of key recovery like Berman, the only logical conclusion is an encryption policy which provides for unrestricted export and development of the technology. In their opinion, nothing would justify keeping this critical protection from the law-abiding citizens of the world.


There is no doubting the power of encryption. Whether or not this power should be released without restriction, however, is a much harder question to answer. The AECA/ITAR structure is not acceptable in its present form and has become outdated for its intended purpose. Recently, the National Research Council published a report which detailed some of these inadequacies. The NRC noted that challenges similar to Karn and Bernstein:

suggest quite strongly that the traditional National Security paradigm of export controls (one that is biased toward denial rather than approval) is stretched greatly by current technology. Put differently, when the export control regime is pushed to an extreme, it appears manifestly ridiculous.[57]

With this in mind, the debate has been reduced to a choice between mandatory or voluntary key recovery systems and unrestricted encryption. Mandatory key recovery would be a drastic and perhaps unconstitutional next step. Freeh and others have argued that the "Genie" is not yet out of the bottle and that controls may still be effective through some form of key recovery.[58] This is not a reasonable assessment. For example, Phil Zimmerman's "PGP" has been downloaded from the Internet thousands of times, and it is more than twice as powerful as anything allowed under AECA and ITAR.

The conservative argument is unappealing for other reasons as well. Criminals and terrorist organizations will not be prevented from exposure to the technology. If they do not already have powerful encryption, it may be obtained at a price or created firsthand by friends. In the end, the only ones affected by a world-wide key recovery system will be those who acquiesce to the program's requirements. Presumably, these parties will have nothing to hide, and the only product will be the resulting loss of personal privacy.

The hardest concept for government agencies to accept seems to be that each one of us has a right to the most powerful encryption available. Each day that passes in the "Information Age" increases the need to secure our personal privacy. Furthermore, as more people trust the Internet to conduct their business and personal matters, it will be up to the government to provide its constituents with the ability to protect themselves. This should not mean that we must surrender our rights in the process. Hopefully, Congress will realize this.


1.  The Concise Columbia Dictionary of Quotations, Columbia Univ. Press, N.Y., 1987, pg. 235
2.  Federal Document Clearing House Inc., Congressional Testimony, July 25, 1996, Testimony of Louis J. Freeh, Director, F.B.I., before the Senate (Lexis-Nexis)
3.  Cryptography FAQ (03/10: Basic Cryptography), sec. 3.1 (http://www.phys.s.u-
4.  Id. at sec. 3.1
5.  Id. at sec. 3.1
6.  Id. at sec. 3.1
7.  Id. at sec. 3.1
8.  University of Illinois at Chicago, "Honors Seminar in Cryptography", Spring Semester 1997, Jeremy Teitelbaum, Instructor (http://raphael.math.uic.edu/-jeremy/crypt.html)
9.  Alan Turing Home Page (http://mrh.slip.netcom.com/)
10. The English government continues to release these documents, a process which began in
11. Alan Turing Home Page
12. Alan Turing Home Page
13. Alan Turing Home Page
14. Alan Turing Home Page
15. Alan Turing Home Page
16. Alan Turing Home Page
17. Alan Turing Home Page
18. Alan Turing Home Page
19. Alan Turing Home Page
20. Arms Export Control Act (AECA), 22 United States Code 2778(a)(1) (Lexis-Nexis)
21. Karn v. Department of State, 925 F.Supp. 1, 3 (D.D.C. 1996) (Lexis-Nexis)
22. Id. at 3-4
23. International Traffic in Arms Regulations (ITAR) section 120.4, 22 C.F.R. 120.4 (Lexis-Nexis)
24. AECA 2778(b)(1)(a) and ITAR 122.1
25. ITAR section 123, 22 C.F.R. 123
26. Karn v. Department of State, 925 F.Supp. 1, 4 (D.D.C. 1996) (Lexis-Nexis)
27. Id. at 5
28. Id. at 13
29. Id. at 2-3
30. Bernstein Complaint at paragraph 55 (http://www.Law.vill.edu/-perritt/berncplt.htm)
31. Id. at paragraph 58
32. Id. at paragraph 61
33. AECA 2778(b)(1)(a) and ITAR 122.1
34. Bernstein Complaint at paragraph 97
35. ITAR 120.3(a) and (b)
36. Bernstein v. Department of State, 945 F.Supp. 1279, 1290 (N.D. Cal. 1996) (Lexis-Nexis)
37. Id. at 1292
38. Id. at 1290
39. Bernstein v. Department of State, 922 F.Supp. 1426, 1436 (N.D. Cal. 1996) (Lexis-Nexis)
40. U.S. News & World Report, "Lost in Kafka Territory", by Vic Sussman (vic@clark.net)
41. The New York Times, September 21, 1993, "Federal Inquiry on Software Examines Privacy Programs", by John Markoff
42. U.S. Attorney Press Release, January 11, 1996 (http://www.info-nation.com
43. See: "Encryption and U.S. Export Policies of Cryptographic Products Clipper 3.1.1", by Jason Klindworth (Computers & Law Homepage), Fall 1996
44. Bill Tracking Report, S.376 (Lexis-Nexis)
45. Bill Tracking Report, S.376 (Lexis-Nexis)
46. Bill Tracking Report, Text of bill S.377 (Lexis-Nexis)
47. Bill Tracking Report, Text of bill S.377 (Lexis-Nexis)
48. Bill Tracking Report, Text of bill S.377 (Lexis-Nexis)
49. Bill Tracking Report, Text of bill H.695 (Lexis-Nexis)
50. Bill Tracking Report, Text of bill H.695 (Lexis-Nexis)
51. Bill Tracking Report, Text of bill H.695 (Lexis-Nexis)
52. Federal Document Clearing House Inc., Congressional Testimony, March 19, 1997, Testimony of Louis J. Freeh, Director, F.B.I., before Senate Commerce Committee (Lexis-Nexis)
53. Id. at Freeh Testimony
54. Id. at Freeh Testimony
55. Federal Document Clearing House Inc., Congressional Testimony, March 19, 1997, Testimony of Jerry Berman, Executive Director of the Center for Democracy and Technology (CDT), before Senate Commerce Committee (Lexis-Nexis)
56. Id. at Berman Testimony
57. National Research Council, National Academy of Sciences, "Cryptography's Role in Securing the Information Society", C-2 (May 30, 1996)
58. Federal Document Clearing House, Inc., Congressional Testimony, March 19, 1997, Testimony of Louis J. Freeh, Director, F.B.I., before Senate Commerce Committee