Purdue University Cyber Security Blog

  • Reflecting on the Internet Worm at 35
    by spaf@cerias.purdue.edu (spaf) on November 2, 2023 at 4:29 pm

    Thirty-five years ago today (November 2nd), the Internet Worm program was set loose to propagate on the Internet. Noting that now to the computing public (and cybersecurity professionals, specifically) often generates an “Oh, really?” response akin to stating that November 2nd is the anniversary of the inaugural broadcast of the first BBC TV channel (1936), and the launch of Sputnik 2 with Laika aboard (1957). That is, to many, it is ho-hum, ancient history. Perhaps that is to be expected after 35 years — approximately the length of a human generation. (As an aside, I have been teaching at Purdue for 36 years. I have already taught students whose parents had taken one of my classes as a student; in five or so years, I may see students whose grandparents took one of my classes!). In 1988, fewer than 100,000 machines were likely connected to the Internet; thus, only a few thousand people were involved in systems administration and security. For us, the events were more profound, but we are outnumbered by today’s user population; many of us have retired from the field…and more than a few have passed on. Thus, events of decades ago have become ancient history for current users. Nonetheless, the event and its aftermath were profound for those who lived through it. No major security incident had ever occurred on such a scale before. The Worm was the top news story in international media for days. The events retold in Cliff Stoll’s Cuckoo’s Egg were only a few years earlier but had affected far fewer systems. However, that tale of computer espionage heightened concern by authorities in the days following the Worm’s deployment regarding its origin and purpose. It seeded significant changes in law enforcement, defense funding and planning, and how we all looked at interconnectivity. In the following years, malware (and especially non-virus malware) became an increasing problem, from Code Red and Nimda to today’s botnets and ransomware. 
All of that eventually led to a boom in add-on security measures, resulting in what is now a multi-billion dollar cybersecurity industry. At the time of the Worm, the study of computing security (the term “cybersecurity” had not yet appeared) was primarily based around cryptography, formal verification of program correctness, and limiting covert channels. The Worm illustrated that a larger scope was needed, although it took additional events (such as the aforementioned worms and malware) to drive the message home. Until the late 1990s, many people still believed cybersecurity was simply a matter of attentive cyber hygiene and not an independent, valid field of study. (I frequently encountered this attitude in academic circles, and was told it was present in the discussion leading to my tenure. That may seem difficult to believe today, but it should not be surprising: Purdue has the oldest degree-granting CS department [60 years old this year], and it was initially viewed by some as simply glorified accounting! It is often the case that outsiders dismiss an emerging discipline as trivial or irrelevant.)

The Worm provided us with an object lesson about many issues that, unfortunately, have not been fully heeded to this day. That multi-billion dollar cybersecurity industry is still failing to protect far too many of our systems. Among those lessons:

Interconnected systems with long-lasting access (e.g., .rhosts files) created a playground for lateral movement across enterprises. We knew then that good security practice involved fully mediated access (now often referred to as “Zero Trust”) and had known it for some time. However, convenience was viewed as more important than security…a problem that continues to vex us to this day. We continue to build systems that both enable effortless lateral movement and make it difficult or annoying for users to reauthenticate, thus leading them to bypass the checks.
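For readers who never used the Berkeley “r-commands”: that long-lasting access came from trust files such as .rhosts, which granted passwordless login to listed host/user pairs. A minimal sketch (hostnames and the username are hypothetical):

```
# ~alice/.rhosts on hostB -- any connection claiming to be
# user "alice" coming from hostA is trusted, no password asked:
hostA alice

# So from a compromised shell on hostA, one command hops laterally:
rsh -l alice hostB /bin/sh
```

The Worm used exactly this mechanism (along with hosts.equiv entries and hostnames harvested from users’ files) to move between machines without ever needing a password.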
Systems without separation of privilege facilitated the spread of malware. Current attackers who manage to penetrate key services or privileged accounts are able to gain broader access to entire networks, including the ability to shut off monitoring and updates. We have proven methods of limiting access (SELinux is one example), but they are too infrequently used.

Sharing information across organizations can result in a more robust, more timely response. Today, we still have organizations that refuse to disclose whether they have been compromised, thus delaying our societal response; information obtained by government agencies has too often been classified, or at least closely held. The information that is shared is frequently incomplete or untimely.

The use of type-unsafe languages with minimal security features can lead to flaws that may be exploited. One only needs to survey recent CVE entries and attack reports to see buffer overflows, type mismatches, and other well-known software flaws leading to compromise. Many organizations are still producing or reusing software written in C or C++, which are especially prone to such errors. Sadly, higher education is complicit by teaching those languages as primary, mainly because its graduates might not be employable without them.

Heterogeneity of systems provides some bulwark against common attacks. Since 1988, the number of standard operating systems in use has decreased, as have the underlying machine architectures. There are clearly economic arguments for reducing the number of platforms, but the homogeneity facilitates common attacks. Consideration of when to reuse and when to build anew is sadly infrequent.

The Worm incident generated conflicting signals about the propriety of hacking into other people’s systems and writing malware. Some people who knew the Worm’s author rose to his defense, claiming he was demonstrating security problems and not doing anything wrong.
Malware authors and system attackers commonly made that same claim in the decades following, with mixed responses from the community. It still colors the thinking of many in the field, excusing some very dubious behavior as somehow justified by its results. Although there is nuance in some discussions, the grey areas around pen testing, companies selling spyware, and “ethical” hacking still enable plausible explanations for bad behavior.

That last point is important as we debate the dangers and adverse side-effects of machine learning/LLM/AI systems. Those are being refined and deployed by people claiming they are not responsible for the (mis)use of (or errors in) those systems, and that their economic potential outweighs any social costs. We have failed to clearly understand and internalize that not everything that can be done should be done, especially on the Internet at large. This is an issue that keeps coming up, and we continue to fail to address it properly.

As a field, cybersecurity is relatively young. We have a history that arguably starts in the 1960s with the Ware Report. We are still discovering what is involved in protecting systems, data privacy, and safety. Heck, we still need a commonly accepted definition of what cybersecurity entails! (Cf. Chapter 1 of the Cybersecurity Myths book, referenced below.) The first cybersecurity degree program wasn’t established until 2000 (at Purdue). We still lack useful metrics to know whether we are making significant progress and to titrate investment. And we are still struggling with tools and techniques to create and maintain secure systems. All this while the market (and thus the need) is expanding globally.
In that context of growth and need, we should not dismiss the past as “Ho-hum, history.” Members of the military study historic battles to avoid future mistakes: mentioning the Punic Wars or the Battle of Thermopylae to such a scholar will not result in dismissal with “Not relevant.” If you are interested in cybersecurity, it would be advisable to study some history of the field and think about lessons learned — and unlearned.

Further Reading

The Ware Report. This can be seen as one of the first descriptions of cybersecurity challenges, needs, and approaches.

The Protection of Information in Computer Systems. A paper from 1975 by J.H. Saltzer and M.D. Schroeder. This paper describes basic design principles, in large part inspired by Multics, that include complete mediation (now somewhat captured by “Zero Trust”) and least privilege. These are most often violated by software rather than designed in, especially economy of mechanism. (Versions of this paper may be found outside the paywall via web search engines.)

Historical papers archive. A collection of historical papers presenting the early foundations of cybersecurity. This includes the Ware Report and its follow-on, the Anderson Report. Some other, hard-to-find items are here.

The Communications of the ACM Worm issue. An issue of CACM devoted to papers about the Worm.

The Internet Worm: An Analysis. My full report analyzing what the Worm program did and how it was structured.

The Internet Worm Incident. A report describing the timeline of the Worm’s release, spread, discovery, and response.

Happy birthday, dear viruses. A short article in Science I coauthored with Richard Ford for the 25th anniversary of the Worm, about malware generally.

Cybersecurity Myths and Misconceptions. A new book about things the public and even cybersecurity experts mistakenly believe about cybersecurity. Chapter 1 addresses, in depth, how we do not have an accepted definition of cybersecurity or metrics to measure it.
Other items alluded to in this blog post are also addressed in the book.

Cyber security challenges and windmills. One of my blog posts, from 2009, about how we continue to generate studies of what would improve cybersecurity and then completely fail to heed them. The situation has not improved in the years since then.

  • AI and ML Sturm und Drang
    by spaf@cerias.purdue.edu (spaf) on June 5, 2023 at 10:07 pm

    I recently wrote up some thoughts on the current hype around ML and AI. I sent it to the Risks Digest. Peter Neumann (the moderator) published a much-abbreviated version. This is the complete set of comments. There is a massive miasma of hype and misinformation around topics related to AI, ML, and chat programs and how they might be used…or misused. I remember previous hype cycles around 5th-generation systems, robotics, and automatic language translation (as examples). The enthusiasm each time resulted in some advancements that weren’t as profound as predicted. That enthusiasm faded as limitations became apparent and new bright, shiny technologies appeared to be chased. The current hype seems even more frantic for several reasons, not least of which is that there are many more potential market opportunities for recent developments. Perhaps the entities that see new AI systems as a way to reduce expenses by cutting headcount and replacing people with AI are one of the biggest drivers causing both enthusiasm and concern (see, for example, this article). That was a driver of the robotics craze some years back, too. The current cycle has already had an impact on some creative media, including being an issue of contention in the media writers’ strike in the US. It also is raising serious questions in academia, politics, and the military. There’s also the usual hype cycle FOMO (fear of missing out) and the urge to be among the early adopters, as well as those speculating about the most severe forms of misuse. That has led to all sorts of predictions of outlandish capabilities and dire doom scenarios — neither of which is likely wholly accurate. AI, generally, is still a developing field and will produce some real benefits over time. The limitations of today’s systems may or may not be present in future systems. However, there are many caveats about the systems we have now and those that may be available soon that justify genuine concern. 
First, LLMs such as ChatGPT, Bard, et al. are not really “intelligent.” They are a form of statistical inference based on a massive ingest of data. That is why LLMs “hallucinate” — they produce output that matches their statistical model, possibly with some limited policy shaping. They are not applying any form of “reasoning,” as we define it. As noted in a footnote in my recent book: “Philosophically, we are not fond of the terms ‘artificial intelligence’ and ‘machine learning,’ either. Scholars do not have a good definition of intelligence and do not understand consciousness and learning. The terms have caught on as a shorthand for ‘developing algorithms and systems enhanced by repeated exposure to inputs to operate in a manner suggesting directed selection.’ We fully admit that some systems seem brighter than, say, certain current members of Congress, but we would not label either as intelligent.” I recommend reading this and this for some other views on this topic. (And, of course, buy and read at least one copy of Cybermyths and Misconceptions.)

Depending on the data used to build their models, LLMs and other ML systems may contain biases and produce outright falsehoods. There are many examples of this issue, which is not new: bias in chatbots (e.g., Microsoft Tay turning racist), bias in court sentencing recommendation systems, and bias in facial recognition systems, such as discussed in the movie Coded Bias. More recently, there have been reports showing racial, religious, and gender biases in versions of ChatGPT (for example, this story). “Hallucinations” of non-existent facts in chatbot output are well-known. Beyond biases and errors in chats, one can also find all sorts of horror stories about autonomous vehicles, including several resulting in deaths and serious injuries, because those systems aren’t comprehensive enough for their uses. These limitations are rooted in how the systems are trained.
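As a toy illustration of that purely statistical behavior, here is a minimal “language model” sketch: a bigram table built from an invented twelve-word corpus (everything here is made up for illustration, vastly simpler than a real LLM). It emits plausible-sounding word sequences by sampling training statistics, with no notion of whether the output is true.

```python
import random
from collections import defaultdict

# Tiny invented training corpus; a real LLM ingests terabytes,
# but the principle is the same: learn statistics, then sample.
corpus = "the worm spread fast the worm was news the net was young".split()

# Bigram table: for each word, the list of words observed after it.
bigrams = defaultdict(list)
for a, b in zip(corpus, corpus[1:]):
    bigrams[a].append(b)

def generate(start, n, seed=0):
    """Sample up to n successor words -- fluent-sounding, truth-blind."""
    random.seed(seed)
    out = [start]
    for _ in range(n):
        successors = bigrams.get(out[-1])
        if not successors:
            break
        out.append(random.choice(successors))
    return " ".join(out)

print(generate("the", 6))
```

Scaled up by many orders of magnitude (and with neural networks instead of a lookup table), the output is still a sample from a statistical model of the training data, which is why skewed or poisoned training data skews the output, and why fluent output can be confidently wrong.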
However, it is also possible to “poison” these systems on purpose by feeding them bad information or triggering the recall of biased information. This is an area of burgeoning study, especially within the security community. Given that the encodings derived in these large ML models cannot easily be reverse-engineered to understand precisely what causes certain decisions to be made (the problem addressed by “explainable AI”), there are significant concerns about inserting these systems into critical paths.

Second, these systems are not accountable in current practice and law. If a machine learning system (I’ll use that term, but cf. my second paragraph) comes up with an action that results in harm, we do not have a clear path of accountability/responsibility. For instance, who should be held at fault if an autonomous vehicle were to run down a child? It is not an “accident” in the sense that it could not be anticipated. Do we assign responsibility to the owner of the vehicle? The programmers? The testers? The stockholders of the vendor? We cannot say that “no one” is responsible, because that leaves us without recourse to force a fix of the underlying problems, without potential recompense for the victims, and without general awareness for the public. Suppose we use such systems in safety- or correctness-critical applications (and I would put voting, healthcare, law enforcement, and finance among the exemplars). In that case, it will be tempting for parties to say, “The computer did it,” rather than assign actual accountability. That is obviously unacceptable: we should not allow it to occur. The price of progress should not be to absolve everyone of poor decisions (or bad faith). So whom do we blame?

Third, the inability of much of the general public to understand the limitations of current systems means that any use may introduce a bias into how people make their own decisions and choices. This could be random, or it could be manipulated; either way, it is dangerous.
It could be anything from gentle marketing via recency effects and priming all the way to Newspeak and propaganda. The further toward propaganda we go, the worse the outcome may be. Who draws the line, and where is it drawn? One argument is, “If you train humans on rampant misinformation, they will be completely biased as well, so how is this different?” Well, yes — we see that regularly, which is why we have problems with QAnon, vaccine deniers, and sovereign citizens (among other problem groups). They are social hazards that endanger all of us. We should seek ways to reduce misinformation rather than increase it. The propaganda that is out there now is only likely to get worse when chatbots and LLMs are put to work producing biased and false information. This has already been seen (e.g., this story about deepfakes), and there is considerable concern about the harm this can bring. Democracy is intended to work best when the voters have access to accurate information. The rising use of these new generative AI systems is already raising the specter of more propaganda, including deep-fake videos.

Another problem with some generative systems (artwork, novels, programming) is that they are trained on information that may have restrictions, such as copyright. This raises important questions about ownership, creativity, and our whole notion of the rule of law; the problems of correctness and accountability remain. There is some merit to the claim that systems trained on (for example) art by human artists may be copying some of that art in an unauthorized manner. That may seem silly to some technologists, but we’ve seen lawsuits successfully brought against music composers alleged to have heard a copyrighted tune at some point in the past. The point is that the law (and, perhaps more importantly, what is fair) has not yet been conclusively decided in this realm. And what of leakage?
We’re already seeing cases where some LLM systems are ingesting the questions and materials people give them to generate output. This has resulted in sensitive and trade secret materials being taken into these databases…and possibly discoverable by others with the right prompting (e.g., this incident at Samsung). What of classified material? Law enforcement sensitive material? Material protected by health privacy laws? What happens for models that are used internationally when the laws are not uniform? Imagine the first “Right to be forgotten” lawsuits against data in LLMs. There are many questions yet to be decided, and it would be folly to assume that computing technologists have thoroughly explored these issues and designed around them. As I wrote at the beginning, there are potential good uses for some of these systems, and what they are now is different from what they will be in, for example, a decade. However, the underlying problem is what I have been calling “The Trek futurists” — they see all technology being used wisely to lead us to a future roughly like in Star Trek. However, humanity is full of venal, greedy, and sociopathic individuals who are more likely to use technology to lead us to a “Blade Runner” future … or worse. And that is not considering the errors, misunderstandings, and limitations surrounding the technology (and known to RISKS readers). If we continue to focus on what the technology might enable instead of the reality of how it will be (mis)used, we are in for some tough times. One of the more recent examples of this general lack of technical foresight is cryptocurrencies. They were touted as leading to a more democratic and decentralized economy. However, some of the highest volumes of uses to date are money laundering, illicit marketplaces (narcotics, weapons, human trafficking, etc.), ransomware payments, financial fraud, and damage to the environment. 
What valid uses of cryptocurrency there might be (if there are any) seem heavily outweighed by the antisocial uses. We should not dismiss warnings about new technologies out of hand, nor accuse those advocating caution of being “Luddites.” Indeed, there are risks to not developing new technologies. However, the more significant risk may be assuming that only the well-intentioned will use them.

  • Reflections on the 2023 RSA Conference
    by spaf@cerias.purdue.edu (spaf) on April 28, 2023 at 2:26 pm

    I have attended 14 of the last 22 RSA conferences. (I missed the last three because of COVID avoidance; many people I know who went became infected, contributing to making the conferences superspreader events. I saw extremely few masks this year, so I will not be surprised to hear of another surge. I spent all my time on the floor and in crowds wearing a mask — I hope that was sufficient.) I have blogged here about previous iterations of the conference (2007, 2014, 2016, and most recently, 2019). Reading back over those accounts makes me realize that little has really changed. Some of the emphasis has shifted, but most of what is exhibited and presented is not novel, nor does it address the root causes of our problems. Each year, I treasure meeting with old friends and making some worthwhile new acquaintances with people who actually have a clue (or two). Sadly, the people I stop to chat with who don’t have the vaguest idea about the fundamentals of the field or its history still constitute the majority. How can the field really progress if the technical people don’t have a clue about what is actually known in security (as opposed to what is known about the products in their market segment)? I was relieved not to see hype about blockchain (ugh!) or threat intelligence; those were the fads a few years ago. Apparently, hype around quantum and LLMs has not yet begun to build in this community. Zero trust and SBOM were also understated themes, thankfully. I did see more hardware-based security, some on OT, and a little more on user privacy. All were under-represented. My comments on the 2019 RSAC could be used almost word-for-word here. Rather than do that, I strongly suggest you revisit those comments now. Why did I go if I think it was so uninspiring? As usual, it was for the people. Also, this year, I was on a panel for our recent book, Cybersecurity Myths and Misconceptions.
Obviously, I have a bias here, but I think the book addresses a lot of the problems I am noting with the conference. We had a good turnout at the panel session, which was good, but almost no one showed up at the book signings. I hope that isn’t a sign that the book is being ignored, but considering that it isn’t hyping disaster or a particular set of products, perhaps that is what is happening. Thankfully, some of the more senior and knowledgeable people in the field did come by for copies or to chat, so there is at least that. (I suggest that after you reread my 2019 comments, you get a copy of the book and think about addressing some of the real problems in the field.) Will I go to the 2024 RSAC Conference? It depends on my health and whether I can find funds to cover the costs: it is expensive to attend, and academics don’t have expense accounts. If I don’t go, I will surely miss seeing some of the people I’ve gotten to know and respect over the years. However, judging by how few made an effort to find me and how the industry seems to be going, I doubt I will be missed if I am not there. That by itself may be enough reason to plan an alternate vacation.

  • Interview with Spaf at S4x23
    by spaf@cerias.purdue.edu (spaf) on April 10, 2023 at 9:58 pm

    If you didn’t get a chance to attend S4x23 to hear the talks, or you simply haven’t heard enough from Spaf yet, here is a recording of the keynote interview with Spaf by Dale Peterson. The interview covered a lot of ground: the nature of defensive security, the new Cybermyths book (got yours yet?), OT security, the scope of security understanding, having too much information, and having a good security mindset. This and other interviews and talks Spaf has given are on the Professor Spaf YouTube channel.

  • Serious CERIAS Recognition
    by spaf@cerias.purdue.edu (spaf) on March 30, 2023 at 9:37 pm

    At the 25th-anniversary CERIAS Symposium on March 29, we made a special awards presentation. Unfortunately, I had lost my voice, so Joel Rasmus read my remarks (included in what follows). I want to stress that these comments were heartfelt from all of us, especially me.

Twenty-five years ago, I agreed to start something new, something unlike anything that had existed at Purdue before. I soon discovered that it was unlike any other academic center others had encountered: a multidisciplinary center built around the concept of increasing the security and safety of people by addressing problems from, and with, computing. I note that I wasn’t the only faculty member involved. Core faculty at the time were Sam Wagstaff, Mike Atallah, and Carla Brodley, then in our School of ECE. Sam and Mike have been steady contributors for more than 25 years (stretching back to the pre-CERIAS, COAST days); as an Emeritus Professor, Sam is still working with us.

I knew I needed help making the new entity succeed. My first step was hiring some great staff: Andra Nelson (now Martinez) and Steve Hare were the first two new hires; the late Marlene Walls was already working for me. Those three played a huge role in getting CERIAS running and helping with an initial strategic plan. We have recognized them in the past (and will feature them prominently in the history of CERIAS when I get around to writing it). I quickly followed those hires by organizing an advisory board. Some of the members were personnel from the organizations committed to supporting us. Others were people in senior positions in various agencies and companies. And a few were friends who worked in related areas. Those choices seem to have worked out pretty well. CERIAS grew from four involved faculty in April 1998 to (as of March 2023) 163. We went from four supporting companies and agencies to two dozen. We have thousands of alumni and worldwide recognition.
There is considerable momentum for excellence and growth in the years to come. CERIAS has benefited from the counsel, support, and leadership of scores of wonderful people from strategic partner organizations who served on the External Advisory Board over the years. However, some particularly stand out because they went above and beyond in their efforts to help CERIAS succeed. On this special occasion of our 25th anniversary, we recognize six exceptional advisors who helped CERIAS succeed and be what it is today. (Unfortunately, due to various issues, none were present at the Symposium in person to receive the awards. This post is to share with everyone else how much we value their history with us.)

Silver Medals

We are bestowing five silver Foundation Award Medals on these individuals:

Dr. Sidney Karin. Sid was a founder of the National Supercomputer Center program and was the founder and director of the Supercomputing Center at San Diego. He was a pioneer in that field and has received numerous recognitions for his leadership in supercomputing and networking. Sid graciously volunteered his time and tremendous expertise to sit on our advisory board during our formative years, providing insight into structuring and running an academic center.

David Ladd. David was (and is) with Microsoft, (then) working in university support and cybersecurity. He volunteered for our board and served as one of the rotating chairs. He also organized strong support from Microsoft, ensuring we had equipment, guest speakers, and internships for our students. He was a voice for Microsoft and industry, but more importantly, a strong voice for practical research.

John Richardson. John was with the Intel Corporation and an enthusiastic supporter of CERIAS. He also served as one of the rotating chairs as a member of the EAB. John went above and beyond to help secure guest speakers, equipment, student internships, and other companies’ support.
He also put strong research and the welfare of the students ahead of his company’s interests.

Dr. Robert E. (Bob) Roberts. Bob was the Chief Scientist for the Institute for Defense Analyses (IDA), an FFRDC well-known to those in government. He provided great wisdom as a member of our EAB, including deep insights into understanding some conflicting requirements within the government. By training, he wasn’t a computer scientist, but his breadth of knowledge across many scientific disciplines helped us navigate many of our multidisciplinary issues.

The late Emil Sarpa, the Manager of External Relations at Sun Microsystems. Emil did not serve on the board, but he was constantly present, ensuring that CERIAS had every computing resource we could need from Sun Microsystems, including many items in pre-release. He made introductions in the industry and got our students into fantastic opportunities. His support began pre-CERIAS with one of the initial grants that started the COAST Laboratory, and he ensured that Sun was CERIAS’s biggest founding partner.

These five people provided assistance above and beyond what we expected, and we will be forever grateful.

Gold Medal

We had one final, special award. Timothy Grance has been a mainstay at NIST (the National Institute of Standards and Technology) for decades. You can find his name on many of the reports and standards NIST has issued, and on other computing and cybersecurity activities. He’s not as well known as many of our advisors because he prefers to provide quiet, steady contributions. Most importantly to CERIAS, Tim has great vision and is one of the rare people who can find ways to help others work together to solve problems. He is inspirational, thoughtful, and cares deeply about the future. These qualities have undoubtedly been useful in his job at NIST, but he brought those same skills to work for CERIAS at Purdue, and even before that as an advisor to COAST.
For the last 25 years, Tim was (and continues to be) an honored member of the External Advisory Board. He has attended countless board meetings and events over the years — all at his personal expense. He made introductions for us across a wide variety of institutions — academic, governmental, and commercial — and hosted some of the EAB meetings. He has always provided sage advice, great direction, and quiet support for all we have done. Despite being somewhat limited by a significant stroke a few years ago, he fought back courageously and returned to CERIAS for our Symposium and Board meeting. We reserve a chair for him even when he cannot travel to be with us. Tim’s commitment to the field, and especially to CERIAS, makes him a national treasure. We are proud also to consider him a CERIAS treasure, and thus award the Gold Foundation Award Medal to Timothy Grance.

Thank You

We conclude with sincere thanks, not only to these six wonderful people, but to all those who, over the years, have provided support, advice, time, equipment, funding, problem sets, and simply good cheer. That CERIAS has made it 25 years successfully and continues to grow and innovate is a testament to the importance of the problems and the willingness of such a large community to help address them. Time has only grown the problem set, but everyone associated with CERIAS is ready and willing to take it on. We all look forward to continuing our engagement with the community in doing so!

  • Malicious Compliance: A story
    by spaf@cerias.purdue.edu (spaf) on February 3, 2023 at 3:05 pm

    I recently saw an account of malicious compliance recounted on Reddit and quoted in a Mastodon thread:

“Not allowed to work from home, so I don’t. My job recently told me that even during the snowstorm we got earlier this week, I am not allowed to work from home at all. Even though I work in IT and do everything remotely, they want me in the office. So I deleted Teams and my email off my phone. I am no longer available after hours. My boss tried to call me for something urgent last night and couldn’t reach me. He asked why today, and I explained to him what I was told: I am not allowed to work from home.”

It prompted me to think of several instances where I have engaged in behavior that might be described as malicious compliance; I prefer to think of them as instances of “security compliance education.” Here’s one such instance that my students enjoy hearing about.

In 2000, we got some funding from a US federal agency (which will remain unnamed) to explore potential vulnerabilities in a commercial printer/copier combination. My technical point of contact (POC) told me that we didn’t need to file any reports until we had some results. Apparently, he didn’t convey this to the agency business person, because the contract specified a long, convoluted monthly report. I was forcibly reminded of this requirement a week after the contract was finalized, even though it was in the midst of the winter break and absolutely nothing had happened — or would happen, for at least another month. I grumbled a bit but compiled the report with basically “nothing to report” and “nothing spent” in the various sections and uploaded it via FTP to their designated site as a PDF.

Now, it is important to this story that my standard computers at the time were Sun workstations and Macintosh systems. Most of the research we did was on these systems, and our papers and reports were produced using LaTeX.
We avoided Windows because it was usually so buggy (blue screens) and so prone to security problems. We also avoided Word because (a) it was (and is) annoying, and (b) it was a common vector for computer viruses. Thus, my monthly report was produced using LaTeX. Two weeks into the semester, I got an email from some clerk at the sponsoring agency noting that the monthly report must be submitted as a Word document; the contract specified Word and only Word, and I must submit the report as a Word document, with no deviation allowed. I placed a call to my POC, and he indicated, apologetically, that he could not alter the terms, as they were standard for the agency involved: everyone had to abide by them. Grrrrr…. So, after a little thought,1 I produced the next monthly report in LaTeX as before. I produced a PDF of the report and printed it. Then, I scanned each sheet individually into a graphic file (.pic, as I recall). I then rebooted one of our Windows machines2 into MS-DOS and loaded the oldest version of MS Word I could locate. After consulting the manual, I created a document in which each page contained a single image — the corresponding scanned page of the report I had prepared. I saved it to disk (it was huge) and uploaded it to the sponsor FTP site. Yes, it was basically a huge file of graphic images, but it was technically a Word file. The next day, I got an automated response noting the submission. Three days later, I got an email asking if the report was what I had actually intended to upload. I responded that yes, it was. I indicated it had all the required information and was most definitely a Word document. I also alerted my POC about the upload (he was amused). A few days later, I got email from the original person who had complained about the PDF, now complaining that they were having difficulty with the file. I responded that the contract required Word, and that is what I used — I wasn’t responsible for their IT issues. 
In month 3, I went through the same procedure but didn’t have the email exchanges. Purdue then got an email from the agency business office stating that they were altering their standard business practices to allow all contractor reports to be submitted in Word -or- PDF. Would we mind submitting PDF henceforth? I briefly weighed the idea of continuing my production of Word versions of the report but decided that changing the business practices of a whole federal agency was enough. Footnotes: 1. Someone once asked me why I didn’t send them a Word document with some mischievous macros. I replied “18 U.S.C. § 1030” (that’s the Computer Fraud and Abuse Act). 2. Microsoft was a CERIAS partner at the time. When their rep visited, he saw that the lab was equipped with only Sun machines and Macintoshes. A few weeks later, we had several nice servers with Windows preinstalled delivered to the CERIAS lab. All our existing systems were named after mythical and fictional places (e.g., Amber, Oz, Dorsai, Uqbar), and we wanted to continue that scheme. We collectively decided to name the new machines Hel, Tartarus, and Niflheim. When he next visited and saw the machines, with nametags attached, he smiled a little. Two weeks later, we got another three, and they got related names; I can’t recall exactly, but I think they were Underworld, Mictlan, and Jahannam. At his next visit, he remarked he could send us a lot more machines. I said we’d find a home for them and would welcome the chance to engage more of our philosophy, history, and literature faculty in the process. All that said, we actually had a great working relationship with MS, and they hired a lot of our graduates. The machines did get a lot of use in experiments and classes.

  • Three podcasts with Spaf
    by spaf@cerias.purdue.edu (spaf) on December 15, 2022 at 4:02 pm

    If you haven’t reached your quota yet for hearing from Santa Spaf, here are three recent podcasts where I was interviewed on a variety of topics. One common theme: the role of people in cybersecurity. A second theme: some future trends. “How Today’s Technology Choices Could Shape Our Future,” with Caroline Wong, on the Cobalt.io podcast Humans of Infosec. “Cybersecurity Myths & Misconceptions: Avoiding The Pitfalls,” with Todd Fitzgerald, on the CISO Stories podcast. “They’re Young, Green, and Very Hackable,” with David Spark and Mike Johnson, on the CISO Series Podcast.

  • Spaf Interviewed About His New Book
    by spaf@cerias.purdue.edu (spaf) on December 13, 2022 at 3:18 pm

    In the 100th episode of CISO Stories: Discussion with Gene Spafford on some of the common cybersecurity myths and how to better cope with the changing environment. Join here. For those of you interested in more info on the book discussed in the podcast, see this InformIT site. If you preorder now, you can get a 35% discount with code CYBERMM. A longer info sheet is available here.

  • 2022 ISSA Honorees
    by spaf@cerias.purdue.edu (spaf) on August 9, 2022 at 3:58 pm

    Cybersecurity and privacy have several notable professional associations. Some, such as ACM, the IEEE Computer Society, and IFIP, are more generally about computing. One of the societies specifically directed to cybersecurity is the ISSA — the Information Systems Security Association International. ISSA promotes the development and standards of the profession, globally. Each year, ISSA recognizes individuals who have made significant contributions to the association and to the field overall. In prior years, both Professor Elisa Bertino and Professor Eugene Spafford have been recognized by ISSA: both have been inducted into the ISSA Hall of Fame, and Spaf has been named a Distinguished Fellow of the organization. ISSA has announced its 2022 honorees. Our congratulations to all these people for their accomplishments and this recognition! Of particular note, three of the honorees have spoken at CERIAS seminars and events: Matt Bishop was named to the ISSA Hall of Fame; Caroline Wong was given the ISSA President’s Award for Public Service; and Dale Meyerrose was named to the ISSA Hall of Fame. We also note the ISSA Education Foundation, which supports scholarships for students in the field. Two of those scholarships are in memory of individuals who were long-time friends of CERIAS, Howard Schmidt and Gene Schultz. The recent giveaway of Spaf’s coffee mugs raised over $1000 for those scholarships. We encourage others to consider contributing to the foundation to support worthy students. Also, the ISSAEF is an Amazon Smile participant, so that is a painless way for you to make ongoing donations (see the ISSAEF page for a link).

  • Get some CERIAS and Spaf Swag!
    by spaf@cerias.purdue.edu (spaf) on June 20, 2022 at 1:52 am

    [This opportunity is now closed. You can still donate to the listed charities, though!] Want to get some authentic CERIAS and Spaf swag? Read on! CERIAS offices are moving in a matter of weeks. We don’t want to have to box up everything, especially items we aren’t likely to use any time soon (if ever) at the new location. Plus, some of these things are items we’ve heard people might like to have for themselves. So…. We’re going to give some of it away! What’s the catch? Well, we want to encourage people to do something good for others. And, as an institute (CERIAS) at a university (Purdue) and a life-long educator (Spaf), we thought that helping deserving students get cybersecurity education would be the way to do that. Qualifying To get some of the swag, as listed below, you need to make a donation to one of these charitable scholarships no later than August 5th, and provide proof of the donation and amount. We’ll then package up your gifts and send them (we’ll cover shipping inside the United States; if you are outside the U.S. we’ll need to negotiate the shipping and any customs). What charities? Only some of the best for cybersecurity students, and all established in memory of some pioneers in the field: Rebecca Gurley Bace Scholarship ACSA/SWSIS You may donate by sending a check to: Applied Computer Security Associates, Inc. c/o David Balenson P.O. Box 1607 Olney, MD 20830-1607 Philippe Courtot/Gene Schultz/Howard Schmidt/Shon Harris Scholarships These are all administered by the ISSA Educational Foundation. You can donate online or by check; instructions are posted here. We’ll note here that these are also worthwhile for regular donations. As non-profits, there may be tax advantages to your donations. And be sure to check if your employer has a matching donation program! Swag We have established the donation. What’s the swag? While supplies last: From about 1995-2015, Spaf would collect coffee mugs from places he was invited to speak. 
This includes mugs from Facebook and Google to the NRO and NSA. The collection includes some from locations outside the U.S.A. as well. Currently, there are over 80 of these in the collection, plus about 20 CERIAS coffee mugs, including some of the rare 10th anniversary mugs (from 2008). CERIAS-branded items that we obtained to give as speaker gifts. We have only a few of each item left, including luggage tags, T-shirts, portfolios, umbrellas, jackets, and some electronic doo-dads. CERIAS/Spaf challenge coins! Some first-printing, never-opened copies of Web Security, Privacy and Commerce, 2nd Edition. If you get a copy, Spaf will autograph it for you! How to Get Some First, make a qualifying donation to one of the charities. Send proof of the donation, your address, and your shirt/hoodie/jacket size to: <spaf@cerias.purdue.edu>. You’ll get the listed items while supplies last. If we run out of an item, we will substitute an item of equal or better value.

    Minimum donation / Items shipped:
    $150: 2 of Spaf’s coffee mugs, plus a CERIAS challenge coin
    $200: An additional item of CERIAS-branded merchandise, plus 1 CERIAS mug
    $300: A copy of the book in addition to the above, plus an additional CERIAS item
    $500: All of the above, plus 2 additional coffee mugs, plus a CERIAS logo fabric briefcase or portfolio

    In the above, the items in each line include all the items in the previous rows. So, if you make a donation of $300, you will also get the items listed for $150 and $200. Remember, these are really all extra gifts. The real value is you making a donation to a worthwhile charity to help some deserving people study cybersecurity! Surprise Bonus! While cleaning out the storage closet, we found a dozen remaining Spaf bobbleheads. This is the last of this collector’s item! To get one, send us a check for a minimum of $100 made out to “Purdue University” with “Donation to CERIAS” on the memo line. (And, to be clear, Purdue University is also a non-profit entity.) 
Send the check with your return address to: Bobblehead c/o Shawn Huddy CERIAS — Purdue University 656 Oval Drive West Lafayette, IN 47907-2086

  • CERIAS is on the move!
    by spaf@cerias.purdue.edu (spaf) on June 13, 2022 at 9:28 pm

    In May of 1998, Purdue chartered CERIAS — the Center of Education and Research In Information Assurance and Security — as a campus-wide, multidisciplinary center for the (at the time) new field of cybersecurity. CERIAS grew out of the COAST Laboratory in the CS department. Our original core of a half-dozen faculty spanned several of the departments and colleges at Purdue and thus warranted a university-level institute. The Recitation Building As part of its commitment to the new center, Purdue University renovated most of the 2nd floor of the Recitation Building on the central campus for CERIAS. The 2nd floor was originally classrooms. We moved into the redone space in early 1999. The space included a conference room, a small library, a small kitchen and lounge, offices for 10 faculty and staff, and a half-dozen shared offices for grad students. We also had two dedicated rooms on the 4th floor — one a protected machine room, and one a lab. The space in REC has served us well since then. We were located near the CS, ECE, and CNIT departments, in a building with great character, including flooring made from bird’s-eye maple planks. We also had to cope with some of the idiosyncrasies of an older building, including cranky HVAC and leaky pipes. (Recitation was originally completed in 1923.) CERIAS grew into a world-renowned entity with over 150 associated faculty across campus and many hundreds of their students. The Convergence Center Over the last few years in particular, Purdue overall has prospered, with increasing prestige and growing enrollment. Last year, Purdue had over 50,000 students enrolled at the main campus! Having frozen tuition for a decade has undoubtedly helped make a Purdue education even more attractive, despite increasingly rigorous admission standards. There has been an associated boom in new buildings, with over two dozen new dormitories, laboratories, and co-working spaces for collaborations with companies and national labs. 
This has included construction of the Discovery Park District. Last year, as part of a master planning process, the university administration decreed some reorganization. Many administrative and academic programs are moving to accommodate growth, place related groups near each other, and make better use of space. That includes CERIAS! As of August 15, we are bidding adieu to Recitation. For the following 4 months, we will be virtual as our new “galactic headquarters” is being finished. Meanwhile, our space in Recitation will be renovated into offices and meeting space for the Dean of Students. In January, we will be moving into our new offices and lab space on the 3rd floor of the Convergence Center on campus. Convergence is part of the public-private partnership idea that Purdue has been promoting over the last few years. It is a building owned and operated by a private company, located on campus and housing several university departments as well as industry offices. It was completed in 2020 and presents exciting new possibilities for our next 25 years. Our new space will be bright and airy, with lots of windows. We’ll have more offices, lots of work spaces for students, several labs, and multiple meeting rooms. We’ll also have dedicated space for co-location of researchers from some CERIAS partners (with Sandia National Labs the first such partner). If you want to visit us between August 15 and January 15, let us know and we’ll find a room on campus to meet with you. After January 15, come visit us at our new offices! Our new address will be: 101 Foundry Dr STE 3000 West Lafayette IN 47906-3446 During and after the transition, we expect our phone numbers and email addresses to be unchanged. Our web pages will continue to be active. Our phones will forward to wherever we are during the build-out phase, so you will be able to reach us as always. While we’re at it, mark your calendars for March 28 & 29 — the annual CERIAS Symposium. 
We’re celebrating our 25th anniversary and you are all invited!

  • The more things change….
    by spaf@purdue.edu (Prof. Spafford) on April 3, 2022 at 3:27 pm

    Last week was our 23rd CERIAS Symposium. It was a great event, thanks to great speakers and lots of behind-the-scenes work by the wonderful staff. We have developed a history of some outstanding presentations and interactions. Next year we will be celebrating the 25th anniversary of the founding of CERIAS (it will be the 24th symposium because we didn’t have one the first year). I hope we can continue the streak of great presentations and events, but given the tremendous community we have, I’m sure that will be the case. During the breaks, I ran into several former students, including one who graduated 28 years ago. I heard wonderful stories about what they’ve been doing in their lives since then, and how their experience at Purdue with COAST and CERIAS helped set them up for success. That is really gratifying to hear; teachers always like some affirmation that they didn’t screw up too badly! I was going to write up a blog post here about that — no doubt prompted by my last post about the workshop 22 years ago — then I vaguely recalled having written something like that a while back. After some looking, I found it in my personal blog (it was before we established this blog): Some Thoughts on Lifetime Achievement. That has mostly aged well, and I could make most of the same general comments today. I continue to be pleased that my former students are happy and productive. And although I am still sure I will be forgotten in 100 years (heck, a lot of people try mightily to forget me today), I am confident that what I helped start as education and awareness in this space will continue to make a difference through the good works of those whose lives we touched here at Purdue. Also, I’m still not done yet. I have 5 Ph.D. students in various phases of completion plus two books underway with ideas for more, and I hope to get all those things finished before I think seriously about voluntary retirement. 
However, given the state of reality and current events, voluntary may not be the only route…. I may have to spend more time looking through things I wrote over the last 30 years to see how (and if) some of my thinking has evolved. That makes two items from the archives that I had dimly remembered and that seem relevant now. But I will note that in 11 years I have never found a use for my AARP card that my AAA membership didn’t also provide (e.g., hotel discounts).

  • Who Says You Can’t Predict the Future?
    by spaf@purdue.edu (Prof. Spafford) on March 29, 2022 at 8:13 pm

    While preparing to introduce today’s keynote (Dr. David McGrew) at the 23rd CERIAS Symposium, I was reminded of an exercise in crystal-ball gazing. Every December, various people publish lists of their top predictions for the coming year. Some are thoughtful, and others simply risible. The track record is often quickly forgotten. However, what of an effort by real experts and visionaries to make some bold predictions for a decade hence? Many people have repeatedly claimed that such a thing is impossible for cybersecurity – the field moves too quickly, innovation disrupts truisms, and biases complicate the mix. Here, I present at least one worked example that proves it could be done – and was. In 1992, the COAST Laboratory was started. Around 1996, Cisco became a corporate partner with COAST, providing equipment and funds for student scholarships. When CERIAS emerged from COAST in May 1998, Cisco stepped up as a founding sponsor. This included not only continuing financial support but also increased researcher involvement. In 2000, another CERIAS partner at the time, Accenture, agreed to cosponsor a workshop at their St. Charles conference center. The workshop would be organized by CERIAS and was to focus on making some “bold” predictions for the next decade. We were to identify some “visionaries” who could participate and discuss the future. I (Spaf) identified some people I knew were deep thinkers, some of whom were not yet widely known in cybersecurity. I invited them, and Accenture added a few of their own senior staff. These people went on to build significant reputations in the field. (I’d like to claim it was because they participated in the workshop.) 
The visionaries who attended, and their affiliations at the time: Whit Diffie (Sun Microsystems), Becky Bace (Infidel), Howard Schmidt (Microsoft), Phil Venables (Goldman Sachs), David McGrew (Cisco), Dan Geer (@Stake), John Clark (Accenture), Dan Deganutti (Avanade), Glover Ferguson (Accenture), Anatole Gershman (Accenture), Mike Jacobs (NSA), Fred Piper (University of London/Royal Holloway), John Richardson (Intel), Marv Schaefer (BWAP), and Spaf (Purdue CERIAS). An impressive group, in hindsight; fairly impressive in 2000, too! I won’t recapitulate the whole workshop report, which you can read if you wish. However, I will summarize what we saw as the top 10 trends for cybersecurity in 2000:

    1. The EverNet: Billions of devices proliferate that are always on and always connected.
    2. Virtual Business: Complex outsourcing relationships extend trust boundaries beyond recognition.
    3. Rules of the Game: Government regulation increases as lawmakers react to real losses that hurt.
    4. Wild Wild West: International criminals exploit lack of cooperation and compatibility in international laws.
    5. No More Secrets: Privacy concerns will continue to compete with convenience and desire for features.
    6. Haste Makes Waste: “Time to Market” increases pressure to sacrifice security and quality of software.
    7. Talent Wars: Lack of security skills will compound weaknesses of delivered solutions.
    8. Yours, Mine or Ours: Identifying intellectual property and information ownership will become key areas of debate.
    9. Web of Trust: Standard security architectures and improved trust will spur eCommerce growth.
    10. Information Pollution: Information exploitation becomes more lucrative than hacking.

    I remember when the report came out, it was dismissed by some in industry as “too pessimistic.” Perhaps because the “visionaries” weren’t all well known, the conclusions were largely ignored. Looking back on the list, I’d say we scored at least 90%, especially for the decade that followed. 
Both #3 and #10 took a little longer to manifest, but we were on target with all ten. You can apply some hindsight bias now and say they were all obvious, but that really wasn’t the case in fall 2000. The iPhone was 6 years away from introduction, and the Motorola StarTAC CDMA phone was effectively the state of the art. Wireless was basically defined by the recent release of 802.11a/b. Internet penetration was less than 6% of the world’s population (it is over 66% now, in early 2022). At the time of the workshop, Facebook and Twitter were years away from creation, and Google was a small search engine company less than 3 years old. Ransomware had been described theoretically but would not become prominent for several years. Interestingly, the action items the group defined are still relevant, and notable perhaps in how they are still not practiced widely enough:

    Improve Software Quality: Focus on improving the quality and assurance of software. Prevent distribution of weak software with security exposures. Conduct research to find better methods for designing and developing higher-quality software.
    Invest in Training and Awareness: Develop a sound educational program that focuses on security and ethics. Focus resources throughout the educational spectrum. Teach respect for electronic boundaries. Develop comprehensive curriculum to educate our next generation.
    Implement Best Practices: Incorporate baseline safeguards and practices. Use best practices to ensure security is done right in development, implementation, testing, business processes, and consumer practices.
    Initiate Public Debate: Initiate public debate on identification, ownership protection, use of personal information, and responsible use of computing.
    Advocate Holistic Approach: Advocate and pursue a well-rounded and proactive approach to the overall problems: business, social, technical, and government. 
    Package Security Architectures: Encourage packaging of basic security architectures with standard services that integrate with applications and infrastructure. Group photo. Update One of the workshop participants informs me that the workshop was held in late September 2000. The report is copyrighted 2001, which is why I thought it was held that year; unfortunately, I no longer have my appointments calendar from that time, so my initial posting indicated 2001. His recollection is strong and is likely correct. I have corrected the dates in the entry above to reflect this.

  • It’s been 30 years—time to celebrate!
    by spaf@purdue.edu (Prof. Spafford) on March 2, 2022 at 7:55 pm

    Prologue In 1975, the illustrious Dorothy Denning received her Ph.D. from Purdue’s CS Department. Thereafter, she became an assistant professor, and then an associate professor in 1981. Her most notable advisee was Matt Bishop, who graduated with his Ph.D. in 1984. Around 1980, Dorothy initiated a graduate class in cryptography, CS 555, using her book Cryptography and Data Security. That class is still taught today (with regular updates), perhaps making it the longest-running cybersecurity class in academia. In 1983, Sam Wagstaff, Jr. (now a professor emeritus) joined the Purdue CS faculty as an expert in cryptography and algorithms. In 1988, Eugene Spafford joined the Purdue CS faculty with expertise in software engineering and distributed systems; Spaf also had a long-standing interest in information security, but not as an academic concentration. (Both Sam and Spaf have taught CS 555 over the years.) Most of the academic research around the world in the 1970s and 1980s into what later became known as “cybersecurity” was focused on formal methods, authentication models, and cryptography. Some security research arose as a secondary concern in work on operating systems, databases, and architecture, but it was not a particularly distinct topic area in classes or academic research. There were only 2 or 3 universities with any identifiable expertise in the overall topic area, outside of cryptography and formal methods of software development. COAST The Cuckoo’s Egg incident in 1986 and the Internet Worm in 1988 helped generate a great deal of interest in more applied security. Spaf was involved in both, most notably in the Worm incident. Subsequent growth in hacking and malware incidents brought increased interest, including some funding for research. Early Purdue successes included the release of COPS (developed by Dan Farmer under Spaf’s direction) and the publication of Practical Unix Security, co-authored by Spaf and Simson Garfinkel. Both brought attention to Purdue. 
Increased student interest in computing security coursework and external funding from companies and government agencies led to Spaf and Sam establishing the COAST Laboratory within the CS department in the fall semester of 1991. The CS department provided a room for the lab and student office spaces. Four companies made generous donations to equip the lab initially: Sun Microsystems, Bell Northern Research, Schlumberger, and Hughes Laboratories. The name COAST was suggested by Steve Chapin, one of Spaf’s Ph.D. students. It is an acronym for “Computer Operations, Audit, and Security Tools,” reflecting the more applied focus of the group. Steve was the first Ph.D. graduate from the lab, in 1993. In the next few years, COAST became notable for a number of innovative and groundbreaking projects, including the Tripwire tool, the IDIOT intrusion detection system by Kumar, vulnerability classification work by Aslam and Krsul, the first-ever papers describing software forensics by (individually and as a group) Krsul, Spafford, and Weeber, discovery of the lurking Kerberos 4 encryption flaw by Dole and Lodin, the firewall reference model by Schuba, and the first online (ftp, gopher, and www) repository of cybersecurity tools; a remnant of that repository with many historical artifacts is available online. Many other people also contributed to notable successes, some of whom are noted below. In 1992, COAST began to host a regular seminar series of local and invited speakers. That seminar series continues to this day; there is an archive of talk descriptions (from 1994 onwards) and videos (from late 1999 onwards). The series has featured a veritable “Who’s Who” of people in cybersecurity research, industry, and government. The series continues to attract viewers worldwide, and the entire collection is available for free viewing. 
Despite the growing interest, in 1997, when Spaf testified before the House Science Committee, there were only three identified academic centers other than Purdue’s. Shortly thereafter, continued growth and faculty involvement led to the transformation of COAST into the campus-wide institute CERIAS, in May of 1998. That will be the topic of a later post. For now, however, congrats to all the people who contributed to the founding and growth of COAST – celebrating its 30th anniversary this academic year! Where are they now? A number of students completed their degrees and worked in COAST, most under the direction of Professor Spafford. Here are a few of them:

    Steve J. Chapin; PhD; Lead Cyber Security Researcher, Lawrence Livermore National Laboratories.
    Sandeep Kumar; PhD; Staff Engineer, VMware, CA.
    Christoph Schuba; PhD; Senior Security Architect, Apple Computer.
    Ivan Krsul; PhD; President, Arte Xacta (La Paz, Bolivia).
    Sofie Nystrom; MS; Director General at Norwegian National Security Authority.
    Saumil Shah; MS; CEO and Founder, Net Square.
    Aurobindo Sundaram; MS; Head of Information Assurance & Data Protection at RELX.
    Taimur Aslam; MS; CTO at Broadstone Technologies.
    Steve Weeber; MS; IP Architect at Windstream Communications.
    Bryn Dole; MS; Self-employed, and co-founder of both Topix and Blekko.
    Steve Lodin; MS; Senior Director, IAM and Cybersecurity Operations at Sallie Mae.
    Mark Crosbie; MS; Dropbox Data Protection Officer.
    Jai Balasubramaniyan; MS; Director of Product Management, ColorTokens, Inc.
    Katherine Schikore; MS; Software Developer, SAS Institute.
    Gene Kim; BS; Author, Researcher, Speaker, and co-founder of Tripwire, Inc.
    Todd O’Boyle; BS; AWS Consultant.
    Keith Watson; BS; Director of Threat Management, Optiv, Inc.
    Lucas Nelson; BS; Partner at Lytical Ventures, LLC.
    Tanya Crosbie; BS; Owner, Giggles & Smiles Photography.

  • Reflecting on 30 years
    by spaf@purdue.edu (Prof. Spafford) on January 14, 2022 at 6:14 pm

    One of my students sent me a web link (in the story, below). It caused me to reflect a little on the past. Here is some text I shared on a few social media feeds. 30 years ago, when I started COAST (which became CERIAS) at Purdue, we identified a need for personnel trained in information security. There was no academic degree program at the time, so we started one. We reached out to over a dozen other universities to help build their programs. Today, many of the existing programs in the US (and some elsewhere) trace back to what we started; they have Purdue grads as their prime movers. Now, three decades later, look at the #1 best job according to US News. We still have a huge shortfall of people working in the field, but that is a result of many factors, including a “leaky pipeline,” not nearly enough support of students from underrepresented groups (including women), and market failure for secure-by-default systems. I am sure my self of 3 decades ago would be astonished by the growth of the field, yet disappointed that we still have some of these problems. And I would definitely be surprised that CERIAS now has over 120 associated faculty and many hundreds of students involved in research, and a half-dozen degree programs in this space. This is the 30th anniversary of the founding of COAST. I hope I’m around to see what the 50th and beyond hold!
