A social engineering project in a computer security course.
Endicott-Popovsky, Barbara; Lockwood, Diane L.
ABSTRACT
A small private university began to offer undergraduate and
graduate courses in computer security during the academic year 2002-2003
within the schools of computer science and business. In the introductory
computer security course, a "social engineering" team project
was included as a required assignment. This article briefly summarizes
the social engineering literature, describes the project assignment and
learning objective, provides actual student sample deliverables, and
presents results of a follow-up student survey on the experience. The
lessons learned from this effort should prove useful to other
universities and instructors contemplating similar coursework.
INTRODUCTION
A woman, ostensibly from the human resources department, calls the
company help desk and says she has forgotten her password. In a panic,
she adds that if she misses the deadline to submit employee insurance
applications online, all employees will be without health insurance
until the problem can be corrected, adding that she might even be fired
for this. The help desk worker feels sorry for her and quickly resets
the password--unwittingly giving a hacker entrance into the corporate
network. The hacker got the names of human resources employees from the
company's recycling bin the previous night. This caper is known as
social engineering. Social engineering is essentially a con job used to obtain information or access to systems that are normally restricted to privileged users (Mitnick, 2002). Social engineering is the human side (i.e., "wetware" in hacker slang) of breaking into a corporate network. Organizations with elaborate firewalls, authentication processes, virus-scanning software, and network security monitoring technology are "still open to an attack if an employee unwittingly gives away key information in an email, by answering questions over the phone with someone they don't know," by not shredding sensitive documents, or even by talking about a project with coworkers at a restaurant (Gaudin, 2002b).
Kevin Mitnick, the famous convicted computer hacker, offered advice to businesses afraid that corporate spies and hackers may gain access to their internal systems through social engineering, saying that "on the corporate side, as an employee, it all comes down to user awareness and education" (Savage, 2003).
Courses in computer security predominantly discuss the technical
side of security (e.g., encryption, network security defenses,
firewalls, software reliability, digital certificates, wireless
eavesdropping, biometrics), but often give short shrift to the human
side of security--especially social engineering. The purpose of this
article is to describe a social engineering student project that was
undertaken to increase student awareness of this serious security
vulnerability. The lessons learned from this effort should prove useful
to other universities and instructors contemplating similar coursework
(Vaughn & Boggess, 1999).
DESCRIPTION OF SOCIAL ENGINEERING ASSIGNMENT
Students in a graduate MBA business class on Computer Security were
given a reading assignment from Kevin Mitnick's book, The Art of
Deception (Mitnick, 2002), to learn what is meant by social engineering.
With that background, they were asked to develop an exploit, using
information gleaned from any open source (e.g., telephone directories, dumpsters, wastebaskets, online information, and any other
publicly available information), against some specific target person on
campus. They were prohibited from actually impersonating anyone, such as campus police, since impersonating a law enforcement official is a criminal offense. They were also prohibited from contacting
the target "mark" directly, or actually executing their
exploit.
To bound and control this assignment, student activities were
confined to local campus personnel and campus security was informed to
prevent any misunderstandings. Students were instructed to carry a copy
of their assignment (see Appendix A) at all times in the event they were
confronted; however, they were warned that getting caught would result
in a significant deduction of points! The "target mark" was not to be contacted about either the exploit or the information gained. In
addition, all confidential information was to be deleted (lined out)
from the final assignment and any confidential documents discovered were
to be submitted along with their assignment so that the instructor could
supervise proper disposal.
Please note that this assignment was set up as an object lesson in
ethics, as well. The ethical dimensions of social engineering were
discussed extensively before the experiment was introduced and students
were asked to sign a lab policy statement that acknowledged their
understanding of the "acceptable use" constraints established
for this and other experiments conducted in class (See Appendix B).
Students were informed that the campus security office was acquainted
with the experiment and the constraints under which it would be
conducted. Results were monitored and all materials discovered during
this activity were collected and destroyed. The instructors ensured that the experiment remained under their control and supervision.
The deliverable for this assignment was an oral and written report that described the information obtained, its source(s), and how the team could use the information discovered through these covert means to exploit their target. To be certain that students stayed within the bounds of the assignment, the instructor required weekly progress reports until the due date. Students were given four weeks to complete the exercise. The major objective of the assignment was to
increase student awareness of the human side of computer security
vulnerabilities. Two sample student team deliverables are presented in
Appendix C.
SAMPLE ASSIGNMENTS
In one assignment, the student team interviewed administrative assistants on campus, claiming to be sociology students doing a class assignment on how people construct passwords. They expressly stated that they did NOT want participants to reveal their actual passwords; rather, participants were to describe the algorithm used to derive them. For example, one respondent indicated that she used abbreviations of various states, of course excluding any that were over six characters long. Obviously, it would not be hard for a hacker to subsequently crack her password using this information.
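To see why describing such a rule is nearly as damaging as revealing the password itself, consider a minimal sketch in Python (an editorial illustration, not anything the students ran): assuming, purely for illustration, that the rule yields U.S. state names of six characters or fewer tried in a few common casings, and that an attacker has obtained an unsalted SHA-256 hash of the password, a dictionary attack over that tiny candidate space finishes almost instantly.

import hashlib

# Hypothetical illustration only (not part of the student exploit): the
# candidate space implied by the respondent's rule is assumed here to be
# U.S. state names of six characters or fewer, tried in a few common casings.
CANDIDATE_STATES = ["Alaska", "Hawaii", "Idaho", "Iowa", "Kansas", "Maine",
                    "Nevada", "Ohio", "Oregon", "Texas", "Utah"]

def crack(target_hash: str) -> str:
    """Return the candidate whose SHA-256 digest matches target_hash, or ''."""
    for state in CANDIDATE_STATES:
        for candidate in (state, state.lower(), state.upper()):
            if hashlib.sha256(candidate.encode()).hexdigest() == target_hash:
                return candidate
    return ""

# Example: suppose a leaked, unsalted hash corresponds to the password "oregon".
leaked = hashlib.sha256(b"oregon").hexdigest()
print(crack(leaked))  # prints "oregon" after at most a few dozen guesses

Even a salted or deliberately slow hash offers little protection in this situation, because the candidate space itself contains only a few dozen guesses.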
Another enterprising group of graduate students had an even more
elaborate plan. They gained enough information on a professor in the
department to steal her identity and devise a plan to apply for a
tenured professor's position at a prestigious university, assuming
her credentials. They found the "easy" documents (e.g., her professional resume and list of publications from the school's website), but more alarming was their ability to obtain official school
transcripts, manufacture a letter of recommendation from her department
head, and research sufficient confidential, personal information to pose
successfully as her.
A third group raised a red flag when they reported that deeds of
trust were recorded in our county, complete with social security
numbers! That information about one of the school's faculty
members, plus a traffic ticket that the target had discarded in a wastebasket, unshredded (traffic tickets contain the driver's license number of the person cited), is enough to get a replacement driver's license and to open credit card accounts, which is exactly what their exploit suggested they would do! Their specific exploit is described in Appendix C. Note that it is written in the first person, as the hacker! They came up with three different exploits they could run with the information they had gathered, generally making Mr. X's life miserable! The oral and written assignments that students gave in class were graded using the instructor evaluation forms found in Appendix D.
STUDENT FOLLOW-UP SURVEY RESULTS
Students in the class were asked two questions in a follow-up
survey taken after the Social Engineering assignment was turned in.
1. To what extent did the social engineering assignment increase
your awareness of the human side of computer security? (Circle One)
None Very little Somewhat A lot
1 2 3 4
2a. To what extent has the social engineering assignment changed your behavior concerning computer security (e.g., "I change my password more frequently; I'm more careful about shredding sensitive documents")?
None Very little Somewhat A lot
1 2 3 4
2b. In what specific way(s) did you change your behavior (please describe):
In response to the first question, students indicated that the assignment increased their awareness of computer security "A lot" (avg. = 3.8). However, in response to the second question, students indicated that the assignment changed their behavior concerning computer security between "Somewhat" and "A lot" (avg. = 3.5). The finding that reported "behavioral" change lagged "awareness" change may be explained by the fact that a few students already had work experience or training in computer security and were already practicing secure behaviors; hence, the assignment did little to change their subsequent behavior.
Ways in which students reported changing their behavior included buying a shredder and shredding anything with their names on it before discarding it; continuously checking for security updates for their PC software; taking more care with their computer passwords and computer access; and hardening their personal PCs with firewall, anti-virus, and authentication software. A small number of students indicated they were already taking precautions with their computer systems at home, but for most, this exercise was eye-opening. One student even indicated that they had since initiated procedures at work to properly handle the computer of an employee who was terminated; another indicated they were much more wary of cold calls from people seeking personal information over the phone.
SUMMARY AND CONCLUSIONS
This article described a social engineering assignment that can be
used to increase student awareness of, and change their personal
behavior toward, the human side of computer security vulnerabilities. A
forthcoming paper (Part II) will provide a much more developed pre-post
test questionnaire and analysis.
A similar social engineering assignment could also be given in
actual computer security training programs within organizations that are
trying to increase security awareness. After all, even Kevin Mitnick
(2002) agrees that the best way to prevent such attacks is to increase
employee awareness through training and education followed up by
unannounced fire drills or penetration tests. Results of these tests
should then be communicated to all employees and rolled into ongoing training sessions on computer security.
APPENDIX A: SOCIAL ENGINEERING ASSIGNMENT SCHOOL OF BUSINESS
ASSIGNMENT 1 SOCIAL ENGINEERING ECIS-591 COMPUTER SECURITY
INSTRUCTOR: BARBARA ENDICOTT-POPOVSKY
ASSIGNMENT DESCRIPTION
You will be working in teams of 3 to 5 students. You will be
gathering enough information to be able to impersonate some person on
campus in a social engineering exploit:
Name:
Office number, office phone number:
Department, department manager:
Something personal, like a meeting they attend this week, something that gives someone the sense that you are the person you are impersonating, etc.
You can use any open source: telephone books, dumpsters, wastebaskets, online information.
YOU MUST NOT:
Call or talk to anyone other than team members
Impersonate anyone to get this material (impersonating a police officer is a crime!)
Contact the individual you are targeting
APPENDIX B ECIS591 COMPUTER LAB USE POLICY
1. The following Seattle University IT policy will be in effect,
which can be found at: http://www.seattleu.edu/it
2. Seattle University acceptable computer use policy will be in
effect, which can be found at:
http://www.seattleu.edu/it/policies/default.asp
3. The ECIS591 General Responsibilities for All Students policy will be in effect and can be found on the first page of this syllabus.
4. Due to the special nature of computer security, certain additions to the above policies will be in effect; they are the following:
Where appropriate, every experiment run in conjunction with this
course will have certain rules and regulations regarding its conduct.
These will be explained when the assignments are given and students are
expected to comply with any additional restrictions.
To the extent that classroom computers are used to stage attacks under controlled circumstances, they will be physically disconnected from all external networks. All student users must maintain this isolation and verify it (with instructor help) before running any malicious code or exploit.
Students may be allowed to attempt to run harmful software and
obtain root access on classroom computers isolated from the network, as
long as the students in question agree to fix any problems they cause
(e.g., hardware-damaging code).
Security flaws and other problems should be pointed out immediately
to the instructor.
Any student running an exploit in connection with assignments in
this class must file an Exploit Approval form with the instructor,
before running any malicious code or attempting any exploit on any
classroom computer.
Students are responsible for the consequences of any actions they
take without the knowledge of the lab instructor.
I hereby certify that I have read and understand this policy and
the relevant Seattle University policies referenced above regarding
computer lab use and will abide by them.
Signature: Printed Name: Student ID No: Date:
APPENDIX C SOCIAL ENGINEERING EXAMPLE ASSIGNMENT
The following are several exploits that Group 3 developed after obtaining the target's social security number. Note: these exploits were NOT actually carried out; they are only a frightening scenario of what could happen.
THE EXPLOIT
We, the social engineering experts of the MBA 591 Computer Security
class, exploited Mr. X in the following ways:
SCENARIO 1: IDENTITY THEFT
We applied for a credit card in the name of Mr. X, using his SSN,
birth date, address, and a fabricated mother's maiden name. (It
didn't matter that our request for credit was not approved; the
information we submitted was added to Mr. X's credit report because
the credit agencies merge everything into the database entry of the
target person based on matching name and address.)
With that and our dumpster diving, we had enough information in place to obtain a credit report. We used the address and fax number of a non-franchise postal services shop called Mailmart, which has a physical address instead of a P.O. box, and had the clerk forward our mail on a weekly basis, as a precaution, to the address of a neighbor whose mailbox is in a location that's not visible from the street. I know this person's work schedule and check the mailbox before the rightful owner does.
We got a credit-reporting agency to give us "our" credit
report. We were actually able to obtain the credit report through a Web
interface. We didn't even need our mailing address!
Right away we were able to examine the credit report. It contained the target's actual ("primary") social security number and other information, including other cards, accounts, assets, and credit history. We were able to stock our meth lab and party for the rest of the month. Based on "his" purchase records, Mr. X may expect a little visit from the Seattle Police Department--I would love to see him try to explain this one.
SCENARIO 2: THEFT & DEFACEMENT OF SCHOOL PROPERTY
We called the Help Desk and, assuming the target's identity, requested a password change. The user IDs at SU are all the first 7 characters of the last name followed by the first initial. I anticipated they might ask for my faculty ID number (which I had after calling the Registrar's office and asking for "my" faculty ID number), but they didn't even ask for that--they reset it to "password" for me. "Oh, and by the way," I said, "could you reset my SU Online password? It was the same as my other one." They obliged and I was off to the races.
I logged into the faculty portion of SU Online. I didn't change the grades of everyone else in Mr. X's class too dramatically (a B+ became an A-, a C became a B), but my C became a straight A for "Access".
SCENARIO 3: DEFAMATION OF CHARACTER
Let's just say I don't agree with Mr. X's politics.
I successfully changed "my" password for the second time a couple of weeks later at the Help Desk; this time they double-checked by
asking for my SSN, which of course was no problem. I logged into
"my" email account, and sent a bulk email to all the members
of the 12th Avenue Parking Workgroup. I didn't want it to be deleted as spam, so I started out in a way that would be mistaken for Mr. X, but the message got more and more insulting and vulgar.
I didn't have to imagine the shock and dismay of the 12th
Avenue Parking Workgroup because many responded to my email right away!
Good luck with the re-election Mr. X!
APPENDIX D ORAL AND WRITTEN GRADING CRITERIA
Student Social Engineering Assignments were graded according to rubrics presented in this Appendix. The written report earned a percentage grade as follows:
Thoroughness of preparation 30%
Depth of research 25%
Quality of exploit 25%
Class presentation 10%
Documentation 10%
TOTAL 100%
The oral presentations were graded using this additional rubric:
Clear grasp of major issues 20%
Quality of presentation 20%
Appropriate analysis, evaluation 20%
Demonstrated ability to employ course concepts 20%
Organization and logical flow 20%
TOTAL 100%
REFERENCES
Arbaugh, B. (Feb 2002). "Security: technical, social, and legal challenges." Computer, v35n2, p. 109-111.
Castelluccio, M. (Dec 2002). "Social engineering 101." Strategic Finance, v84n6, p. 57-58.
CERT Incident Note IN-2002-03 (March 19, 2002). "Social
engineering attacks via irc and instant messaging."
http://www.cert.org/incident_notes/IN-2002-03.html
"Electronic highwayman--a modern myth?" (June 1997).
Anonymous author. Accountancy, v119n1246, p. 49.
Garceau, L. (Feb 1997). "The threat of social
engineering." Ohio CPA Journal, v56n1, p. 42-44.
Gaudin, S. (July 24, 2002a). "Tough computer crime bill clears hurdle." www.internetnews.com/bus-news/article.php/1107521
Gaudin, S. (July 24, 2002b). "Social engineering: the human side of hacking." cin.earthweb.com
Hayes, F. (July 29, 2002). "Can't buy security."
Computerworld, v36n31, p. 54.
Lemos, R. (July 24, 2002). "Mitnick teaches social
engineering." ZDNet News US at
http://news.zdnet.co.uk/story/0,,s2080227,oo.html
McClure, S. & Scambray, J. (May 3, 1999). "When looking
for holes in network security, first give a call to your helpdesk and
users." Infoworld, v21n18, p. 48.
Mitnick, K.D. & Simon, W.L. (2002). The Art of Deception. New
York: Wiley Publishing.
Savage, M. (Apr 28, 2003). "Former hacker Mitnick details the
threat of social engineering." CRN, no.1043, p. 58.
Schneier, B. (2000). Secrets and Lies: Digital Security in a
Networked World. New York: Wiley Publishing.
Schwartau, W. (Feb 2, 1998). "Anatomy of a friendly
hack." Network World, v15n5, p. 35-41.
Taubes, G. (Mar/Apr 1996). "The code snatchers."
Worldbusiness, v2n2, p. 44-45.
Ulfelder, S. (Aug 11, 1997). "Spycatcher." Computerworld,
v31n32, p. 80-81.
Vaughn, R. Jr. & Boggess, J. (Dec 30, 1999). "Integration of computer security into the software engineering and computer science programs." Journal of Systems and Software, v49n2/3, p. 149-153.
Barbara Endicott-Popovsky, Seattle University
Diane L. Lockwood, Seattle University