[EAS] People and Logic Bombs

pjk pjk at design.eng.yale.edu
Fri Dec 20 15:20:21 EST 2002



(from NewsScan Daily, 20 December 2002)

SAFE & SOUND IN THE CYBER AGE: BY CHEY AND STEPHEN COBB
      In this week's column on computer security issues, Chey and Stephen Cobb discuss what happens... WHEN THE LOGIC BOMBS.
      Remember that movie, the one where the computer guy gets mad at the boss, so he quits his job, but not before creating a secret program that later attacks the company's computers? In fact, there have been a bunch of movies featuring some variant of this plot, and for good reason: such things actually happen.
      This week a former system administrator for UBS PaineWebber, Roger Duronio, was arraigned in a New Jersey federal court on charges of sabotaging two-thirds of the company's computer systems. His alleged motive? To undermine the company's stock price and make a bunch of money in the process. He is alleged to have "shorted" over 30,000 shares of UBS stock prior to unleashing his attack, which means he stood to make 30,000 times the amount by which the stock dropped when the media got wind of the attacks. In the recent stock manipulation case involving Emulex, shares fell 50 percent. Based on the trading range of UBS PaineWebber stock at the time of Duronio's alleged attack, it is reasonable to say his profits could have exceeded half a million dollars.
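The short-sale arithmetic behind that estimate is simple: gross profit is the number of shares shorted times the per-share price drop. The $17 figure below is purely an assumption for illustration; the article does not give actual UBS PaineWebber prices.

```python
def short_profit(shares: int, price_drop_per_share: float) -> float:
    """Gross profit from covering a short position after the price falls.

    Ignores borrowing fees and transaction costs.
    """
    return shares * price_drop_per_share

# 30,000 shares shorted; a hypothetical $17/share drop clears half a million.
print(short_profit(30_000, 17.0))  # 510000.0
```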
      The flaw in Duronio's alleged scheme was the obviously unexpected ability of UBS PaineWebber to prevent news of the attack from getting out. This was quite a feat on the company's part, because the logic bombs activated on about 1,000 of its nearly 1,500 computers and the malicious programs did actually delete files. Indeed, the company says the attack cost it $3 million.
      These days, newer forms of malicious programming, such as viruses and worms, tend to vie for our attention, but the logic bomb, dormant code that is later activated or triggered by specific circumstances, is one of the oldest forms of computer attack, dating back to mainframe days. For example, in September 1987, Donald Burleson, a programmer at USPA, a Fort Worth-based insurance company, was fired for allegedly being quarrelsome and difficult to work with. Two days later, approximately 168,000 vital records erased themselves from the company's computers. Burleson was caught after investigators went back through several years' worth of system files and found that, two years before he was fired, he had planted a logic bomb that lay dormant until he triggered it on the day of his dismissal.
      Burleson became the first person in America to be convicted of "harmful access to a computer." This week, the federal grand jury charged Duronio with one count of securities fraud and one count of violating the Computer Fraud and Abuse Act. If found guilty, Duronio could be hit with up to 20 years in prison and fines of more than $1.25 million. Earlier this year, Timothy Allen Lloyd was sentenced to 41 months in prison for leaving behind malicious programs that deleted critical data from the servers of Omega Engineering, a high-tech measurement company that claimed the cost of the attack was $10 million.
      How can companies defend against such attacks? Some executives may bridle at our answer, but we think it is the right one: by hiring the right people and then treating them right. In other words, this is a people problem, and so it needs a human solution. All the technology in the world is not going to prevent an insider, with authorized system access and detailed knowledge of the system, from planting a logic bomb. There are some technologies, such as network surveillance and monitoring programs, that might detect attempts to create logic bombs. Integrity-checking software might deflect attacks from logic bombs. Properly enforced software development policies and procedures will make it harder for someone to plant a logic bomb. But the bottom line is that a determined insider is almost impossible to stop.
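The integrity-checking idea mentioned above can be sketched simply: record a baseline of cryptographic hashes for the files you care about, then periodically rehash and compare, flagging anything added, removed, or modified, such as a newly planted program. This is a minimal illustration under our own naming, not a substitute for a production tool.

```python
import hashlib
from pathlib import Path

def snapshot(paths):
    """Map each file path to the SHA-256 digest of its contents."""
    return {str(p): hashlib.sha256(Path(p).read_bytes()).hexdigest()
            for p in paths}

def changed(baseline, current):
    """Paths added, removed, or whose content hash differs between snapshots."""
    added_or_removed = set(baseline) ^ set(current)
    modified = {p for p in baseline.keys() & current.keys()
                if baseline[p] != current[p]}
    return sorted(added_or_removed | modified)
```

In practice the baseline itself must be stored somewhere the insider cannot rewrite, which is the hard part of this defense.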
      On the other hand, it is fairly easy for other humans to spot a disgruntled insider. We've seen numerous cases of insider system abuse where the identity of the culprit came as no surprise, at least to co-workers, if not supervisors or managers (there's a good chance that a supervisor who didn't see it coming wasn't doing his or her job -- especially the "visor" part). So, before your company spends money on technology to cut down on insider system abuse, take a look at morale and working conditions. Talk to the people who have the skills and access to mount this sort of attack. And read the landmark 1993 paper on the subject by our colleague Dr. Mich Kabay: "Psycho-Social Factors in the Implementation of Information Security Policy" (Risks Digest). You may save some money, and you may even save the company.
      [Chey Cobb, the author of Network Security for Dummies, is an independent consultant (www.cheycobb.com) and a former senior technical security advisor to the NRO. She can be reached at chey at patriot.net. Stephen Cobb, the author of Privacy for Business: Web Sites and Email, is Senior VP of Research and Education for ePrivacy Group (www.eprivacygroup.com). He can be reached at scobb at cobb.com.]






More information about the EAS-INFO mailing list