Posted: 07/08/2012 in Uncategorized

by Richard Batka

First, it should be said that all power is basically the same: the power that makes things run at work is the same power that runs things at home. At home you use 120 V single-phase low-line power [120 V × 24 A = 2,880 VA, or about 2.9 kVA].
At the data center you use 208 V single-phase high-line power [208 V × 24 A = 4,992 VA, or about 5.0 kVA]. Pushing higher voltage is more energy efficient and can actually deliver more power.
Data centers can also be supported with something called three-phase power. In this case 208 V is pushed through 13.8 A × 3, which provides 8,611 VA, or about 8.6 kVA. Three-phase power is becoming more popular because of its ability to deliver power efficiently.
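The arithmetic above can be sketched in a few lines; the function names are my own, and the current ratings are simply the article's example values:

```python
# Apparent power (VA) for the circuits described above.

def single_phase_va(volts: float, amps: float) -> float:
    """Apparent power of a single-phase circuit."""
    return volts * amps

def three_phase_va(volts: float, amps_per_phase: float, phases: int = 3) -> float:
    """Apparent power summed across phases, as computed in the article."""
    return volts * amps_per_phase * phases

home = single_phase_va(120, 24)          # 2880.0 VA, about 2.9 kVA
data_center = single_phase_va(208, 24)   # 4992.0 VA, about 5.0 kVA
three_phase = three_phase_va(208, 13.8)  # 8611.2 VA, about 8.6 kVA
print(home, data_center, three_phase)
```

Note that delivering the same 208 V at a lower per-phase current across three phases yields substantially more total power than the single-phase circuit.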

by Randy Naramore

Internet threats have played a major role in the research and analysis of malware and viruses, leading companies like Symantec to sponsor research and development of malware-fighting tools to better equip consumers to protect themselves from these threats. These consumers are better informed of the dangers of the Internet and, as a result, are ever conscious of the hackers who want to steal their information in order to commit fraud. The malware scene is highly mature, yet its "releases" usually have extremely short lifecycles and quickly become part of a family of variations. The ones with the longest lifecycles tend to dominate a higher proportion of the Internet's infected population, and these very same pieces of malware are the ones written for gain, be it intellectual or financial.

by Udi Shamir

Some HIPS (host-based intrusion prevention systems) use proprietary sandboxes that try to emulate a malware sample's instructions and determine whether it is malicious. When the antivirus runs, it first verifies whether the binary is packed; if so, it tries to unpack the binary, load it into memory, and begin disassembling its code. Antivirus vendors receive thousands of binaries each day from shared feeds and deployed honeypots, as well as suspicious binaries that their heuristics engines have flagged as Trojan.Generic on endpoint clients. Each new sample is processed in both static and sandboxed environments, tracing the binary's API calls and comparing its stubs against a database that already holds millions of known (signed) binary segments inspected before. A signature consists of hashing and whole-file scanning, which is compared against a dictionary of known viruses already recognized by the antivirus engine. Each antivirus vendor uses its own signature algorithm, though most follow the same concept. To better illustrate the process of producing an antivirus signature, I used the ClamAV antivirus engine to produce a very basic signature.
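As a rough illustration of the hash-based part of that process: ClamAV's simplest signature format (a `.hdb` entry, the same thing `sigtool --md5 sample.bin` emits) is just `MD5:filesize:SignatureName`. The sample bytes and signature name below are invented for demonstration, not a real malware sample:

```python
# Sketch of a basic ClamAV .hdb hash signature: "MD5:filesize:Name".
import hashlib

def clamav_hdb_line(data: bytes, name: str) -> str:
    """Build one .hdb entry: hex MD5 digest, byte length, signature name."""
    digest = hashlib.md5(data).hexdigest()
    return f"{digest}:{len(data)}:{name}"

# Illustrative stand-in for a suspicious binary.
sample = b"MZ\x90\x00 pretend PE header for demonstration"
print(clamav_hdb_line(sample, "Example.Test.Signature"))
```

Saved into a `.hdb` file, a line like this is enough for `clamscan` to flag any file whose MD5 and size match; it also shows why such signatures break as soon as a single byte of the binary changes.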

by Bob Johnston

Many sites display advertisements but do not constantly change them. These sites are a bit safer, yet do not insulate you from drive-bys; you are simply less likely to encounter one. On the other hand, those that constantly present varying advertisements without any interaction by the user present the greatest threat. Envision a user having five URLs defined in their homepage, one of which is the corporate site and the other four media sites delivering local, state, national, and international news. Virtually all media sites worldwide present changing advertisements, so in this example you have four sites providing a continuous variety of advertisements, any one of which has likely been compromised. It is only a matter of time before drive-by malware strikes. A tip for malware examiners trying to determine which site caused the redirection to the malware site: when it comes to drive-by malware, this is often impossible, because the responsible site may have been opened in the browser several days earlier and neither the browser nor the system was shut down in the interim. If the source site cannot be identified in close proximity to the date/time of the malware being installed, more than likely it simply cannot be determined.

by Yury Chemerkin

A Twitter timeline collects a stream of Tweets in real-time order, with the newest updates at the top; it is what you land on when you view your homepage.
Types of Tweets: Normal Tweets look as shown in picture 1 and represent a message of no more than 140 characters that appears on the sender's page and timeline, and on the timelines of other profiles permitted to see the sender's updates according to their privacy settings. Note that such a tweet never appears on someone else's profile unless it is retweeted. Mentions look as shown in picture 2 and represent the same kind of message, but include another user's username preceded by "@", placed after at least one word in the message, e.g. "This @yurychemerkin is a mention for...". Mentions usually appear among the public tweets on the sender's profile, or on someone's timeline if that person follows the sender. In addition, mentions can be found in the recipient's Mentions and Interactions tabs, which are accessible only to them. Like normal tweets, a mention never appears on someone else's profile unless that person wrote the message.

by Remus Ho

The filtered traffic shows the communication between the smartphone and the WhatsApp server via the "Request" and "Response" packets. Locate the packet that reads "Request:" (Figure 3). This indicates the packet used to send out WhatsApp text messages or data.
These packets are the target information you want to funnel out. Upon inspecting the "Request" packet, the test message "Testing" and the recipient's cell phone number are clearly shown (Figure 4). This verifies that it is unwise to send private details (e.g. your Social Security number, username, password) over WhatsApp. Large "Request" packets are likely to contain photographs or videos. Upon closer scrutiny, a URL starting with "https" and ending with ".jpg" is observed (Figure 5). Accessing this URL in a web browser uncovers the photo sent by the user, stored on a server hosted by the WhatsApp developers.
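The "funneling out" step for media links can be sketched as a simple scan of raw captured payload bytes for HTTPS URLs ending in ".jpg". The payload below is a made-up stand-in for a sniffed "Request" packet, not actual WhatsApp protocol data:

```python
# Extract media URLs from a raw packet payload, as described above.
import re

def extract_jpg_urls(payload: bytes) -> list:
    """Return every https URL ending in .jpg found in a packet payload."""
    text = payload.decode("latin-1", errors="replace")
    return re.findall(r"https://\S+?\.jpg", text)

# Invented example payload resembling a captured "Request" packet.
packet = b"Request: to=+6591234567 body=Testing media=https://mms.example.net/d/f/abc123.jpg"
print(extract_jpg_urls(packet))
```

In practice you would feed this function the reassembled payloads exported from your capture tool rather than a hand-built byte string.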

by Mervyn Heng

HTTrack Website Copier is an open source tool to download an entire website from the Internet onto your desktop for offline browsing. It is Windows software that spawned WebHTTrack, its Linux/Unix/BSD release. The tool dumps and mirrors the complete contents of the source website you specify to a local directory, replicating the exact directory structure, files, and links.
This is beneficial for a security practitioner who wants to perform offline security testing against a website without impacting the server hosting it.
Install WebHTTrack on Ubuntu 10.04 by entering the following command in your Terminal:

sudo apt-get install webhttrack

The $35,000,000,000 Problem
by Drake

To zero in on this instance, the same basic attack method, SQL injection, was used repeatedly. SQL injection isn't new. Imperva.com estimates that it has been part of 83% of successful attacks since 2005. This has resulted in 312,437,487 data records lost to hacking, with about 262 million records from various breaches including TJMax, RockYou, and Heartland; all of these incidents involved SQL injection attacks. Thirty-five billion dollars doesn't seem to have solved this problem, nor has all the money spent in previous years. Think of all the legal liabilities that leaves open, the technical debt that organizations carry in terms of unfixed flaws in code, and the losses, documented or otherwise, to consumers.
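The attack class, and the fix that has existed all along, can be shown in a few lines using Python's built-in sqlite3 module. The table, account, and payload here are invented for the demonstration:

```python
# String-built SQL is injectable; a parameterized query is not.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, secret TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', 's3cret')")

payload = "nobody' OR '1'='1"  # classic injection string

# Vulnerable: attacker-controlled input concatenated into the statement.
injected = conn.execute(
    "SELECT secret FROM users WHERE name = '" + payload + "'"
).fetchall()

# Safe: the driver binds the value, so the quote is just data.
parameterized = conn.execute(
    "SELECT secret FROM users WHERE name = ?", (payload,)
).fetchall()

print(injected)        # leaks the secret row
print(parameterized)   # no rows match the literal string
```

The vulnerable query returns every user's secret because the injected `OR '1'='1'` makes the WHERE clause always true, while the parameterized version treats the entire payload as a literal name and matches nothing.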

by Kelly R. Kohl

Rootkits are computer programs that allow a cyber attacker to covertly take control of a computer and use the compromised machine to commit crimes. Rootkits can also be destructive to a compromised computer, for example by deleting information from the hard drive. The purpose of this research study was to evaluate and analyze a range of open source, free (non-fee-based) rootkit detection programs. This study evaluated existing research on rootkits, how rootkits operate, and which rootkit detection programs had been previously evaluated. Additionally, this study conducted testing on nine open source rootkit detection programs, aiming to determine how effective they were at detecting and removing a rootkit. Based on the findings, three of the nine programs performed well at identifying rootkits, and only one program succeeded at removing two of the three rootkits. All of the tested programs failed to identify the FU Rootkit; this failure is attributed to the evolution of that rootkit, which allowed it to go undetected.
