Recently some people working for a client of mine expressed the sentiment that they felt that their business wasn't a target for an actual hacker (as opposed to automated attacks). This despite the fact that they had been attacked on two different occasions in a manner that indicated it was the same (thankfully clueless) attacker. Also, the company in question is doing business in a field that seems especially ripe for the proverbial plucking; a lot of money is being made by virtually every player there. One would think that security would be a bigger issue for these folks, but apparently the message hasn't fully landed everywhere.
This got me thinking about endpoint security and how incredibly understated (and often underestimated) the need for security is on these machines. In many companies they form the largest group of machines on the network, owned and operated by the least technically skilled and most security-unaware users in the company, yet most companies treat the protection of these systems as an afterthought. "Just install AV, Jimmy. That'll do!" they say, and turn back to tweaking their firewalls (if you're lucky).
At the same time, an attacker simply lures gullible users to a specially crafted malicious website or sends out a mass mailing of an infected PDF. Despite having been told thousands of times not to open attachments from people you don't know (or attachments you don't expect), you just know that someone will do it anyway. And really, all it takes is a single user taking leave of their senses to create a backdoor into your network. I would also like to point out, because this thought seems to float around a lot, that no amount of Group Policy settings will change that outcome. What you need is user awareness and proper endpoint protection.
Considering the above and observing the evolution of the purpose behind botnet malware, it becomes clear that the shift is financially motivated. A few years ago botnets were used mostly for DDoS attacks, but since then there has been a steady move toward monetary gain. From basic DDoS, botnets were deployed to make money through click-advertisement programs and studies of surfing behavior. After that came the theft of financial information, often leading to credit card fraud and identity theft. Currently we're seeing the re-emergence of ransomware, where user data is held hostage until the user pays a certain amount before a deadline. If they don't pay, their data is lost forever.
The criminals involved (often organized crime) seem to be refining their strategy. Where they once made relatively small amounts with a large number of systems they now aim to make a larger amount per system. Essentially they realized that there is a Monetary Value per Owned System, and by becoming more efficient they are raising that value per system to maximize profits.
This idea swam around in my head for a while. What would I do to make the most money? If the idea is to squeeze the most cash out of each system, then we should be looking for the systems that have the most potential cash to be stolen. For me, this ruled out the average internet user. You'd have to be very lucky to stumble onto a rich and clueless target; there just aren't that many around. Also, how would you know that your target is actually wealthy?
The answer was simple: companies. Companies usually have deeper pockets than the average internet user, and the ways to exploit them are myriad: extortion, data theft, corporate espionage, credit card fraud; you name it. There's another upside to this approach: most companies deploy their workstations through imaging. That often means that if one workstation is vulnerable to a certain attack, chances are good that the other workstations in the network are too. More targets mean more potential access to the information I'd want. Also, in most cases the users of said workstations are a lot less motivated to be secure; it's not their workstation and it's not their money.
Following this logic, the future of corporate security looks grim. Workstations are a hell of a lot more tempting a target than any server; they are easier to crack and there's a lot more of them. Administrators need to realize that attackers (both real and automated) won't attack the shield you hold up, but rather go after the target behind the shield in any way possible. This means that the hard-shell/soft-interior methodology in securing a network is dead, and actually has been so for quite some time.
Endpoint protection will remain the name of the game, and what software vendors are doing right now isn't working. It's a failing approach, something that's becoming increasingly obvious with each new report of a major breach. A change needs to be made before organized crime realizes its full potential.
Have you ever found yourself in a situation where you have been given the task of writing a security policy or procedure, but you don't want your document to end up like so many others – gathering dust in some forgotten drawer? Here are some thoughts that might help you…
The steps I’m about to present to you are designed based on my experience with various kinds of clients, large and small, government or private, for-profit or non-profit – I find these steps applicable to all of them. Actually, these steps are applicable to any kind of policies and procedures, not only those related to ISO 27001 or BS 25999-2.
1 Study the requirements
First you have to study the various requirements very carefully – is there legislation that requires something to be put in writing? Or maybe a contract with your client? Or some other high-level policy that already exists in your organization (perhaps a corporate standard)? And of course the requirements of ISO 27001 or BS 25999-2, if you want to comply with those standards.
2 Take into account the results of your risk assessment
Your risk assessment will determine which issues you have to address in your document, but also to which degree – for instance, you may need to decide whether you will classify your information according to its confidentiality, and if so, whether you need two, three or four levels of confidentiality.
This step may not be relevant in this form if your policy or procedure is not related to information security or business continuity. However, risk management principles are applicable to other areas as well – quality management (ISO 9001), environmental management (ISO 14001), etc. For instance, in ISO 9001 you have to determine to what extent a process is crucial for your quality management and decide accordingly whether to document it.
3 Optimize and align your document(s)
An important thing to consider is the total number of documents – are you going to write ten 1-page documents or one 10-page document? It is much easier to manage one document, especially if the target group of readers is the same. (Just don’t create a single 100-page document.)
Moreover, you have to be careful to align your document with other documents – the issues you are defining may already be partially defined in another document. In that case, it may not be necessary to write a new document; it may be enough to expand the existing one.
If you are writing a new document about an issue that is already mentioned in another document, be sure to avoid redundancy – to describe the same issue in both documents. Later it would become a nightmare to maintain those documents; it’s much better that one document makes a reference to another, without repeating the same stuff.
4 Structure your document
You also need to take care that you observe your corporate rules for formatting the document – you may already have a template with pre-defined fonts, headers, footers, etc.
If you already implemented ISO 27001 or BS 25999-2 (or any other management standard), you’ll need to observe a procedure for document control – such a procedure defines not only the format of the document, but also the rules for its approval, distribution etc.
5 Write your document
The rule of thumb is – the smaller the organization and the smaller the risks, the less complex your document should be. There is nothing more useless than writing a lengthy document no one is going to read – you have to understand that reading a document takes time, and the level of a reader's attention is inversely proportional to the number of lines in your document.
One good technique to overcome the resistance of other employees to this document (no one likes change, especially if it means something like an obligation to change passwords on a regular basis) is to involve them in writing or commenting on this document – this way they will understand why it is necessary.
6 Get your document approved
This step is rather self-evident, but its underlying importance is this – if you are not a high-ranking manager in your company, you won't have the power to enforce the document.
This is why someone in such a position has to understand it, approve it, and actively require its implementation. Sounds easy, but believe me – it is not. This step (and the next one) are where implementation most often fails.
7 Training and awareness of your employees
This step is probably the most important, but sadly it is one that is very often forgotten. As mentioned before, employees are tired of constant changes, and they surely won’t welcome another one especially if it means more work for them.
Therefore, it is very important to explain to your employees why such a policy or procedure is necessary – why it is good not only for the company, but also for themselves.
Sometimes training will be necessary – it would be wrong to assume that everyone possesses the skills to implement new activities. For you, who wrote this document, it may seem easy and self-evident, but for them it may seem like brain surgery.
End of story?
If you thought you’ve reached the end of your document-implementation story, you’re wrong – the journey has just begun. It is not enough to have a perfect policy or procedure that everyone just loves, you also need to maintain it.
Someone has to take care that this document is kept up to date and improved, or else no one will observe it anymore – and that someone is usually the person who wrote it. Not only that, someone has to measure whether the document has fulfilled its purpose – again, that may be you.
As you may have noticed reading this article, it is not enough to have a nice template for a successful policy or procedure – what is needed is a systematic approach to its implementation. And in doing so do not forget the most important fact: the document is not an end in itself – it is only a tool to enable your activities and processes to run smoothly. Don’t let the opposite happen – that such a document makes these activities and processes run with more difficulty.
Cross posted from ISO 27001 & BS 25999 blog - http://blog.iso27001standard.com
I am very excited to have been invited to participate in the community here at itgrcforum.com. My professional passion is finding the sweet spot where security and compliance work as enablers of the business rather than impediments. I look forward to sharing my thoughts on practical ways that security and compliance professionals can become that kind of asset to their business. What follows is a revised version of a piece I wrote in 2010 for my own site, but I believe it will nicely introduce you to my take on security, compliance and business.
I look forward to hearing your thoughts and questions.
Compliance Leads to Security Breaches - Maturing from Compliance to Security
How IT’s compliance mindset would look in another setting:
“Hey, there’s a fly in my soup!” - patron
“Let me take care of that for you sir” – waiter, as he reaches into the soup and pulls out the fly
“Well, the soup looks fine now. Thank you.” – patron, as he digs in
In the world of Information Security, compliance rules with an iron fist. For InfoSec professionals in the health care industry, data must be stored and secured according to HIPAA guidelines. For those in finance, GLBA rules. For those who handle credit card info, PCI-DSS. For all public companies, SOX is king. The specific rules for these industries differ but the consequence of failure to comply is the same across them all. If you do not follow the compliance rules for your industry you will receive fines, and eventually be put out of business.
InfoSec professionals can make a very nice career for themselves by becoming well versed in the specifics of a data protection regulation. Companies spend billions of dollars a year to achieve compliance with the standards governing them. Certainly nobody can blame them for striving to achieve compliance. We cannot do business without it. But does compliance mean we're secure?
The most high-profile hacks in recent history were performed against PCI-compliant systems. The Heartland fiasco happened at a company that could put a check in every box on a PCI checklist. That didn't prevent the breach, nor the countless others before and since. So what were these companies doing wrong?
When you set your goal at “achieving compliance,” whether with PCI, HIPAA, ISO 27001 or any other standard, you are settling for “good enough.” You are using someone else's bare-minimum standard of acceptability as your end goal.
Compliance will never bring security. No checklist or audit, regardless of how many agencies approve it, can account for all the ways vulnerabilities can strike in your specific environment. No governing body can foresee the ways your organization will need to defend itself in the future. As long as compliance is your end goal, security will never be achieved. Much like the fly in the soup, your organization may look clean, with some nasty surprises waiting under the surface.
Compliance forces you to forever work in a reactive mode. While the main objectives of our industry-standard regulations do not change often, the specific checklist items that auditors look for frequently do. As auditors add new requirements to their lists, you are continually forced to react and build, or buy, bolt-on solutions that will get you through yet another audit finding but can't get to the heart of your vulnerabilities.
Finally, compliance leads to security breaches. Or more accurately, when an organization aims for compliance rather than security, vulnerabilities are the eventual outcome. Data protection standards are notoriously slow in incorporating new safeguards to defend against new hacker techniques. Those who do the bare minimum to achieve compliance will be among the first to become victims of new zero-day attacks. Attackers also know what compliance requires, and so when we simply do the minimum to achieve compliance, those attackers have a fairly detailed outline of what our defenses consist of.
Organizations that focus on security will inherently achieve compliance. If you consider security throughout a system's lifecycle, continually run risk assessments internally (formal or informal), and allocate sufficient resources to security initiatives, compliance with regulations becomes much less daunting. Drive security into systems as early and as integrally as possible.
Just as we won’t settle for having the fly removed from our soup, we should not settle for security policies that just get us through an audit successfully.
From Network World: "As enterprises approach a high level of maturity in their IT governance, risk and compliance (GRC) programs, they face a conundrum: How can they effectively implement and manage policies and their supporting controls to maintain a strong risk posture? To add to the difficulty, the environments they manage are often widely distributed and subject to multiple regulatory requirements and internal audit requirements, and must adapt to changing business needs. GRC tools are designed to help.
"It's mostly about the maturity of the organization," says Paul Proctor, vice president of security and risk management at Gartner. "Are you ready for a more formalized and automated way of tracking controls? If you have your act together, you should be looking at this."
Note: This post was originally written on September 3rd, 2010 and published on ArgentConsulting.nl.
In the past I've always said that the Dutch government needs to do more in the area of Cyber Warfare / Cyber Security because there didn't seem to be too much going on. Our Defence department didn't post anything about starting up a Cyber Command, nor was there any government activity to be seen. However, though it wasn't easy to find, there does finally appear to be some movement on the horizon.
During a meeting about the 2010 Defence budget, members Knops (CDA), Voordewind (CU) and Eijsink (PvdA) established that there was no mention of Cyber Warfare in the budget. Noting that Cyber Warfare is an issue of great concern, they submitted motion 32 123x nr. 66 (in Dutch) to start interdepartmental development of a Cyber Security Strategy and to urge The Netherlands to start actively participating in NATO initiatives on the subject.
In a letter by the Minister of Defence (again in Dutch), Eimert van Middelkoop acknowledges that rapid developments in technology have also led to certain threats such as cyber crime and cyber warfare. He describes what is understood by the term Cyber Warfare and how it relates to his department, along with how various other ministries also have responsibilities regarding cyber security issues.
A brief overview:
Minister van Middelkoop asserts that commercial parties also have a role to fulfill in the development and implementation of a cyber security strategy, with which I can only wholeheartedly agree. The next paragraph of this most clarifying letter confirms the existence of the Defence department's own CERT (DEFCERT), and its responsibilities for defending its networks. In a separate letter he mentions that DEFCERT is growing and is expected to be fully operational in 2012.
Probably the most important information that can be obtained from this letter is in the final paragraph. It contains The Netherlands' intentions in this area, which resemble those of Great Britain:
Compared to what has previously been released by the Dutch government on this topic, it's a lot of information that suddenly became available. As a concerned Dutch citizen, I am very happy to see that this threat is finally being addressed. With our dependency on technology growing every day, cyber security will continue to grow in importance along with it. If we do not work towards creating a safer cyberspace now, the consequences could be dire.
The sheer volume of electronically stored documents often seems to obscure the actual business data stored on information systems. Digital forensics and electronic discovery (e-discovery) procedures encompass the full spectrum of digital information; in the legal community, electronic data is known as “electronically stored information” (ESI). The sheer volume of documents, presentations, spreadsheets and similar electronic analogs of paper documents has spawned a huge need to collate and analyze data. The “paperless” office has, in this sense, produced a blizzard of electronic documents for analysis. Amid this blizzard of standard-format electronic documents, the actual contents of the various information systems are often underappreciated. This should not be so. Information systems, whether custom or packaged, are an important source of original raw data about a business. Abstracted documents, whether memoranda or invoices, are derivative forms based upon that raw information.
Recently, I published “Digital Forensics and E-Discovery on OpenVMS,” about how OpenVMS system managers should prepare for the need to deal with requests for digital data, specifically data in formats not understood by mass market-based procedures. This is not an OpenVMS-specific problem; the same problem is found on any computer system using software that is not in the “Top-200” list whose formats are included with major digital forensics and e-discovery packages. The problem exists on all systems: mass market systems including Microsoft’s Windows family, Apple's OS X, and all of the UNIX variants, including Linux as well as enterprise-class systems such as HP's OpenVMS and IBM's z/OS. Many applications across all systems and file systems on the enterprise-class systems are outside the capabilities of standard forensic and e-Discovery packages. This should be unsurprising.
There is an almost limitless population of applications software used today. Some applications are mass-market, appealing to a broad swath of the market. Others are niche applications that may be extremely popular within a particular industry or sector. While these applications may be popular within that industry group, they may be all but unknown outside of it (e.g., Mathworks' MATLAB).
Even more specific is applications software and business systems implemented specifically for an individual enterprise. While many such systems are variations on a theme, presumptions can be extremely misleading. Mass-market systems (e.g. Microsoft's Word and Excel) identify a least common denominator and are aimed at a wide market; their data formats and representations attempt universality and are correspondingly transportable between firms. Software developed for in-house use is contrastingly developed specifically for the needs of the individual organization as they are perceived at the time the software was implemented. This is a significant difference.
Viewed from a business records perspective, the conclusion is almost inescapable: Records stored in a custom format are as relevant as the corresponding records stored in a mass market electronic format (e.g., QuickBooks) or in hardcopy ledger books. Concluding the contrary would be absurd.
It follows that some of the most critical data stored within an organization will be stored in files whose format and organization is not within the decoding powers of standard software suites used in electronic discovery or digital forensics.
Consider the single record illustrated below:
Smith John 11401 33 17
Taken in isolation, it is not possible to determine the meanings of the individual fields within this record. The text could be the first and last names of a person (or vice versa); the numeric values could mean any number of things, from ZIP (Postal) Codes to arbitrary indicators. Without the full context, including the application programs, related data files, and other information, conclusions can be difficult or misleading.
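The ambiguity is easy to demonstrate. In the hypothetical sketch below, the same record is decoded under two invented layouts – both the field names and their assignments are pure assumption, and nothing in the record itself can arbitrate between them:

```python
# Hypothetical sketch: one raw record, two assumed layouts.
# Both layouts are invented for illustration; the record itself
# carries no information about which (if either) is correct.

raw = "Smith John 11401 33 17"
fields = raw.split()

# Interpretation A: a personnel record
# (last name, first name, ZIP-code prefix, age, years of service)
layout_a = dict(zip(["last", "first", "zip", "age", "tenure"], fields))

# Interpretation B: an inventory record
# (clerk's first name, last name, part number, bin, quantity)
layout_b = dict(zip(["first", "last", "part_no", "bin", "qty"], fields))

# The same token "11401" is a ZIP prefix in one reading
# and a part number in the other.
print(layout_a["zip"])      # 11401
print(layout_b["part_no"])  # 11401
```

Only the surrounding context – the application programs, related files, and the people who used the system – can say which decoding, if either, reflects reality.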
Some of this data is maintained by the underlying operating system; other information is contained within the file but is not normally visible. Both classes of information are often referred to as “metadata.” Producing information in its native form is intended to preserve all of the metadata associated with the information. Native formats, including metadata, are preferred – a fact recently noted by US District Court Judge Shira A. Scheindlin in National Day Laborer Organizing Network v. U.S. Immigration and Customs Enforcement Agency.[1]
Understanding the meanings of data and metadata can be challenging. The intelligence community has long been familiar with this inherent dilemma. One of the most famous examples is the message sent to First Air Fleet commander Vice Admiral Chuichi Nagumo, IJN, and other Imperial Japanese Navy commanders prior to the attack on Pearl Harbor:
“NIITAKA YAMA NOBORE 1208,”
rendered in English as
“Climb Mt. Niitaka 1208.”
In retrospect, this is clearly the execute order for the opening of hostilities between the Japanese Empire and the United States, a fact noted by a translator more than four years later, in 1945. Examined in isolation, without 20/20 hindsight, it gives no hint of its true meaning.[2,3] Admittedly, an order to climb the highest mountain in Imperial Japan could be construed as significant, but that connection is far clearer in retrospect. As Sigmund Freud is reported to have said, “Sometimes a cigar is just a cigar.”[4,5]
Distinguishing between such custom data and mass-market formats is important. Mass-market applications face a similar problem, but on a different level. Consider Event.doc, a sample document in a standard mass-market format, created using Microsoft Word:
Event.doc:
“Janson killed it.”
While the format is completely consistent with a specific version of the Microsoft Word document format, the meaning is far more obscure: This simple sentence could (non-exhaustively) mean:
Without fully understanding the context of the material, the precise meaning is unclear.
None of this affects the reliability or accuracy of in-house developed software. It does its job, presumably with an understanding of the precise recording conventions of the data. The problem occurs when looking at the stored data without a thorough (or with a mistaken) understanding of how it is used and what it means. The effect is similar to encountering an unfamiliar language: it may share elements with related languages, but that is no guarantee that such parallels hold reliably across the full breadth of the language.
Such questions frequently arise with regard to information systems. Business data stored in an organization's information systems can be vital in assessing any number of issues: accounting data (from revenues to expenditures) and precise times and locations for individual transactions are examples. Ensuring that this information is safeguarded and that access to it is preserved should be an important part of IT planning.
As I noted in my OpenVMS Consultant installment, this is a subtle yet very important point. Requirements to produce electronically stored information also create a concomitant need to both understand and preserve the context surrounding the information. The most reliable form of this information is not printed reports or their electronic analog. The raw data in the various files and databases used on a day-to-day basis in the normal course of business is far more detailed and accurate. An example is the difference between a normal mobile phone invoice and the so-called “tower log,” which indicates precisely which towers a cellular telephone used to complete a call.
The distinction is significant. Precisely this type of problem arose in one litigation matter where I was a consultant to the attorneys handling the case. The central question in assessing damages concerned the warranty claims recorded in a database. The warranty system was a series of custom programs written specifically to support the business' operations. The defendant in the case claimed that the warranty database was “unreliable.” I then performed a detailed review of the database records to determine the validity of the information stored in them. In the end, after much research, I was able to account for each of the phenomena that had raised questions, refuting the “unreliability” claims. I have been given to understand that my client was then able to negotiate a favorable settlement. Attorneys and information technologists need to cooperate to identify relevant data, and then take steps to ensure that both the raw data and the technological context needed to understand it are preserved in necessary completeness, with safeguards that protect all interests – those of the actual parties and of otherwise uninvolved third parties.
[1] | Shira A. Scheindlin, USDJ (2011, February 7) Opinion and Order, National Day Laborer Organizing Network v. U.S. Immigration and Customs Enforcement Agency, 10 Civ. 3488 |
[2] | Edwin Layton (1985) And I Was There... p. 242 |
[3] | Ibid., p. 528 |
[4] | Clifton Fadiman (1985) The Little, Brown Book of Anecdotes, Little, Brown and Company |
[5] | Ashton Applewhite, Tripp Evans, Andrew Frothingham (2003) And I Quote: The Definitive Collection of Quotes, Sayings, and Jokes for the Contemporary Speechmaker, Macmillan, p. 224 |
Reproduced from Electronic Discovery and Digital Forensics: The Applications Front an entry in Ruminations -- An IT Blog by Robert Gezelter. Copyright (c) 2011, Robert Gezelter. Unlimited Reproduction permitted with attribution.
The congressionally appointed Financial Crisis Inquiry Commission released a 535-page report on Thursday blaming the meltdown in part on compliance breakdowns and deficiencies.
The Commission concluded that this crisis was avoidable – the result of human actions, inactions, and misjudgments. Warnings were ignored.
"Despite the expressed view of many on Wall Street and in Washington that the crisis could not have been foreseen or avoided, there were warning signs. The greatest tragedy would be to accept the refrain that no one could have seen this coming and thus nothing could have been done. If we accept this notion, it will happen again," said Phil Angelides, Chairman of the Commission.
The report was signed by FCIC chairman and former California State Treasurer Phil Angelides, former Florida Governor and Senator Bob Graham, former Commodity Futures Trading Commission Chairman Brooksley Born, Byron Georgiou, Heather Murren and John Thompson. Three Republican appointees – vice chairman and former California Congressman Bill Thomas and former advisers to President George W. Bush, Keith Hennessey and Douglas Holtz-Eakin – released a 29-page dissent. A fourth GOP dissenter, Peter Wallison, a former Treasury Department general counsel and counsel to President Ronald Reagan, released a separate, 98-page opinion. Please click here for the complete report, including both dissents.
I would like to thank all of our members who provided feedback on the topics you want us to address over 2011. Through the beginning of 2011 we will be running events on PCI Compliance, Cloud Computing, Enterprise 2.0 Compliance, IT Risk Management and Enterprise GRC. Here is a preview:
Please continue to provide feedback in the comments to this post, or directly to me.
Thank you!
Cinthia Pilar
Production Director