Need a security expert? You've got to hire a coder!

As cybersecurity becomes more and more important to businesses, governments, and our personal lives, the need for good security engineers and researchers is growing at a rapid pace.

This is true whether one is working in an entry-level position or is already a senior researcher.

It is often said in the security industry that “It is easier to teach a developer about security than it is to teach a security researcher about development (coding).”

Information security professionals are used to seeing, experiencing and talking about failures in the industry. This usually leads them to assume that badly written (vulnerable) code is always the product of unskilled developers. But professionals who have never been exposed to software development, even at a small scale, do not have a fair understanding of the complex challenges developers face in writing secure code. And I think a security professional cannot be effective in designing detective and preventive security controls (tools, architectures, processes) if he or she does not appreciate those challenges.

Let me illustrate that with an example: code injection attacks against NoSQL databases versus SQL databases. Simply put, SQL and NoSQL databases both collect, organize and accept queries for information, and so both are exposed to malicious code injection. So, when NoSQL databases became popular, people were quick to predict that NoSQL injection would become as common as SQL injection. Though that is theoretically true, developers know that it’s not that simple.

If you take some time to understand NoSQL databases, you will quickly realize that there is a wide variety of query formats, from SQL-like queries (Cassandra), to JSON-based queries (MongoDB, DynamoDB), to assembly-like queries (Redis). So security recommendations and tools for a NoSQL environment have to be targeted at the specific database underneath, and your security testing tools must craft injection attacks in the format of that specific database. One cannot blindly recommend controls or preventive measures without understanding that the same vulnerabilities are not present on all platforms. Encoding recommendations for data will be specific to the database type as well. This OWASP article explains how one can test for NoSQL injection vulnerabilities.
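
To make that concrete, here is a minimal sketch of what injection looks like against a JSON-based query language such as MongoDB’s. The Flask endpoint, collection and field names are my own illustrative assumptions, not taken from the OWASP article:

```python
# A minimal sketch of NoSQL (MongoDB) operator injection, assuming Flask
# and PyMongo. Passwords are stored in plaintext purely to keep it short.

from flask import Flask, request, jsonify
from pymongo import MongoClient

app = Flask(__name__)
users = MongoClient()["appdb"]["users"]  # hypothetical database/collection

@app.route("/login", methods=["POST"])
def login():
    body = request.get_json()

    # VULNERABLE: forwarding client-supplied values straight into the query.
    # A request body of {"user": "admin", "password": {"$gt": ""}} matches
    # any password, because MongoDB evaluates the smuggled $gt operator.
    #
    #   match = users.find_one({"user": body["user"],
    #                           "password": body["password"]})

    # SAFER: reject non-string values, so operators like $gt or $ne can
    # never become part of the query document.
    user, password = body.get("user"), body.get("password")
    if not isinstance(user, str) or not isinstance(password, str):
        return jsonify(error="invalid input"), 400

    match = users.find_one({"user": user, "password": password})
    return jsonify(ok=match is not None)
```

Note that the attack payload carries no SQL syntax at all; a scanner that only throws ' OR 1=1 -- at this endpoint would report it as clean, which is exactly why testing tools must speak the specific database’s query format.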

This is the kind of knowledge one can only gain by digging deep into a subject and experimenting with technologies at a developer level. And so people with development backgrounds can often give better technical advice.

If one looks at the people leading security programs or initiatives at companies like Apple, Facebook, Google, and other large, successful tech companies, many of them are respected because they still keep their hands on the keyboard and speak from direct knowledge. They not only provide advice and research but also tools and techniques that empower others in the industry.

So to summarise: whether one is a newly graduated engineer, a senior security professional or a security researcher, one should never lose sight of the code, as that is where it all begins!

Picture courtesy: http://www.icd10forpt.com

Verizon’s acquisition of Yahoo

TechCrunch just reported that Verizon has acquired Yahoo for $4.83 billion. 

This definitely is a shocker, and I am sure many would agree with me. Not many of us were expecting Marissa Mayer to call it a day so soon.

One of the most important companies of the first dot-com boom, Yahoo, has reached the end of its life as an independent company. This deal represents a stunning decline for a company that was valued at more than $100 billion at its peak in 2000.

Marissa’s roots as an engineer at Google definitely helped improve the brand’s standing with software programmers and technology users alike, and she did make an effort to beef up Yahoo’s technical talent. She instituted a rigorous recruitment process and worked hard at hiring computer scientists from some of the best universities. But there is little sign that these moves changed the culture at Yahoo or improved morale among the programmers working there. Yahoo always saw and projected itself as a “media company”, not a “technology company”. I am not sure that played out well for them: the attempt to be a tech company and a media company at the same time resulted in an organisation that was less than the sum of its parts.

I strongly believe that one reason Verizon was a strong contender is that it has done this before: Verizon acquired AOL, another struggling Internet company, last year. Like AOL, Yahoo makes a lot of money by creating Internet content and selling ads against it. So from Verizon’s perspective, this definitely looks like a logical step.

With respect to Mayer’s future at Yahoo, I am sure she is pursuing opportunities outside. The statement Yahoo released about the deal (“Yahoo will be integrated with AOL under Marni Walden, EVP and President of the Product Innovation and New Businesses organisation at Verizon”) makes it evident that Marissa Mayer’s future lies outside of Yahoo.

I wish her all the best, and am sure she will build something very interesting soon in the tech business.



Picture courtesy: TechCrunch.com

Cyber weapons and Nuclear weapons

A good essay pointing out the weird similarities between cyber weapons and nuclear weapons. 

On the surface, the analogy is compelling. Like nuclear weapons, the most powerful cyberweapons — malware capable of permanently damaging critical infrastructure and other key assets of society — are potentially catastrophically destructive, have short delivery times across vast distances, and are nearly impossible to defend against. Moreover, only the most technically competent of states appear capable of wielding cyberweapons to strategic effect right now, creating the temporary illusion of an exclusive cyber club. To some leaders who matured during the nuclear age, these tempting similarities and the pressing nature of the strategic cyberthreat provide firm justification to use nuclear deterrence strategies in cyberspace. Indeed, Cold War-style cyberdeterrence is one of the foundational cornerstones of the 2015 U.S. Department of Defense Cyber Strategy.

However, dive a little deeper and the analogy becomes decidedly less convincing. At the present time, strategic cyberweapons simply do not share the three main deterrent characteristics of nuclear weapons: the sheer destructiveness of a single weapon, the assuredness of that destruction, and a broad debate over the use of such weapons.

Questions to ask before you get your first Threat Intel data source

Anton Chuvakin (one of the leading Gartner experts in the Threat Detection space) recently published a blog post on some of the key questions one must ask when identifying one's first Threat Intel data source.

Here is the list; a small code sketch of the “matching” questions follows after it:

  • What is my primary motivation for getting TI, such as better threat detection, improved alert triage or IR support?
  • Where do I get my first threat intel source [likely, a network indicator feed, IP/DNS/URL]?
  • How do I pick the best one(s) for me?
  • Where do I put it, into what tool?
  • How do I actually make sure it will be useful in that tool?
  • What has to happen with the intelligence data in that tool, what correlation and analysis?
  • What specifically do I match TI against, which logs, traffic, alerts?
  • What do I have to do with the results of such matching? Who will see them? How fast?
  • How do I assure that the results of matching are legitimate and useful?
  • What do I do with false or non-actionable matches?
  • How do I use intel to validate alerts produced by other tools?
  • Do I match TI to only current data or also to past log/traffic data? How far in the past do I go?
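
To ground the “matching” questions, here is a minimal sketch of matching a network indicator feed against logs. The one-IP-per-line feed format, the CSV log schema, the allowlist and the file names are all illustrative assumptions on my part, not anything Anton prescribes:

```python
# A toy indicator-matching pipeline: load a feed of known-bad IPs and
# match it against both current and past log entries.

import csv
from datetime import datetime, timedelta

ALLOWLIST = {"10.0.0.1"}  # known-good hits treated as non-actionable

def load_indicators(feed_path):
    """Load a simple network indicator feed: one IP address per line."""
    with open(feed_path) as f:
        return {line.strip() for line in f if line.strip()}

def match_logs(log_path, indicators, lookback_days=30):
    """Match indicators against log entries within a lookback window."""
    cutoff = datetime.utcnow() - timedelta(days=lookback_days)
    alerts = []
    with open(log_path) as f:
        for row in csv.DictReader(f):  # assumes columns: timestamp, dst_ip
            ts = datetime.fromisoformat(row["timestamp"])
            ip = row["dst_ip"]
            if ts >= cutoff and ip in indicators and ip not in ALLOWLIST:
                alerts.append((ts, ip))
    return alerts

if __name__ == "__main__":
    iocs = load_indicators("feed.txt")          # hypothetical feed file
    for ts, ip in match_logs("proxy_logs.csv", iocs):
        print(f"{ts.isoformat()} contacted known-bad IP {ip}")
```

Even this toy version forces answers to several of the questions above: how far back the lookback window goes, how non-actionable matches are suppressed, and who consumes the resulting alerts.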

The post is worth a read, as he has linked his earlier posts on this topic within it. Do note that the white papers he has linked require GTP access.

A great list of curated Threat Intel resources

I recently found this GitHub repo, put together by Herman Slatman, which is a curated list of very useful Threat Intelligence resources.

The list is broken down into the following five categories:

  • Sources
  • Formats
  • Frameworks
  • Tools
  • Research, Standards & Books

This is a great resource for anybody starting to delve into Threat Intelligence discovery, consumption and classification, as it is an ocean out there, and a lot of these “Indicators” can be noise.


Picture Courtesy: depositphotos.com

Interesting Data Science projects of 2015

Here is a list of some really interesting Data Science projects of 2015. Thanks to Jeff Leek from @simplystatistics for putting this together. 
Some of my picks from the list are:

* I’m excited about the new R Consortium and the idea of having more organizations that support folks in the R community.

* Emma Pierson’s blog and writeups in various national-level news outlets continue to impress. I thought this one on changing the incentives for sexual assault surveys was particularly interesting/good.

* As usual, Philip Guo was producing gold over on his blog. I appreciate this piece on twelve tips for data-driven research.

* I am really excited about the new field of adaptive data analysis. Basically understanding how we can let people be “real data analysts” and still get reasonable estimates at the end of the day. This paper from Cynthia Dwork and co was one of the initial salvos that came out this year.

* Karl Broman’s post on why reproducibility is hard is a great introduction to the real issues in making data analyses reproducible.

* Datacamp incorporated Python into their platform. The idea of interactive education for R/Python/Data Science is a very cool one and has tons of potential.

Picture Courtesy: kdnuggets.com

Adopting OODA Loop in Intrusion Detection & Response – it’s more than speed

Here is a great post by Richard Able on the concept of using the OODA loop in Intrusion Detection and Response.

I have included some interesting lines here:

It is not absolute speed that counts; it is the relative tempo or a variety in rhythm that counts. Changing OODA speed becomes part of denying a pattern to be recognized…

The way to play the game of interaction and isolation is [for our side] to spontaneously generate new mental images that match up with an unfolding world of uncertainty and change…

Why we must encrypt

Bruce Schneier recently wrote an interesting post on his blog about encryption. This one is targeted at the masses, so it touches on the basics. He starts off by introducing the fundamental reasons for using encryption, but he also highlights some interesting points about encryption as a concept.

If we only use encryption when we’re working with important data, then encryption signals that data’s importance. If only dissidents use encryption in a country, that country’s authorities have an easy way of identifying them. But if everyone uses it all of the time, encryption ceases to be a signal. No one can distinguish simple chatting from deeply private conversation. The government can’t tell the dissidents from the rest of the population. Every time you use encryption, you’re protecting someone who needs to use it to stay alive.

Encryption is the most powerful technological tool we have to protect our privacy against cyber adversaries and against the surveillance programs run by governments. And as Bruce also points out, encryption has never been more relevant than it is today, as countries like the US, UK, China and Russia are either talking about or implementing policies that limit strong encryption.
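
For readers who have never touched encryption directly, here is a minimal sketch of everyday symmetric encryption. It assumes Python’s third-party cryptography package and its Fernet recipe, which is my choice of tool, not something Bruce’s post prescribes:

```python
# A minimal sketch of "encrypt everything, even mundane chatter".
# Assumes the third-party cryptography package (pip install cryptography);
# Fernet is its high-level symmetric-encryption recipe.

from cryptography.fernet import Fernet

key = Fernet.generate_key()   # in real use, store and exchange this safely
f = Fernet(key)

# An unremarkable message; encrypting it anyway is the point of the post.
token = f.encrypt(b"lunch at noon?")
print(token)                  # opaque ciphertext that reveals nothing

assert f.decrypt(token) == b"lunch at noon?"
```

The ciphertext of a mundane lunch plan is indistinguishable from the ciphertext of a dissident’s message, which is precisely the “encryption ceases to be a signal” argument quoted above.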

Here is a report that is the result of a collaboration between Privacy International, ARTICLE 19, and the International Human Rights Clinic (IHRC) at Harvard Law School. It explores the impact of measures to restrict online encryption and anonymity in four countries: the United Kingdom, Morocco, Pakistan and South Korea. It is definitely worth a read.

Title Image courtesy: carra-lucia-ltd.co.uk

Harvesting Value from Open Data

On one side, we’re talking about Data Privacy, User Privacy, and the legality of Surveillance itself; at the same time, there is data that is supposed to be public information, easily accessible to human beings and to computers alike, so that value can be extracted from it.

To set the context for this topic, here is a very interesting and extremely powerful use case: a dashboard created by the Open Data Analytics company Appallicious, billed as a solution that pairs local disaster-response resources with open data and offers citizens real-time developments and status updates.

@jasonshueh has an interesting post on GovTech about the methods that could be used to harvest value out of Open Data repositories, for more such use cases.

The Sunlight Foundation is a Washington, D.C.-based non-profit advocacy group promoting open and transparent government. According to the foundation’s California Open Data Handbook, data must first be both “technically open” and “legally open” (a small sketch of what the former means in practice follows the definitions below):

  • Technically open: [data] available in a machine-readable standard format, which means it can be retrieved and meaningfully processed by a computer application.
  • Legally open: [data] explicitly licensed in a way that permits commercial and non-commercial use and re-use without restrictions.
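
As a small illustration of what “technically open” buys you, here is a sketch of a computer meaningfully processing an open dataset. The endpoint URL and the “neighborhood” column are hypothetical placeholders, not a real portal:

```python
# A tiny demonstration that machine-readable ("technically open") data can
# be retrieved and meaningfully processed by a program.

import csv
import io
import urllib.request

URL = "https://data.example.gov/permits.csv"  # hypothetical open-data endpoint

with urllib.request.urlopen(URL) as resp:
    reader = csv.DictReader(io.TextIOWrapper(resp, encoding="utf-8"))
    counts = {}
    for row in reader:  # count permits per neighborhood
        counts[row["neighborhood"]] = counts.get(row["neighborhood"], 0) + 1

for name, n in sorted(counts.items(), key=lambda kv: -kv[1]):
    print(f"{name}: {n}")
```

A scanned PDF of the same records would be “available” but not technically open: no program could aggregate it this way without manual transcription.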

I think Junar is doing some interesting work in this area. And I especially liked these lines from Diego May, co-founder and CEO of Junar, in the article:

What we see today is that the real innovation is not necessarily coming from hackathons, but now it’s about working with companies or entrepreneurs to solve problems

The University of Massachusetts Boston and the Fraunhofer Society in Berlin are also doing some great research in this space.

Open Data Analytics, and the relevance of Security within it, is going to be one of the most interesting areas in the Data Analytics space.