Why we must encrypt

Bruce Schneier recently wrote an interesting post on his blog about encryption. This one is targeted at the masses, so it touches upon the basics. He starts off by introducing the fundamental reasons for using encryption, but he also highlights some interesting facts about the concept of encryption.

If we only use encryption when we’re working with important data, then encryption signals that data’s importance. If only dissidents use encryption in a country, that country’s authorities have an easy way of identifying them. But if everyone uses it all of the time, encryption ceases to be a signal. No one can distinguish simple chatting from deeply private conversation. The government can’t tell the dissidents from the rest of the population. Every time you use encryption, you’re protecting someone who needs to use it to stay alive.

Encryption is the most powerful technological tool we have to protect our privacy against cyber adversaries, and also against the surveillance programs run by governments. And as Bruce also points out, encryption has never been more relevant than in today’s world, as countries like the US, UK, China and Russia are either talking about or implementing policies that limit strong encryption.

Here is a report which is the result of a collaboration between Privacy International, ARTICLE 19, and the International Human Rights Clinic (IHRC) at Harvard Law School. It explores the impact of measures to restrict online encryption and anonymity in four particular countries – the United Kingdom, Morocco, Pakistan and South Korea. It is definitely worth a read.

Title Image courtesy: carra-lucia-ltd.co.uk

Harvesting Value from Open Data

On one side, we’re talking about data privacy, user privacy, and the legality of surveillance itself; at the same time, there is data that is supposed to be public information, easily accessible by human beings, and also by computers, to process and extract value from.

To set the context for this whole topic, here is a very interesting and powerful use case: a dashboard created by the open data analytics company Appallicious, which is being billed as a solution that pairs local disaster response resources with open data, and offers citizens real-time developments and status updates.

@jasonshueh has an interesting post on GovTech about the methods that could be used to harvest value out of Open Data repositories, for more such use cases.

Sunlight Foundation is a Washington, D.C. based non-profit advocacy group promoting open and transparent government. According to the foundation’s California Open Data Handbook, data must first be both “technically open” and “legally open.”

  • Technically open: [data] available in a machine-readable standard format, which means it can be retrieved and meaningfully processed by a computer application.
  • Legally open: [data] explicitly licensed in a way that permits commercial and non-commercial use and re-use without restrictions.
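As a minimal illustration of what “technically open” means in practice, the sketch below parses a machine-readable (JSON) open-data record and computes something useful from it. The dataset and field names are invented for illustration, not taken from any real portal.

```python
import json

# A hypothetical open-data record, as a city data portal might serve it.
raw = """
[
  {"facility": "Shelter A", "capacity": 120, "status": "open"},
  {"facility": "Shelter B", "capacity": 80,  "status": "closed"}
]
"""

records = json.loads(raw)

# Because the format is machine-readable, a program can meaningfully
# process it -- e.g., total capacity of currently open facilities.
open_capacity = sum(r["capacity"] for r in records if r["status"] == "open")
print(open_capacity)  # 120
```

The same data published as a scanned PDF would carry identical information but fail the “technically open” test, since no program could reliably compute with it.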

I think Junar is doing some interesting work in this area. And I especially liked these lines by Diego May, co-founder and CEO of Junar, in the article:

What we see today is that the real innovation is not necessarily coming from hackathons, but now it’s about working with companies or entrepreneurs to solve problems

The University of Massachusetts Boston and the Fraunhofer Society in Berlin are also doing some great research in this space.

Open Data Analytics, and the relevance of security within it, is going to be one of the most interesting areas in the data analytics space.

NetFlow-based security tool for Incident Response

Charles Herring of Lancope has a short but interesting post on how NetFlow data can be leveraged for Incident Response purposes.

He says

The collection and analysis of network metadata, such as NetFlow, is an effective way to identify advanced attacks, insider threats or data exfiltration.

There are three major features/activities required by an effective NetFlow management tool:

  • Deduplicate the flow to remove redundant information
  • Directionality to determine the relationship between flow endpoints
  • Robust Querying capabilities
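The three features above can be sketched roughly as follows. The flow-record layout, the bidirectional dedup key, and the port-based directionality heuristic are simplified assumptions for illustration, not Lancope’s actual data model.

```python
from collections import namedtuple

# Simplified flow record; real NetFlow v5/v9 records carry many more fields.
Flow = namedtuple("Flow", "src dst src_port dst_port proto bytes")

def dedup_key(f):
    # Bidirectional key: the same conversation reported by two exporters,
    # or seen in both directions, collapses to a single entry.
    a = (f.src, f.src_port)
    b = (f.dst, f.dst_port)
    return (min(a, b), max(a, b), f.proto)

def deduplicate(flows):
    seen = {}
    for f in flows:
        seen.setdefault(dedup_key(f), f)  # keep the first copy, drop repeats
    return list(seen.values())

def direction(f, well_known=1024):
    # Crude directionality heuristic: traffic toward a well-known port is
    # treated as going from the client (initiator) to the server.
    return "client->server" if f.dst_port < well_known else "server->client"

def query(flows, predicate):
    # Robust querying reduces, in this sketch, to filtering by predicate.
    return [f for f in flows if predicate(f)]

flows = [
    Flow("10.0.0.5", "93.184.216.34", 49152, 443, "tcp", 5000),
    Flow("93.184.216.34", "10.0.0.5", 443, 49152, "tcp", 5000),  # same conversation
]
unique = deduplicate(flows)
print(len(unique), direction(unique[0]))  # 1 client->server
```

A production tool would do all of this at line rate over millions of flows per minute, but the logical operations are the same.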

There is a Part 2 coming up soon, which will focus on the Analytics aspects of this.

Title Image courtesy: jimjansen.blogspot.com

Identifying actionable threat intelligence

Ran Mosessco from Websense Security Labs has a very interesting post on solving a key issue every security analyst in a SOC (Security Operations Center) faces – the overwhelming number of security alerts (even after correlation), also called attack indicators, that an analyst has to acknowledge and investigate.

Actionable threat intelligence is buried deep within terabytes of seemingly interesting but irrelevant data. Plausible deniability, false positives, lack of traceability and attribution, skillful attackers, adaptation of warfare techniques, and the like only add to the confusion. How does one bubble up prioritized, actionable threat intelligence in an automated fashion from the depths of the data morass?

This approach is still at a nascent stage; it requires further study, and we still need to come up with an implementable solution. But I think this is a good place to start, and the following lines capture the way forward accurately:

With attacks becoming more advanced and sophisticated each day, combining big data engineering, unsupervised machine learning, global threat intelligence and cybersecurity know-how is required to deal with them in a timely, automated and efficient manner.
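One very simplified way to think about “bubbling up” prioritized indicators in an automated fashion is to score each alert on several weighted signals and surface the highest-scoring ones. The features and weights below are illustrative assumptions only, not Websense’s method.

```python
# Illustrative alert-scoring sketch: combine weighted signals into one
# priority score, then rank alerts so analysts see the worst first.
WEIGHTS = {
    "severity": 0.4,           # vendor-assigned severity, normalized to 0..1
    "asset_criticality": 0.3,  # how important the targeted asset is
    "intel_match": 0.3,        # 1.0 if a threat-intel feed matched the indicator
}

def score(alert):
    return sum(WEIGHTS[k] * alert[k] for k in WEIGHTS)

alerts = [
    {"id": "A1", "severity": 0.9, "asset_criticality": 0.2, "intel_match": 0.0},
    {"id": "A2", "severity": 0.5, "asset_criticality": 1.0, "intel_match": 1.0},
]
ranked = sorted(alerts, key=score, reverse=True)
print([a["id"] for a in ranked])  # ['A2', 'A1']
```

Note how the lower-severity alert A2 outranks A1 once asset criticality and a threat-intel match are factored in; replacing these hand-tuned weights with unsupervised machine learning over global telemetry is essentially what the quoted passage argues for.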


This topic is one of my key focus areas professionally, and so I will be writing more about it here. 

Title Image credit: communities.websense.com

Analysis of China-based APT “Deputy Dog” by FireEye and Microsoft TI teams

FireEye has just released an interesting report on the obfuscation techniques used by the China-based APT “Deputy Dog”. The FireEye TI (Threat Intelligence) team reportedly found suspicious activity on Microsoft’s TechNet site early last year, which appeared to be related to BLACKCOFFEE, a malware supposedly employed by the same group in China.

In late 2014, FireEye Threat Intelligence and the Microsoft Threat Intelligence Center discovered a new Command-and-Control (CnC) obfuscation tactic on Microsoft’s TechNet web portal—a valuable web resource for IT professionals.

The threat group took advantage of the ability to create profiles and post in forums to post encoded C2 for use with a variant of the malware BLACKCOFFEE. This technique can make it difficult for network security professionals to determine the true location of the CnC, and allow the CnC infrastructure to remain active for a longer period of time. TechNet’s security was in no way compromised by this tactic.

Here is a representation of the technique by the FireEye team:


This is a really smart way for the attacker to fetch and use the C&C IP address. Detecting this communication is going to be tricky, so we can expect adversaries to use such obfuscation techniques more often.
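To make the technique concrete, the sketch below mimics the general pattern FireEye describes: the malware fetches a benign-looking page, finds content between two magic markers, and decodes the C2 address from it. The marker strings, the base64 encoding, and the page text here are all invented for illustration; the real BLACKCOFFEE variant uses its own markers and encoding.

```python
import base64
import re

# Hypothetical forum-profile text hiding an encoded C2 address between
# two magic markers (markers and encoding invented for illustration).
page = "Just sharing some tips... @MICR0S0FT MTAuMjAuMzAuNDA= C0RP0RATI0N ...thanks!"

def extract_c2(html, start="@MICR0S0FT", end="C0RP0RATI0N"):
    # Locate the token sitting between the start and end markers.
    m = re.search(re.escape(start) + r"\s*(\S+)\s*" + re.escape(end), html)
    if not m:
        return None
    return base64.b64decode(m.group(1)).decode()

print(extract_c2(page))  # 10.20.30.40
```

From a defender’s perspective the request looks like ordinary browsing of a legitimate, high-reputation site, which is exactly why domain- and IP-reputation blocking fails against this pattern.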

The FireEye team has also shared the indicators of compromise for this on GitHub, which will come in very handy for tuning our detection rules.

iPhone to become a key tool in genetic studies 

@AntonioRegalado of MIT Tech Review reports this. 

This is an interesting move by Apple, and I see ResearchKit playing a key role in the healthcare industry’s adoption of Technology to spearhead research initiatives.  

Apple is collaborating with U.S. researchers to launch apps that would offer some iPhone owners the chance to get their DNA tested, many of them for the first time, according to people familiar with the plans.

The apps are based on ResearchKit, a software platform Apple introduced in March that helps hospitals or scientists run medical studies on iPhones by collecting data from the devices’ sensors or through surveys.

The importance of security in IoT

Wikipedia’s definition of IoT is:

The Internet of Things (IoT) is the network of physical objects or “things” embedded with electronics, software, sensors and connectivity to enable it to achieve greater value and service by exchanging data with the manufacturer, operator and/or other connected devices. Each thing is uniquely identifiable through its embedded computing system but is able to interoperate within the existing Internet infrastructure.

To put it in even simpler words, IoT depicts a world where objects communicate seamlessly with each other, and with humans too.

IoT is such an important area of focus today that there is even a search engine for IoT, found here, which provides a geographical index of where things (IoT) are, who owns them, and how and why they are used.

The below graph (Courtesy Verizon DBIR 2015) shows the scale of growth of IoT devices in the next five years.

B2B Internet of Things connections, 2011 to 2020 (forecast)

There was a funny definition of “Big Data” trending on Twitter recently, and I found it to be quite true. Big Data has become one of the most popular terms used by IT professionals, businesses, product companies and individuals who have anything to do with data or information. But only a few actually understand the concept and use the relevant tools in the right places; product companies have been using “Big Data” as a key piece of marketing jargon.

Similarly, “IoT” is becoming one of the most widely used terms in the tech and non-tech industries. There are conferences held on IoT, there are marketing initiatives running in full swing in this domain, and every company is in a rush to introduce products in this category.

The following infographic captures the already prevalent impact of IoT on our lives (Image source: http://cdn.xenlife.com.au):

Impact of IoT in our daily lives

But very few people, companies and institutions are actually spending the time and effort to understand the big picture: studying and discussing the larger implications of IoT for the industry, our daily lives, and our society as a whole, and building products and solutions around them.

The International Journal of Computer Trends and Technology is one such institution that has been doing research in this area. Their paper An Algorithmic Framework Security Model for Internet of Things is definitely worth a read; it describes one of the approaches that can be used to understand and implement IoT technologies without affecting the security, privacy and integrity of information.

These lines set the context for the whole situation, and the paper:

The biggest role researchers are obliged to undertake is to find and advance the best algorithms for enhancing secure use of Internet of Things especially cutting across different application environments.

The basis of coming up with a security model for Internet of things (IoT) is on the understanding of the source of concern from the functionality modalities of Internet of Things. The functional modalities hereby refer to the different application environments where IoT are applicable, such as health, agriculture, retail, transport and communication, the environments both virtual and physical as well as many other potential areas of application depending on classifications employed at the point of discussions at hand.

Given also the possibilities that IoT have, to extend beyond present applications, especially enabled by emerging technologies in mobile and wireless computing, the scope of concerns from such a web of connectivity, should not be focused in defined areas but should have a broader scope.

The paper handles this issue in the following order:
  1. A world with IoT in place
  2. Problems with the situation
  3. Where should security start – the modalities involved – Lampson’s Access Matrix
  4. Augmented Approach Model for IoT Security – theoretical design
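Lampson’s access matrix, referenced in step 3, maps each (subject, object) pair to the set of rights that subject holds on that object. The sketch below shows the idea with invented IoT-flavoured subjects and objects; the paper’s actual model augments this basic structure.

```python
# Lampson's access matrix as a dict keyed by (subject, object);
# the IoT subjects and objects here are illustrative examples only.
matrix = {
    ("thermostat_app", "thermostat"):  {"read", "write"},
    ("guest_app",      "thermostat"):  {"read"},
    ("door_cam",       "cloud_store"): {"write"},
}

def allowed(subject, obj, right):
    # A request is granted only if the right appears in the matrix cell;
    # unknown (subject, object) pairs default to the empty set: deny all.
    return right in matrix.get((subject, obj), set())

print(allowed("guest_app", "thermostat", "write"))       # False
print(allowed("thermostat_app", "thermostat", "write"))  # True
```

The default-deny behaviour for unlisted pairs is the property that matters most in an IoT setting, where new and possibly untrusted devices join the network continuously.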

The AAM is a good place to start; however, an area that will require further research is how the interaction between the augmented IoT applications can be controlled, because code from numerous and possibly untrusted users and applications will be placed in the same security domain, which raises security and integrity concerns.

IoT security is a vast topic, and this is just the tip of the iceberg, with a lot of nuances still unknown to us. I shall be writing more about this topic. There is no doubt about the potential of IoT in our lives; it is going to be one of humanity’s biggest creations in this century. For us to realise its true potential, we must learn from our mistakes of the last two decades of developing software without considering security as a design principle; the numerous cyber security breaches of recent times, and the incident reports they produced, are indicators of the impact of this lack of a security-minded approach. And the repercussions of security compromises in IoT technologies can be far reaching, as IoT touches various levels of our social, economic and political lives.

Here is a picture showing one such scenario (Image source: http://spectrum.ieee.org/)

IoT: We can’t hide

IoT is the future of technology beyond 2020, and it is one of the key tools to realise the United Nations Millennium Development Goals; building security principles into IoT technologies is going to be instrumental to its usefulness to humanity.


Title Image courtesy: http://www.cmswire.com

Microsoft to end Patch Tuesday fixes

Microsoft recently showed, during their Ignite 2015 conference, some of the new security mechanisms embedded in Windows 10, which also mean a change in the software update cycles, reports @iainthomson of The Register.

Terry Myerson, head of the Windows Operating System division, took a shot at Google’s approach (or lack thereof) in his keynote last week:

Google takes no responsibility to update customer devices, and refuses to take responsibility to update their devices, leaving end users and businesses increasingly exposed every day they use an Android device.

Google ships a big pile of [pause for effect] code, with no commitment to update your device.

The article reports:

Myerson promised that with the new version of Windows, Microsoft will release security updates to PCs, tablets and phones 24/7, as well as pushing other software “innovations,” effectively putting an end to the need for a Patch Tuesday once a month.

And,

On the data protection side, Brad Anderson, veep of enterprise client and mobility, showed off a new feature in preview builds today: Microsoft’s Advanced Threat Analytics (ATA). This tries to sense the presence of malware in a network, and locks down apps to prevent sensitive data being copied within a device…

Using Azure, administrators can choose to embed metadata in files so that managers can see who read what document, when, and where from. If a particular user is trying to access files they shouldn’t, an alert system will let the IT manager know.

Controls like these have been around for some time, but most of them were implemented through third-party products, so it is interesting to see Microsoft building these capabilities into the operating system itself.

Microsoft’s decision to release patches whenever they are ready is definitely a move in the right direction, and is in line with what Apple has been doing with Mac OS for quite some time.

Title Image Courtesy: blog.kaspersky.com