Verizon’s acquisition of Yahoo

TechCrunch just reported that Verizon has acquired Yahoo for $4.83 billion. 

This definitely is a shocker, and I am sure many would agree with me. Not many of us expected Marissa Mayer to call it a day by dropping the ball so soon. 

One of the most important companies of the first dot-com boom, Yahoo, has reached the end of its life as an independent company. This deal represents a stunning decline for a company that was valued at more than $100 billion at its peak in 2000. 

Marissa’s roots as an engineer at Google definitely helped in improving the brand value with software programmers and technology users alike, and she did make an effort to beef up Yahoo’s technical talent. She instituted a rigorous recruitment process and worked hard at hiring computer scientists from some of the best universities. But there is little sign that these moves changed the culture at Yahoo or improved morale among the programmers working there. Yahoo always saw and projected itself as a “media company” and not a “technology company”. I am not sure this played out well for them, as the attempt to be a tech company and a media company at the same time resulted in an organisation that was less than the sum of its parts. 

I strongly believe that one reason Verizon was a strong contender is that they have done this before; Verizon acquired AOL, another struggling Internet company, last year. Like AOL, Yahoo makes a lot of money by creating Internet content and selling ads against it. So from Verizon’s perspective, this definitely looks like a logical step.

With respect to Mayer’s future at Yahoo, I am sure she is pursuing opportunities outside. The statement that Yahoo released about the deal, “Yahoo will be integrated with AOL under Marni Walden, EVP and President of the Product Innovation and New Businesses organisation at Verizon”, makes it evident that Marissa Mayer’s future lies outside of Yahoo. 

I wish her all the best, and am sure she will build something very interesting soon in the tech business.



Picture courtesy: TechCrunch.com

Cyber weapons and Nuclear weapons

A good essay pointing out the weird similarities between cyber weapons and nuclear weapons. 

On the surface, the analogy is compelling. Like nuclear weapons, the most powerful cyberweapons — malware capable of permanently damaging critical infrastructure and other key assets of society — are potentially catastrophically destructive, have short delivery times across vast distances, and are nearly impossible to defend against. Moreover, only the most technically competent of states appear capable of wielding cyberweapons to strategic effect right now, creating the temporary illusion of an exclusive cyber club. To some leaders who matured during the nuclear age, these tempting similarities and the pressing nature of the strategic cyberthreat provide firm justification to use nuclear deterrence strategies in cyberspace. Indeed, Cold War-style cyberdeterrence is one of the foundational cornerstones of the 2015 U.S. Department of Defense Cyber Strategy.

However, dive a little deeper and the analogy becomes decidedly less convincing. At the present time, strategic cyberweapons simply do not share the three main deterrent characteristics of nuclear weapons: the sheer destructiveness of a single weapon, the assuredness of that destruction, and a broad debate over the use of such weapons.

Questions to ask before you get your first Threat Intel data source

Anton Chuvakin (one of the leading Gartner experts in the Threat Detection space) recently published a blog post on some of the key questions one must ask before acquiring their first threat intel data source. 

Here is the list:

  • What is my primary motivation for getting TI, such as better threat detection, improved alert triage or IR support?
  • Where do I get my first threat intel source [likely, a network indicator feed, IP/DNS/URL]?
  • How do I pick the best one(s) for me?
  • Where do I put it, into what tool?
  • How do I actually make sure it will be useful in that tool?
  • What has to happen with the intelligence data in that tool, what correlation and analysis?
  • What specifically do I match TI against, which logs, traffic, alerts?
  • What do I do with the results of such matching? Who will see them? How fast?
  • How do I ensure that the results of matching are legitimate and useful?
  • What do I do with false or non-actionable matches?
  • How do I use intel to validate alerts produced by other tools?
  • Do I match TI to only current data or also to past log/traffic data? How far in the past do I go?
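One of the questions above, matching TI against logs, can be sketched in a few lines. This is a minimal illustration, not a reference implementation: the feed contents, log format and field layout here are all made up for the example.

```python
# A small "feed" of known-bad IP indicators (in practice, loaded from a
# threat intel provider; these addresses are documentation/example ranges).
bad_ips = {"203.0.113.7", "198.51.100.23"}

# Hypothetical firewall log entries: (timestamp, source IP, destination IP).
logs = [
    ("2016-07-25T10:00:01", "10.0.0.5", "203.0.113.7"),
    ("2016-07-25T10:00:02", "10.0.0.8", "93.184.216.34"),
    ("2016-07-25T10:00:03", "198.51.100.23", "10.0.0.5"),
]

def match_indicators(logs, indicators):
    """Return log entries whose source or destination IP is a known indicator."""
    return [entry for entry in logs
            if entry[1] in indicators or entry[2] in indicators]

hits = match_indicators(logs, bad_ips)
for ts, src, dst in hits:
    print(ts, src, "->", dst)
```

Even a toy like this surfaces the follow-on questions from the list: who sees these hits, how fast, and how far back in the log history the matching should run.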

The post is worth a read, as he has linked his earlier posts on this topic in it. Do note that the white papers he has linked require GTP access. 

On Tim Cook’s visit to India

This is the first time an Apple CEO has come to India. Steve Jobs had been here before, but that was when he was soul searching, and the India visit did play an important role in his life thereafter. 

Tim’s visit this week has been the most eventful and widely publicised visit by a large tech company CEO in the recent past. I am sure this is going to have a huge positive impact on Apple’s market in India, and be of great benefit to Apple product lovers and customers here. 

Here is an interesting interview with Tim by The Hindu. I especially liked his response to a common and obvious question:

Interviewer: Most of the billion people in India may not have heard about Apple. A few million would have heard and seen Apple products and only the minority few, who can afford it, would have actually used an Apple device. How would you as the CEO, explain what Apple is to this Indian audience?

Tim: Apple is about making the best products, we only create products that enrich peoples’ lives and in doing that we change the world in a positive way. That, in a simple way, is what Apple is about. Think of our products as tools to learn, teach; they empower people to do things they could not do otherwise. That’s our reason for being and that’s what drives us.

Development Centers in Bangalore and Hyderabad, and three Apple Stores, reportedly in Bangalore, Delhi and Mumbai, were among the interesting announcements. 

Interesting times ahead for Apple, consumers and entrepreneurs in India. 
Picture courtesy: dnaindia

A great list of curated Threat Intel resources

I recently found this Github Repo, put together by Herman Slatman, which consists of a list of very useful and curated Threat Intelligence resources.

The list is broken down into the following five categories:

  • Sources
  • Formats
  • Frameworks
  • Tools
  • Research, Standards & Books

This is a great resource for anybody starting to delve into Threat Intelligence discovery, consumption and classification, as it is an ocean out there, and a lot of these “Indicators” can be noise.
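Since many raw indicators are noise, a common first step before consuming any of these sources is to filter a feed against an allowlist of known-benign entries. A minimal sketch, where both the feed and the allowlist are invented for illustration:

```python
# Hypothetical raw indicator feed, mixed with benign, noisy entries.
raw_feed = ["evil.example.net", "google.com", "203.0.113.7", "8.8.8.8"]

# Allowlist of entries we never want to alert on (made up for this example).
allowlist = {"google.com", "8.8.8.8"}

def filter_noise(feed, allowlist):
    """Drop indicators that appear on the allowlist."""
    return [ioc for ioc in feed if ioc not in allowlist]

clean = filter_noise(raw_feed, allowlist)
print(clean)  # only the indicators worth matching remain
```

Real feeds need far more curation (ageing out stale indicators, scoring sources), but even this trivial filter cuts obvious false positives.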


Picture Courtesy: depositphotos.com

Live transcription of OpenVis Conference


OpenVis Conference is a two-day annual conference, held in Boston, about the practice of visualising data on the web. A must for all the Data Visualisation professionals amongst us.

This time, what is interesting is that they are live streaming the conference, in the form of a transcript, on their site.

The conference is being held today (Apr 25) and tomorrow, and there are some really interesting talks lined up. Some of these concepts have direct implications for Cyber/Information Security too.

I am hoping that the Presentations will be made available for people who couldn’t make it to the conference.

Interesting Data Science projects of 2015

Here is a list of some really interesting Data Science projects of 2015. Thanks to Jeff Leek from @simplystatistics for putting this together. 
Some of my picks from the list are:

* I’m excited about the new R Consortium and the idea of having more organizations that support folks in the R community.

* Emma Pierson’s blog and writeups in various national level news outlets continue to impress. I thought this one on changing the incentives for sexual assault surveys was particularly interesting/good.

* As usual Philip Guo was producing gold over on his blog. I appreciate this piece on twelve tips for data driven research.

* I am really excited about the new field of adaptive data analysis. Basically understanding how we can let people be “real data analysts” and still get reasonable estimates at the end of the day. This paper from Cynthia Dwork and co was one of the initial salvos that came out this year.

* Karl Broman’s post on why reproducibility is hard is a great introduction to the real issues in making data analyses reproducible.

* Datacamp incorporated Python into their platform. The idea of interactive education for R/Python/Data Science is a very cool one and has tons of potential.

Picture Courtesy: kdnuggets.com

Adopting OODA Loop in Intrusion Detection & Response – it’s more than speed

Here is a great post by Richard on the concept of using the OODA loop in Intrusion Detection and Response.

I have included some interesting lines here:

It is not absolute speed that counts; it is the relative tempo or a variety in rhythm that counts. Changing OODA speed becomes part of denying a pattern to be recognized…

The way to play the game of interaction and isolation is [for our side] to spontaneously generate new mental images that match up with an unfolding world of uncertainty and change…