Thursday, November 15, 2012

How Big Data Is Changing IT and Bringing Out The Vote

One of the most interesting stories to come out of the presidential election was the use of big data analytics to help campaign workers bring out the vote. Both candidates used the technology. Project Narwhal was the Obama team’s effort: it connected previously separate databases so that information on potential voters was accessible to campaign workers. They used the information to target voters with specific issues before the election, and during the election they used it to determine who had not yet voted and to do last-minute outreach.

Big Data Changes the Process
The angle that makes this fascinating is that the information was there before, but it wasn’t accessible in a comprehensive way. The available data was the result of many siloed data-gathering efforts, each used for a specific purpose over the years. The role that big data analytics played was in bringing that disparate information together. It meant that campaign workers could draw on sources such as volunteer-management programs, campaign finance and budgeting tools, voter-file interfaces, and especially social media to build a fuller picture of each voter and guide the campaign’s decisions. As a result, canvassers weren’t dispatched to knock on the doors of people who were already Obama supporters, and if a donor had given the maximum contribution, instead of another fundraising email they got an email asking them to volunteer. The sketch below shows the kind of targeting logic this makes possible.
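
To make that concrete, here is a minimal sketch in Python of rule-based targeting over a unified voter profile. The field names, thresholds, and rules are hypothetical illustrations, not the campaign’s actual logic.

```python
# Hypothetical sketch: choosing an outreach action from a unified voter
# profile. Field names and rules are illustrative, not the campaign's code.

FEDERAL_CONTRIBUTION_LIMIT = 2500  # 2012 per-election individual limit

def choose_outreach(profile: dict) -> str:
    """Pick the next outreach action for one merged voter record."""
    # Confirmed supporters don't need canvassing, only a turnout
    # reminder as the election approaches.
    if profile.get("support_score", 0) >= 0.9:
        return "turnout-reminder"

    # Donors who have maxed out can't legally give more, so ask
    # for their time instead of their money.
    if profile.get("total_donated", 0) >= FEDERAL_CONTRIBUTION_LIMIT:
        return "volunteer-ask"

    # Undecided voters get issue-specific messaging drawn from
    # interests inferred across the merged data sources.
    if profile.get("support_score", 0) < 0.5:
        return f"issue-mailer:{profile.get('top_issue', 'general')}"

    return "canvass"

# Example: a record merged from voter-file, finance, and social data.
voter = {"support_score": 0.95, "total_donated": 2500, "top_issue": "healthcare"}
print(choose_outreach(voter))  # -> turnout-reminder
```

The point isn’t the rules themselves, which are trivial; it’s that rules like these only become possible to write once the silos are merged into one profile.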

The way they did it was a lesson in the creative use of technology. Instead of hiring a consulting company, the Obama campaign pulled together a team of technologists who brought a range of skills and experience to get the job done. Using various programming languages and APIs, they built a set of services that acted as an interface to a single shared data store for their applications. This made it possible to develop new applications quickly and to integrate existing ones into the system. As a result they were able to create a dashboard that provided access to a set of tools that worked across all of their data sets and drove each step of the campaign process. The application included an analytics program called Dreamcatcher, developed to microtarget voters based on sentiment in social media text. Using this tool, they could determine which way the vote was going and where to focus their resources. For more information on the technology, see Built to Win: Deep Inside Obama's Campaign Tech.
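
As a rough illustration of that pattern, here is a toy sketch of lightweight services fronting a single shared store. The store, schema, and endpoints are invented for illustration; this is not the campaign’s actual API (the Built to Win article describes the real system).

```python
# Toy illustration of the pattern described above: lightweight services
# that expose one shared data store to many applications. The database,
# tables, and endpoints are invented for this sketch.
import sqlite3
from flask import Flask, jsonify

app = Flask(__name__)
DB = "campaign.db"  # the single shared data store

def query(sql, args=()):
    with sqlite3.connect(DB) as conn:
        conn.row_factory = sqlite3.Row
        return [dict(r) for r in conn.execute(sql, args)]

@app.route("/voters/<voter_id>")
def voter_profile(voter_id):
    # One endpoint joins what used to live in separate silos:
    # the voter file, donation history, and volunteer records.
    rows = query("""
        SELECT v.name, v.precinct, d.total_donated, vol.hours
        FROM voters v
        LEFT JOIN donations d ON d.voter_id = v.id
        LEFT JOIN volunteers vol ON vol.voter_id = v.id
        WHERE v.id = ?""", (voter_id,))
    return jsonify(rows[0] if rows else {})

@app.route("/turnout/<precinct>")
def turnout(precinct):
    # Election-day view: who in a precinct hasn't voted yet.
    return jsonify(query(
        "SELECT id, name FROM voters WHERE precinct = ? AND voted = 0",
        (precinct,)))

if __name__ == "__main__":
    app.run()
```

The design choice worth noting is that every application talks to the same store through the same services, so a new tool, whether a dashboard panel or a canvassing app, can be added without another round of data integration.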

Saturday, November 3, 2012

Architecting Your Network to Avoid a Disaster

Every company that does business over the web, a WAN, or through a data center must have a plan to protect connectivity and assets in case any data center component fails. The key is having a good business continuity and disaster recovery solution in place. Over the last few months I’ve been working on a solution for architecting the network for disaster avoidance and recovery. I delivered a webinar about this just days before Hurricane Sandy landed on the East Coast. It’s not often that a topic is so relevant. As I watched the news a couple of days later, it was shocking to see what had happened to people’s neighborhoods. My first reaction was to hope that rescue efforts were underway and that people would be safe. As the days went by, I wondered how people were doing at getting their businesses up and running again. I wrote about the need for good disaster planning in a blog post in early September, see link.

What We Have Learned From Our Conversations
At Juniper we have talked with many organizations about their networks and their business continuity situation, and I’d like to share some of what we have learned. Many organizations tell us that they have grown organically and also through acquisitions. As a result, they are concerned about inconsistent IT management policies. They have a range of applications inherited from the organizations they acquired, and since they are often in high-growth businesses and are adding applications for special projects, they see server and application sprawl. Often a new CIO will initiate a series of checks on the infrastructure to see how policies can be normalized and IT management streamlined. Sometimes the news of a natural disaster prompts a review of the BC/DR plan.

Challenges Confronting the Organization
What organizations often find is that they are confronted with a number of challenges. They might build infrastructure without clearly identifying their application needs, which results in loosely defined application SLAs instead of strict requirements with measurable metrics. Many deploy infrastructure in an ad hoc manner without consistent policies, and the result is many failure points, along with difficulty managing and provisioning the network. Poor link utilization, with links that sit idle much of the time, is another consequence. They often have a distributed authentication, authorization, and enforcement infrastructure, which leads to complex firewall policies that prevent user-specific enforcement and that are deployed based on local data center IT policies rather than global ones. These inconsistent policies for users and application access result in security holes. A sketch of what a strict, metric-based SLA might look like follows.
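
To show the contrast with loosely defined SLAs, here is a sketch of application SLAs captured as explicit, measurable requirements. The application names, thresholds, and fields are hypothetical examples, not a Juniper product or any customer’s actual targets.

```python
# Illustrative sketch: capturing application SLAs as explicit, measurable
# requirements rather than vague intent. All values are hypothetical.

APP_SLAS = {
    "order-entry": {
        "availability_pct": 99.99,   # max ~53 min downtime/year
        "rto_minutes": 15,           # recovery time objective
        "rpo_minutes": 5,            # recovery point objective (data loss)
        "max_latency_ms": 200,       # response time under normal load
        "failover_site": "dc-west",  # where the app recovers
    },
    "internal-wiki": {
        "availability_pct": 99.5,
        "rto_minutes": 240,
        "rpo_minutes": 60,
        "max_latency_ms": 1000,
        "failover_site": "dc-east",
    },
}

def check_availability(app_name: str, downtime_min_ytd: float,
                       minutes_elapsed_ytd: float) -> bool:
    """Report whether an app is still inside its availability budget."""
    sla = APP_SLAS[app_name]
    target = sla["availability_pct"] / 100
    actual = 1 - downtime_min_ytd / minutes_elapsed_ytd
    return actual >= target

# Example: an hour of downtime ten months into the year breaches
# a 99.99% budget (roughly 44 minutes allowed over that period).
print(check_availability("order-entry", 60, 10 * 30 * 24 * 60))  # -> False
```

Once requirements are written down this way, checking whether the infrastructure and the BC/DR plan can actually meet them becomes a mechanical exercise rather than a judgment call.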