Sorensen Consulting


Sorensen Blog

Insights, musings and calls to action around the network

Friday, January 26, 2018

What Do We Need to Do to Address the Cybersecurity Expertise Shortage?

Reprinted with permission from Cloud Harmonics 

My last blog looked at the complex, dynamic cybersecurity landscape that makes it very difficult for someone to step into a cybersecurity role and succeed.  If we are to truly start to address the cybersecurity skills gap, we need to make it easier for someone to see, understand and shut down attacks – this requires a combination of technologies, services and experiential/educational components:


Technologies

More than half of respondents (55%) to a survey by Intel Security “believe cyber-security technologies will evolve to help close the skills gap within five years.” Likely this will come in the form of advances in more autonomous cybersecurity. The US Department of Homeland Security painted a picture of what this might look like, back in 2011, in the paper “Enabling Distributed Security in Cyberspace.” They described an ecosystem where “cyber participants, including cyber devices, are able to work together in near-real time to anticipate and prevent cyberattacks, limit the spread of attacks across participating devices, minimize the consequences of attacks, and recover to a trusted state.”

This is in contrast to the typical cybersecurity landscape today – in which an organization has a host of different cybersecurity technologies to try to protect all their different users, systems/devices and workflows, many of which they are blind to (e.g. cloud applications) or have no control over (e.g. personal devices). Each device requires a cyber analyst to not only deploy and manage it, but also interpret the information it produces and try to link it to other data to make sense of what is happening. Often analysts are siloed, responsible for protecting one part of the network or managing one type of solution, making it hard to get access to everything they need to see the bigger, complete picture. Automation and orchestration can help bring all this information together to start to alleviate these problems.

As autonomous cars and drones have grown in popularity, so have more autonomous security measures, which are better able to keep pace with the automation being employed by hackers to launch their attacks. We have seen vendors increasingly leverage artificial intelligence (AI), machine learning, orchestration and automation in an effort to accelerate an organization’s ability to identify and respond to changing cybersecurity needs. These measures can dramatically simplify the deployment and ongoing management of the security infrastructure, particularly for those elements that are manually-intensive or lend themselves to ‘black and white’ decisions (e.g. when entities or events can be easily incriminated or exonerated).

For example, a large organization can average close to 17,000 alerts a week, and only one in five ends up being a genuine incident. Investigating each and every alert isn’t practical or an effective use of resources; a solution (e.g. incident response/analytics) that can automate investigations, so analysts can quickly understand what’s going on and prioritize their activities, is sustainable. Hence, we have seen an explosion in the IR automation market – the Enterprise Strategy Group found that 56% of enterprise organizations “are already taking action to automate and orchestrate incident response processes;” Technavio has the IR system market growing at a compound annual growth rate (CAGR) of 13%.
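To make that concrete, here is a minimal sketch of the kind of triage automation such a solution performs – enrich and score raw alerts, then surface only the most likely incidents for analysts. All field names and weights below are invented for illustration, not taken from any specific product:

```python
# Toy alert-triage sketch: score and rank raw alerts so analysts
# investigate the most likely incidents first. Field names and
# weights are illustrative only.

def score_alert(alert):
    score = 0
    if alert.get("severity") == "high":
        score += 50
    if alert.get("asset_criticality") == "critical":
        score += 30
    if alert.get("matches_threat_intel"):  # e.g. known-bad IP or file hash
        score += 40
    if alert.get("repeat_count", 0) > 5:   # same indicator seen repeatedly
        score += 10
    return score

def triage(alerts, budget=10):
    """Return only the top `budget` alerts, highest score first."""
    return sorted(alerts, key=score_alert, reverse=True)[:budget]

alerts = [
    {"id": 1, "severity": "low"},
    {"id": 2, "severity": "high", "matches_threat_intel": True},
    {"id": 3, "severity": "high", "asset_criticality": "critical"},
]
for a in triage(alerts, budget=2):
    print(a["id"], score_alert(a))
```

The point isn’t the scoring scheme itself – real products use far richer enrichment – but that the ranking turns 17,000 raw alerts into a short, prioritized work queue.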

Other cybersecurity market segments and vendors are recognizing the need for automation/orchestration/machine learning/AI to address the skills gap. Palo Alto Networks’ latest release (8.0) of their platform includes a number of capabilities that improve the efficiency of, and coordination across, the cybersecurity infrastructure (see our blog, xxxxx). Our colleagues at SecureDynamics have told us they’ve experienced an uptick in demand for their Rule Migration tool, which automates the translation of legacy firewall policies to next-generation application-based rule sets. There are also open source projects, such as MineMeld, that show how organizations can potentially use external threat feeds to support self-configuring security policies.
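To illustrate the MineMeld idea – external threat intelligence driving policy – here is a toy sketch: aggregate indicator feeds, de-duplicate them, and emit block rules. The feed contents and the rule syntax are hypothetical:

```python
# Toy sketch of threat-feed-driven policy: merge indicator feeds,
# de-duplicate them, and render (hypothetical) deny rules.

def aggregate_feeds(feeds):
    """Union several indicator feeds into one de-duplicated set."""
    indicators = set()
    for feed in feeds:
        indicators.update(feed)
    return indicators

def to_block_rules(indicators):
    """Render each bad IP as an illustrative deny rule."""
    return [f"deny ip from {ip} to any" for ip in sorted(indicators)]

feed_a = ["203.0.113.7", "198.51.100.22"]   # e.g. an open source feed
feed_b = ["198.51.100.22", "192.0.2.99"]    # e.g. a commercial feed

rules = to_block_rules(aggregate_feeds([feed_a, feed_b]))
for r in rules:
    print(r)
```

A real pipeline would also age indicators out and push the rules to enforcement points via an API, but the self-configuring loop is the same shape.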

To truly ease the burden on cybersecurity analysts and improve the efficiency and productivity of the cybersecurity infrastructure, we need more of these kinds of innovations and automations. 


Services

The reality is there are always times when organizations, even those with SOCs that are skilled and staffed appropriately, may need a little help. This is where services come in; we are finding greater acceptance that augmenting resources with a service offering can be a good way to enhance the effectiveness of an organization’s cybersecurity strategy and implementation. An outsider’s view can give organizations the knowledge they need, a fresh perspective or a new way of thinking that helps drive better decision-making and, ultimately, better security.

The problem is managed security services providers (MSSPs) are having to staff up themselves to meet the demand. Research and Markets predicted the MSSP sector will reach $31.9 billion by 2019, with a CAGR of 17.3% – and this may be low, considering a new report by MarketsandMarkets puts the incident response services market, just one segment within the overall MSSP market, at $30.29 billion by 2021, with a CAGR of 18.3%.

To address the demand and protect against the ever-expanding threat landscape, these MSSPs have to build (or acquire) the talent – which is why we’ve seen a lot of movement in this space (e.g. FireEye’s acquisition of Mandiant, IBM’s acquisition of Lighthouse Security Group LLC, and BAE Systems’ acquisition of SilverSky). Ultimately, to deliver the experience and know-how organizations need, these providers run into the same cybersecurity skills gap.

Educational Opportunities

Nothing replaces the knowledge and expertise of a security analyst when it comes to identifying, containing and fully remediating an incident. Unfortunately, as we’ve already mentioned, these folks are in short supply, so organizations need to develop this talent in-house. In a SANS survey, 73% of organizations indicated “they intend to plan training and staff certifications in the next 12 months.”

But what kind of training do they need and what kinds of skills do they need to build? Given the aforementioned breadth of threats, threat actors, systems/devices and workflows that could be involved in a cyber incident, it’s hard to create a concrete list of things to do or know. One approach might focus on the layer they are trying to secure – e.g. network, endpoint, application, server, data, cloud, etc.; another might look at more general areas – e.g. intrusion detection, secure software development, risk mitigation, forensics, compliance, monitoring, identity management, etc. The reality is an organization needs to cover all these bases.

This is probably why half the companies in the “Hacking the Skills Shortage” study said they would like to see a bachelor’s degree in a relevant technical area. This gives analysts a general background that can be built upon to develop the deeper, relevant knowledge needed to better protect an organization’s specific environment.

The most effective skill building comes from real-world experience. I’m reminded of the Benjamin Franklin quote, “Tell me and I forget, teach me and I may remember, involve me and I learn.” We have seen higher education institutions re-thinking the way they structure their learning to be much more hands-on and interactive. Jelena Kovacevic, head of the electrical and computer engineering department at Carnegie Mellon University, explained to U.S. News, "At the center of meeting today's challenges is an age-old idea: Learn by making, doing and experimenting. We can do this by imbuing real-world problems into our curricula through projects, internships and collaboration with companies."

Not only seeing, but doing hacks firsthand is one of the best ways for individuals to start to identify, understand, and ultimately stop them. Accordingly, 68% of respondents to the “Hacking the Skills Shortage” study said hacking competitions are a good way for individuals to develop critical cybersecurity skills.

We, at Cloud Harmonics, have seen the difference that doing, versus hearing or watching, has on a person’s understanding. We developed our proprietary learning environment, Orchestra, to give attendees (we train more than 4,000 users every year) the opportunity to interact not only with the instructors who are leading the sessions, but also with the solutions themselves. Our virtual sandbox (vSandbox) and Ultimate Test Drive (UTD) days give attendees real-world experience with solutions, in a way that enables them to see firsthand how they could deploy, use and benefit from their capabilities in their own environment.

Because there is really no substitute for experiential learning, we expect to see more users signing up to test and work with solutions in a safe environment to speed their deployment and use of advanced features in their own organization. Ultimately, addressing the cybersecurity gap will take a confluence of technologies, services and experiential learning to build the skills and capabilities organizations need to keep up with (and ideally get ahead of) all the threats targeting them.


9:13 am pst

Why Things Have to Change Before We Can Make a Dent in the Cybersecurity Shortage

Reprinted with permission from Cloud Harmonics

Reflecting on the time I recently spent with some of our sales engineers, I was reminded that one of the biggest issues faced by most of the end-user organizations we work with (through our value added reseller (VAR) partners) is a lack of cybersecurity expertise. Organizations simply can’t recruit or retain all the talent they need to mount an effective defense against all the different threats they are facing.

We’ve all seen the stats – 82% of IT professionals report a lack of cybersecurity skills within their organization; more than 30% of cybersecurity openings in the U.S. go unfilled every year; by 2019, there will be one to two million jobs unfilled in the global cybersecurity workforce.

So, why aren’t more people flocking to cybersecurity? Particularly when cybersecurity professionals are being heralded as one of the job market’s hottest commodities, in a cybersecurity market that experts predict will grow to $170 billion by 2020? I think, to state the obvious, it’s because cybersecurity is hard, and only getting harder.

Cybersecurity experts have to stay on top of all the new threats facing their organization. That’s no small task, considering: 

Cybersecurity experts also have to stay on top of the ever-growing number of highly skilled hackers targeting their organization, all of whom have different, yet extremely persistent motivations, such as: 

In addition, cybersecurity experts have to try to identify and shut down all the different vulnerabilities (and ways attackers can get “in”) throughout their organization. The universe of attack vectors is exploding, as organizations increasingly rely on:

Cybersecurity experts have to deploy, manage and maintain a range of different cybersecurity technologies to try to protect against all the threats and attackers targeting their organization. They need to monitor, identify and shut down the attack’s ability to exploit all the different attack vectors that potentially exist.

As with everything in cybersecurity, determining what needs to be implemented to defend the ongoing operations of their business and the integrity and privacy of their critical assets is anything but simple. There were almost 600 vendors exhibiting at this year’s RSA and close to 250 startups doing things in and around the event. Almost all have marketing messages that make seemingly indistinguishable claims, offering overlapping capabilities that make the marketplace complex and confusing.

It’s hard for even seasoned cybersecurity professionals to navigate, so how do we expect someone entering the field to get up to speed on everything? How do we expect them to be able to identify all the different vulnerabilities, threats and actors they could come up against? How do we expect them to learn how to use all these different systems and figure out what to do?

The simple answer is we can’t expect them to do these things until we show them how to do them. If we are to address the cybersecurity shortage and recruit and retain vital cybersecurity personnel, we are going to have to change our expectations and adjust our approach. If we don’t, the cybersecurity skills gap is only going to get wider. For my thoughts on what these expectations should look like and what the approach should be to develop new talent to start to better address the skills shortage, check out part 2 of this blog series…   

9:10 am pst

Monday, April 1, 2013

Top Open Source SDN Projects to Keep Your Eyes On

Interest and momentum around OpenFlow and software defined networking (SDN) has certainly been accelerating. I think people are so excited about SDN because, while we have seen a lot of innovation around the network – in the wireless space, the data center, and all the applications – there has been very little innovation in the network itself – the routers and switches – within the last decade. The prospect of completely re-architecting the network, by separating the control plane from the data plane, opens up a lot of new possibilities.

With SDNs, organizations aren’t constrained by how the network is built. They are free to build a dynamic, fluid infrastructure that can support fluctuating demands, shorter implementation cycles (check out Stanford’s Mininet), and completely new business models. But, as I have mentioned before, we are just at the beginning. While those of us watching this space have been impressed by the rapid pace of innovation within SDNs to date, it’s hard to predict what’s going to happen next. But that won’t stop us from trying!

I spent the last few weeks checking in with some SDN pioneers to find out what’s going on that’s of interest in the SDN space these days. Among those experts whom I spoke with were Chris Small (CS), Network Researcher at Indiana University, Phil Porras (PP), Program Director at the Computer Science Lab of SRI, and Dan Talayco (DT), Member of the Technical Staff at Big Switch Networks. The following are some excerpts from my discussions:

What are the top projects in your mind going on right now around OpenFlow and SDNs?

DT: “It’s hard for me to choose just a couple to talk about.  Which is a great thing, isn’t it?  There are three very different parts of the ecosystem in SDN.  First, there are the switches providing the infrastructure that moves packets. Then there are controllers. This is a layer of centralized software controlling the forwarding behavior of the infrastructure (most often through the OpenFlow protocol) and providing a platform for the third layer, which is all the SDN Applications. These are software programs that run on controllers. They are given visibility into the topology of the network and are notified of events in the network to which they respond.

Here are four open source SDN projects I’d point to.  I’m more familiar with the lower two layers (switches and controllers), so mine are from there:

Floodlight is an open source controller written in Java. It was introduced less than a year ago, I believe, but has been gaining rapid acceptance in the OpenFlow community; currently it has more public forum discussion traffic than all other controllers combined.

Open vSwitch (OvS) is a multi-layer virtual switch released under the open source Apache 2.0 license.  Its focus is primarily as a virtual switch, though it has been ported to various hardware platforms as well.  Some of the originators of OpenFlow created OvS.

OFTest was developed at Stanford.  It’s a framework and set of tests implemented in Python that give people a way to validate the functionality of their OpenFlow switches.  There was even a simple software switch written in Python to validate OpenFlow version 1.1 that is distributed with OFTest.

Indigo is a project, also started at Stanford, providing an implementation of OpenFlow on hardware switches.  It runs on several hardware platforms and has been used in a number of different environments.  This project is currently being updated to describe a generic architecture for OpenFlow switches targeting hardware forwarding.”
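Dan’s three layers can be sketched schematically. Nothing below is a real controller’s API – the class and method names are invented purely to illustrate an SDN application reacting to a network event by installing a flow rule through a controller:

```python
# Schematic of the three SDN layers: switches forward per their flow
# tables, a controller programs the switches, and applications register
# with the controller, receive network events, and react to them.

class Switch:
    def __init__(self, name):
        self.name = name
        self.flow_table = []  # ordered list of (match, action) rules

class Controller:
    def __init__(self):
        self.switches = {}
        self.apps = []

    def add_switch(self, switch):
        self.switches[switch.name] = switch

    def register_app(self, app):
        self.apps.append(app)

    def notify(self, event):
        for app in self.apps:      # apps are notified of network events
            app.on_event(self, event)

    def install_rule(self, switch_name, match, action):
        self.switches[switch_name].flow_table.append((match, action))

class DropScannerApp:
    """Toy SDN application: block hosts that look like port scanners."""
    def on_event(self, controller, event):
        if event["type"] == "port_scan_detected":
            controller.install_rule(event["switch"],
                                    match={"src_ip": event["src_ip"]},
                                    action="drop")

ctl = Controller()
ctl.add_switch(Switch("edge1"))
ctl.register_app(DropScannerApp())
ctl.notify({"type": "port_scan_detected", "switch": "edge1",
            "src_ip": "203.0.113.9"})
print(ctl.switches["edge1"].flow_table)
```

In a real deployment the controller speaks the OpenFlow protocol to the switches; the layering – infrastructure, controller, applications – is the same.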

CS: “While the work that’s being done with the controllers is very important, I think the most interesting pieces to look at are the actual applications. These help us make sense of what’s possible. The first one that I think is interesting is one we are doing at Indiana University: an OpenFlow load balancer called FlowScale. We have deployed it in our campus network, in front of our IDS systems, and are taking all of our traffic through it (a 48-port, 10-Gig switch). It does all the routing, failover, etc. you would want a load balancer to do, but cheaper than an off-the-shelf solution.

The other key project I would look at is the work that CPqD is doing. They are basically a Brazilian Bell Labs, and they are working on RouteFlow, which runs a virtual topology with open source software and then replicates that virtual topology into the OpenFlow switches. This is how you can take a top-of-rack switch, convert it into a very capable router and integrate a lot of the different capabilities needed for research, campus and enterprise deployments.”
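The heart of an OpenFlow load balancer like the one Chris describes is simple: hash each flow to one of the IDS-facing ports, so a given flow always lands on the same sensor. A toy sketch follows – the hash choice and port numbers are illustrative, not FlowScale’s actual implementation:

```python
# Toy flow-hashing sketch of an OpenFlow load balancer: pin each flow
# (keyed on source/destination) to one of N IDS ports so the same flow
# always reaches the same sensor. Details are illustrative only.
import zlib

IDS_PORTS = [1, 2, 3, 4]   # switch ports facing the IDS sensors

def pick_port(src_ip, dst_ip):
    key = f"{src_ip}->{dst_ip}".encode()
    # crc32 is stable across runs (unlike Python's salted hash())
    return IDS_PORTS[zlib.crc32(key) % len(IDS_PORTS)]

# The same flow always maps to the same port:
assert pick_port("10.0.0.1", "10.0.0.2") == pick_port("10.0.0.1", "10.0.0.2")
print(pick_port("10.0.0.1", "10.0.0.2"))
```

In the OpenFlow version, the controller would install a per-flow rule on the switch after the first packet, so subsequent packets are forwarded at line rate with no controller involvement.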

PP: “I’ve been looking at this space with respect to security and think there are a few core strategies that researchers are exploring to see how best to develop security technology that can dynamically respond to either threats in the network or changes in the OpenFlow stack. The idea is to monitor threats and then have the security technologies interact with the security controllers to apply new, dynamic mediation policies.

There is FlowVisor, led by Ali Al-Shabibi out of Stanford and Rob Sherwood (who used to be at Stanford, but is now at Big Switch), which works to secure network operations by segmenting, or slicing, the network control into independent virtual machines. Each network slice (or domain) is governed by a self-contained application, architected to not interfere with the applications that govern other network slices. Most recently, they started considering whether the hypervisor layer could also be a compelling layer in which to integrate enterprise- or data center-wide policy enforcement.

We [at SRI] have been working on FortNOX, which is an effort to extend the OpenFlow security controller to become a security mediation service – one that can apply strong policy in a network slice to ensure there is compliance with a fixed policy. It’s capable of instantiating a hierarchical trust model that includes network operations, security applications, and traditional OpenFlow applications. The controller reconciles all new flow rules against the existing set of rules and, if there’s a conflict, the controller, using digital signatures to authenticate the rule source, resolves it based on which author has highest authority.

CloudPolice, led by Ion Stoica from U.C. Berkeley in concert with folks from Princeton and Intel Labs Berkeley, is trying to use OpenFlow as a way to provide very customized security policy control for virtual OSs within the host. Here, the responsibility for network security is moved away from the network infrastructure and placed into the hypervisor of the host to mediate the flows with custom policies per VM stack.

The University of Maryland, along with Georgia Tech and the National University of Sciences and Technology (Pakistan), is working on employing OpenFlow as a delivery mechanism for security logic to more efficiently distribute security applications to last-hop network infrastructure. The premise is that an ISP or professional security group charged with managing network security could deploy OpenFlow applications into home routers, which is where most of the malware infections take place, to provide individual protection and better summary data up to the ISP layer (or other enforcement point) to produce both higher fidelity threat detection and highly targeted threat responses.”
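The reconciliation idea Phil describes for FortNOX can be sketched as follows. The authority levels and rule format are invented for illustration (the real system also authenticates rule authors with digital signatures):

```python
# Toy sketch of authority-based rule reconciliation, in the spirit of
# FortNOX: a new flow rule that conflicts with an existing rule only
# wins if its author holds higher authority. Levels and rule format
# are invented for illustration.

AUTHORITY = {"security_app": 3, "operator": 2, "openflow_app": 1}

def conflicts(a, b):
    """Two rules conflict if they match the same flow but disagree on action."""
    return a["match"] == b["match"] and a["action"] != b["action"]

def try_install(rule_set, new_rule):
    for existing in rule_set:
        if conflicts(existing, new_rule):
            if AUTHORITY[new_rule["author"]] > AUTHORITY[existing["author"]]:
                rule_set.remove(existing)   # higher authority overrides
                break
            return False                    # rejected: lower/equal authority
    rule_set.append(new_rule)
    return True

rules = [{"match": "src=10.0.0.5", "action": "drop", "author": "security_app"}]
# An ordinary OpenFlow app tries to re-allow a quarantined host:
ok = try_install(rules, {"match": "src=10.0.0.5", "action": "allow",
                         "author": "openflow_app"})
print(ok, len(rules))   # → False 1
```

The quarantine rule survives: the lower-authority application cannot silently undo what the security application enforced, which is the property FortNOX is after.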

Why are these projects important?

DT: “Because controllers are the pivot between switching and SDN applications, it’s a really important part of the system to develop right now.  This is why I think Floodlight is so important.  It’s been exciting to see the growing public contributions to the basic functionality and interfaces that were originally defined.  I think a full web interface was recently added.

What’s important is changing, though, because of new projects and the rapidly growing ecosystem we are seeing. For instance, OFTest has started to get more attention again, partly because we’ve been adding lots of tests to it and partly because the broader ONF test group has been developing a formal test specification.

OpenFlow on hardware is still interesting to me because I think being able to control and manage the forwarding infrastructure via SDN will be important for the foreseeable future and maybe forever.  This is why I continue to be active in Indigo.”
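The OFTest approach – drive a switch with crafted input and assert on what comes out – can be illustrated with a trivial in-memory switch model. The real framework speaks the OpenFlow protocol to actual hardware or software switches; everything below is simplified and hypothetical:

```python
# Toy illustration of the OFTest idea: install a flow rule on a switch
# (here a trivial in-memory model), send packets at it, and assert that
# forwarding matches the rule. Real OFTest validates real switches.

class ToySwitch:
    def __init__(self):
        self.rules = {}   # dst MAC -> output port

    def add_flow(self, dst_mac, out_port):
        self.rules[dst_mac] = out_port

    def forward(self, packet):
        # Flood when no rule matches, like a learning switch would.
        return self.rules.get(packet["dst_mac"], "flood")

def test_forwarding():
    sw = ToySwitch()
    sw.add_flow("aa:bb:cc:dd:ee:ff", out_port=3)
    assert sw.forward({"dst_mac": "aa:bb:cc:dd:ee:ff"}) == 3
    assert sw.forward({"dst_mac": "11:22:33:44:55:66"}) == "flood"

test_forwarding()
print("forwarding tests passed")
```

The value of a shared test suite is exactly this: every switch implementation, hardware or software, can be checked against the same behavioral assertions.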

CS: “FlowScale is a proof point of the flexibility of OpenFlow and its potential to enable innovation. If you have an application that you want to deploy, you don’t have to wait for vendor implementations or for hardware that’s capable; you can take existing hardware and a little bit of software and implement it very quickly. For example, we have been working with other researchers who are interested in new multicast algorithms or a PGP implementation; instead of having to wait for major vendors to decide it’s okay to put these in their hardware, we can implement them very inexpensively, try them at line rate, and then deploy them more widely.

It’s a little like the stuff that ONRC, the collaboration between Stanford and Berkeley, has been working on these past years. They are doing a lot of proof-of-concept applications with OpenFlow and continue to push new ideas out. They are taking new research and building implementations that can be used in the future for new products. These applications are further out, but they give you ideas about what can maybe be expanded on and made into new products. They have worked on a number of research projects – such as load balancing as a network primitive (which we incorporated into FlowScale) and their recent Header Space Analysis, which can verify the correctness of the network and ensure the policy of the network matches its actual physical deployment.

RouteFlow is important because it proves you can remove the complexity from the hardware and get the same capabilities; it puts all the features and complexity in the PCs rather than the switches. We have been working with them on a demonstration of it at the Internet2 Joint Techs Conference, where we are going to show RouteFlow operating in hardware switches as a virtualized service deployed on the Internet2 network. This is the first time we have seen anything like this on a national backbone network.”

PP: “The security projects represent two branches of emphasis: one focused on using SDNs for more flexible integration of dynamic network security policies and the other on better diagnosis and mitigation. One branch is exploring how and where dynamic network security can be implemented in the OpenFlow network stack: the controller (control plane), the network hypervisor (FlowVisor), or even the OS hypervisor. The other branch is attempting to demonstrate security applications that are either written as OpenFlow applications for more efficient distribution or are tuned to interact with the OpenFlow controller to conduct dynamic threat mitigation.”

What are some of the hurdles?

DT: “The rapid change in the OpenFlow protocol specification has been a challenge we’ve all faced. It’s probably a symptom of the desire to drive change into these projects as quickly as possible. OvS, for instance, has not been updated beyond OpenFlow 1.0, though it has a number of its own extensions.

The second challenge faced by those working on open source, especially at the protocol level, is that there are often conflicting requirements between generating code that can serve as a reference to aid understanding, versus code that can provide a basis for developing production-quality software.

The Indigo project has suffered from two other things: first, the high expectations that it should provide a complete managed switch implementation, which normally takes a large company to implement and support; and second, the fact that there is still a significant component that’s only released as a binary. I think as the community moves forward, we are going to see additional work that will make it a lot easier to use all these tools and products in many environments.”

CS: “Right now OpenFlow projects on hardware switches are still immature. It’s important to recognize it’s a different technology, with different limitations, and there are some things that are simply not possible right now. But if you don’t need that complete list of features, then it may make perfect sense to use some of these applications. Looking at the space, it’s easy to recognize that things are moving along quite rapidly, with new vendors, specifications, hardware support, etc. every day, so things will catch up and we will be able to implement many things that are not possible right now.”

PP: “The entire concept of SDN appears to be antithetical to our traditional notions of secure network operations. The fundamentals of security state that at any moment in time you know what’s being enforced. This requires a well-defined security policy instantiated specifically for the target network topology, that can be vetted, tested and audited for compliance.

Software defined networks, on the other hand, embrace the notion that you can continually redefine your security policy.  They embrace the notion that policies can be recomputed or derived just in time, by dynamically inserting and removing rules, as network flows or the topology changes. The trick is in reconciling these two seemingly divergent notions.

In addition, OpenFlow applications may compete, contradict, override one another, incorporate vulnerabilities, or even be written by adversaries. The possibility of multiple, custom and 3rd-party OpenFlow applications running on a network controller device introduces a unique policy enforcement challenge – what happens when different applications insert different control policies dynamically? How does the controller guarantee they are not in conflict with each other? How does it vet and decide which policy to enforce? These are all questions that need to be answered in one way or another.

I think it’s best to have these conversations about how we envision securing OpenFlow and empowering new security applications now. Security has had a reputation of being the last to arrive at the party. I think this is a case where we could help make a big positive impact on a technology that could, in turn, provide a big positive impact back to security.”

What Does the Future Look Like for Open Source and SDNs?

DT: “I think we are going to see new architectures and reference implementations that will accelerate the deployment of SDNs in the very near future. People are often dismissive of ‘one-off’ projects, but the reality is that we face a host of problems, each of which requires a slightly different solution, while all of them can be addressed by SDN approaches. These projects are already coming out of the woodwork as more people better understand SDN. I’ve heard a few people start to say ‘the long tail is the killer app for SDN.’”

CS: “I believe there will be bottom-up adoption, where more and more applications are implemented until there is critical mass and it makes more sense, from a time and cost perspective, not to have to manage two different networks – traditional and SDN-based. When that happens I think we will see a switch to SDNs.”

PP: “OpenFlow has some very exciting potential to drive new innovations in intelligent and dynamic network security defenses for future networks. Long term, I think OpenFlow could prove to be one of the more impactful technologies driving a variety of new solutions in network security. I can envision a future in which a secure OpenFlow network:

  • incorporates logic at the control or infrastructure layer to mediate all incoming flow rules against an organization’s network security policy in a way that can’t be circumvented and is complete.
  • allows the full dynamism of OpenFlow applications to produce optimal flow routing decisions, while being free to remain unaware of the current security policy and not depended upon to preserve network security. Rather, operators will trust that security enforcement will occur at the control or infrastructure layer.
  • enables InfoSec practitioners to develop future powerful OpenFlow-enabled security applications that can dynamically reprogram flow routing to mitigate threats to the network, remove or quarantine assets that violate security or fail to exhibit runtime integrity, and react to network-wide failure modes.

When we can achieve all three of these, we’ll be able to provide some compelling reasons why OpenFlow has a distinct advantage over existing networking, while instilling the confidence we need to embrace all the other benefits of SDNs. I believe we can reconcile static and dynamic policy enforcement and create all new mitigation services that are much more intelligent and effective countermeasures to better defend our networks.”

10:41 pm pdt

Monday, April 26, 2010

Protecting Children Online - Part II: Quick Tips

My last blog focused on some general guidelines to protect our children online; here are some quick, concrete tips to keep them safe:

--  Make sure usernames/screen names/email addresses do not have any personally identifiable information

Stay away from initials, birthdates, hobbies, towns, graduation year, etc.

The smallest piece of identifiable information could lead a predator to you - remember they are highly motivated

--Don't link screen names to email addresses - if a child gets an email, they tend to think it is okay; it's not. Reiterate that if they don't actually know the person, that person is a stranger, regardless of how they make contact.

--Set up their buddy/friends list and regularly update and check them to ensure your kids are only interacting with people they actually know; this goes for their phone too.

--Don't post personal information - don't respond to requests from people OR companies

eMarketer found that 75% of children are willing to share personal information online about themselves and their family in exchange for goods and services

--Keep the computer in a public part of the house
--Consider limiting the amount of time they can spend on their phone, iPod, iPad, computer, etc. to whatever you deem as reasonable.

--Regularly check their online surfing history - know exactly where they are going and talk to them about it, so they know you know.

--Use filtering software to prevent access to things you know are bad. Note: only 1/3 of households are using blocking or filtering software.

--Protect your computing resources

Use parental controls - check out Norton's family plan as an example of tools you can consider installing

Here's a list from InformationWeek on security technologies (protection from viruses, bots, Trojans and other malware) you might want to consider

Note: be sure to use software from a reputable source; otherwise you may be unwittingly downloading malware that can do more harm than good

Make sure it offers a wide range of protection - different attacks use different methods to infiltrate your computer and you want full coverage

--Follow good rules of thumb

Don't open anything (emails or attachments) from anyone you don't know

Don't open anything that looks a little too good to be true - it probably is

Make sure your email doesn't automatically open emails - check your settings

10:19 am pdt          Comments

Protecting Children Online - Part One

Kids will be kids; they will be curious, test boundaries, and do things that show less than stellar judgment. As parents, we try to guide, support and love them to keep them safe and on a productive path. Inevitably, our efforts collide - you've all seen the tween/teen TV dramas - the problem is that in this digital age the opportunities for unhappy outcomes have grown. 

This just means we have to redouble our efforts; we need to connect with our kids and give them the tools they need to navigate and stay safe both in the physical world and the online one.  From day one, we teach our kids to look both ways before crossing the street, to never take anything or go anywhere with strangers, to walk away from a fight, to speak up when someone is not being nice, to say no to drugs, etc. We need to teach our kids to do the same things when they go online.

<div style="border-top: thin gray solid; border-bottom:  thin gray solid; padding: 20px; margin: 20px 2px; width: 46em;"><a href=""><img style="float: left; border: none;padding-right: 10px;" src="" /></a>Sarah Sorensen is the author of <a href="">The Sustainable Network: The Accidental Answer for a Troubled Planet</a>.<br /><br />The Sustainable Network demonstrates how we can tackle challenges, ranging from energy conservation to economic and social innovation, using the global network -- of which the public Internet is just one piece. This book demystifies the power of the network, and issues a strong call to action.<br /><br clear="left"></div>We need to remove the idea that stuff online is "not real," or that it doesn't have consequences. We need to drill into them that they will be held accountable for what they do and say when they are online, just as they would be when they are at home or at school. Explain to them that they need to think before they post and they don't have a right to post whatever they want. For example, "sexting" or sending racy photos to your boy/girlfriend is not harmless, even if they are the same age as you; those messages can go everywhere and could be considered child pornography.  <a href="">Cyberbullying</a> is a real problem, with real consequences - threatening someone online is just the same as threatening them on the playground.

Actually, the online world opens up new ways for predators or bullies to get at their victims. Unlike the bully on the playground, whom your child can get away from by going home, the cyberbully can follow your child wherever they are. They can send menacing texts to your child's phone, make hurtful comments on their Facebook page, take and post photos of them with their digital cameras, and pop up and threaten them as they interact in digital worlds and games (such as <a href="">Gaia</a>, <a href="">Second Life </a>and <a href="">World of Warcraft</a>).

We need to ensure they protect themselves; that they are aware of their surroundings and understand that they shouldn't trust anyone that they don't physically know. As I mentioned in a past <a href="">blog</a>, there are three guiding principles that can help kids stay safe:

1. Don't share any personal information
2. Remember that everyone is a stranger
3. Know there is no such thing as private

But, let's face it, even the best kids (and adults) make mistakes. It's inevitable. They get curious or drop their guard, or do something without thinking through all the consequences.  

<blockquote>By the way, there is <a href="">new research </a> that provides some insight into the question most of us parents have asked, "what were you thinking?" - it turns out that children's brains (until their mid-20s) may not be as adept at thinking through the consequences of their actions, because their brains process information differently than adults'. (hmmm, what's my excuse?) </blockquote>

At these times, it's good to remember why kids go online in the first place. It may be they are looking to figure something out, want to fit in or belong, hope to be popular, or want to escape reality.  The best thing we, as parents, can do is understand why our children are going online - are they researching for school, playing video games, chatting with their friends, exploring, etc.?  We need to talk to them, get involved and know exactly what they are doing, so we can monitor their behavior and identify changes that might indicate something is wrong.

And sometimes, they find themselves in situations that they didn't intend to get into and are uncertain how to extract themselves from.  At these times, we hope they turn to us, their parents, for help, so we can work through the problem together. However, they are often afraid to come to us because they:

1. Don't want to be restricted from using the computer - which may be their social lifeline
2. May not want to expose the offender (typically in cases of abuse, the victim has formed a relationship with the abuser, who has invested the time to gain their trust and be their "friend" - for a child, the average predator will talk to them for 4 to 6 months before approaching them for more)
3. Believe the threats of the offender that something bad will happen to them or their family if they tell
4. May fear punishment for their own bad behavior or participation in the activity
5. Are embarrassed that they fell for the scam or were used in this way

Understanding why they may not approach a parent is important, so you can try to address these fears head on.  Again, there is no substitution for ongoing communication; but research shows that only 15% of parents are "in the know" about their kids' social networking habits, and how these behaviors can lead to cyberbullying. So, talk to your kids about the dangers and look for changes in their behavior. Have they suddenly lost all interest in going online? Do they shun their phone after getting a few texts? Are they irritable or demonstrating big mood swings? 

Offer them a safe environment where they can participate in online activities. Make sure they know you are paying attention to what they are doing while online, and ensure they know they can confide in you and ask for your help the second something feels strange or uncomfortable. Apply the same good parenting skills and tactics that you would use in the physical world to your child's activities in the online world to help keep them safe.  And just as in generations past, we should strive to ensure they have the tools they need to go out on their own and navigate the world; it's just that the world is a lot more connected now, presenting our children with both greater risks and possibilities.  

10:19 am pdt          Comments

Opinion - How the Role of the F.C.C. Impacts Internet Providers

On April 6th, a federal appeals court ruled that the F.C.C. did not have the authority to regulate how Internet service providers manage their networks. At issue was Comcast's right to slow customers' access to the bandwidth-intensive file-sharing service BitTorrent. While it can now limit traffic that overloads the network, Comcast was careful to say that it had changed its management policies and had no intention of doing so.

These comments were most likely intended to ease the minds of those who recognize the effect that this court ruling has on the F.C.C.'s authority to mandate "net neutrality." Advocates of net neutrality worry that this decision is going to give providers free rein to control what a user can and cannot access on the network. 

It is this point that many of the media outlets focused on, turning this case into a potential watershed moment for watchdogs looking for unfair and biased treatment of traffic by Internet service providers.  A single instance of seemingly preferential treatment of one type of content over another could end up causing a provider to lose the trust of their customers. It could also be reason enough for Congress to step in and explicitly grant the F.C.C. the authority to regulate.

As such, it is more important than ever for Internet service providers to be transparent in their actions to sustain customer loyalty. They need to make sure customers know how they plan to manage their networks and what to expect in order to build trust and a lasting relationship.  Given that the national focus is on increasing Americans' access to high-speed Internet networks, anything seen to be contrary to achieving that goal, regardless of whether it is real or simply perceived, will reflect very negatively on that provider's brand.

This is probably why Comcast's statement around the verdict was subdued and focused on the future: "Comcast remains committed to the F.C.C.'s existing open Internet principles, and we will continue to work constructively with this F.C.C. as it determines how best to increase broadband adoption and preserve an open and vibrant Internet."

Providers who want to allay customer fear and skepticism around their motives should make an extra effort to reaffirm their commitment to providing high-speed access and high-quality services. They should start to have an authentic, ongoing dialogue (that is threaded through everything from their Web and social media communications to policies and procedures) that explains the challenges associated with supporting all the different demands of high-bandwidth applications and exactly what they are doing or are going to do to meet these challenges.  Only if customers trust that they are providing an equal opportunity service will providers be able to sustain their business without a lot of regulation.

10:18 am pdt          Comments

Hard Drives Can Pose Risks to Sustainability

Extending the use of computing devices is critical if we are to create more sustainable consumption. We can divert waste from landfill and reduce the energy it takes to extract materials and build new devices, if we can lengthen the life of the devices we already have or find new ways to use their components.

I think most of us try to recycle our devices and are happy to pass along those that no longer meet our needs. But what if a device's reuse poses a risk to you?  Hard drives can pose such a risk and, as such, often have their lives and usefulness cut short.

What do you do with your hard drive, which often houses all of your intellectual property and sensitive information, when you are done with it? How do you make sure your information isn't found and used by someone else? Just deleting the information off of it doesn't mean it's gone; it is not too difficult to get the data back. (Something I am often thankful for when I delete a file by accident, but which opens up a huge risk when you really want to get rid of the information.) Even when your hard disk is corrupted or physically damaged, all is not lost (just do a quick <a href="">search</a> on hard disk recovery and you will find a whole host of sites and solutions that will help you recover the information).

It's no wonder that organizations that can afford them have "disk drive chippers" that completely destroy a hard drive once it is no longer needed, so that no data can be recovered from it. Others go a more conventional route and use what a colleague of mine calls "Fred Flintstone" or "Young Dr. Frankenstein" techniques - you get the picture.

But wouldn't it be more sustainable if we could extend the life of that device? What if there was a reliable way to permanently erase the data on it without having to shred the device?  Just because the model is no longer of use to you, it is very likely it would suit the needs of someone else. We could divert that device from landfill for a little while longer. Then, because we have a way to erase the data, we could explore recycling and reusing the components to further reduce waste.

This is something that has been done with cell phones and copiers; they often receive an extended life in the hands of those who find an older model perfectly suitable. (I know I have donated my cell phone in the past; it's easy to <a href="">search </a>to find organizations in your area who have needs.) But is this safe to do now?

In the past, phones were only used for voice calls - the data potentially exposed consisted of your phone book. Remove your SIM card and you could be fairly sure that future users would not find anything personal left on your phone.  Today's smart phones have the computing power of many desktops; they are being used to conduct our business and personal lives. Ever search the Web? Take a photo? Check your bank account? Pay a bill? Read your email? Download a file? Think of all the data that is potentially stored on the drive that now sits inside that phone... how do you make sure it is gone when you are done with the phone? Does this mean we are back to destroying the device? Again, it would be great to know that we can reliably erase the data, so the device can be used by someone else.

The same goes for photocopiers; most copiers built over the past five to seven years are networked to a variety of computing devices, and each has a hard drive that records all the information that is copied, printed, faxed or scanned. Since most organizations don't want to spend the capital to buy a copier, they lease it from a provider (which also enables them to offload the repairs and maintenance). When the lease is up, the copier provider will come, delete the data, and send it off to another customer. But we have already mentioned that simply deleting data doesn't mean it is gone. So these copiers can provide a wealth of information to those who know to look for it. (Check out <a href="">this site</a> to get some tips on how to protect yourself when using a copier). Again, this doesn't make it a sustainable solution.

So what can you do? As an organization, you
• Need to first put in place a proactive data leak prevention program: only after you are sure you can identify all the potential risks can you put the processes or technologies in place to mitigate them.
• Consider using an enterprise-class disk management program that adheres to any of the eradication standards used by many international governments and militaries (such as DoD 5220.22, Gutmann method, Schneier Standard, AFSSI 50220, NAVSO P5239-26, VSItR, AR 380-19, GOST P50739-95, Crypto-secure Random Data).
• Ensure you can securely delete data from hard drives, including "locked" or "in-use" files.
o This requires overcoming some operating system limitations that exist to ensure continual operation - which is what you want when you are using the system, but not so great when you want to get rid of the data.
o So, make sure you are able to delete all the different file systems from all the different operating systems you have on the device.
• You also want to make sure that you can eliminate "zombie-data" stored in the recycle bin or in the blank space of the hard drive.

For individuals:
• You can download software that enables you to erase hard drives, such as Active@ KillDisk from LSoft Technologies. These tools write over the data, because deleting files and reformatting the drive doesn't actually remove them.
o Note, data that has been written over only one or two times can be recovered; however, it takes expensive equipment to do. So unless you are expecting a super sleuth or crime lab to want to read your data, you are probably safe.
o If in fact you are worried about professionals taking the time to get at your data (you probably have bigger problems than I can imagine!), experts recommend rewriting the data seven times to make sure it is unrecoverable.
o Make sure you pay attention to those files that are "locked" or "in-use" and "zombie data"- you don't want to leave them on the drive.
• Something to think about is the ability to remotely initiate and manage an erasure, so that if your phone or computer is lost, you can delete the data as soon as it connects to the network.
o Some operating systems have a "kill pill" feature that allows you to remotely erase and lock the device; make sure it's enabled. 
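The overwrite approach these tools take can be sketched in a few lines of Python. This is purely illustrative (it assumes a plain file on a traditional magnetic disk; journaling filesystems, SSD wear-leveling and backups can all leave copies behind, which is why the dedicated tools mentioned above are the safer choice):

```python
import os

def overwrite_and_delete(path, passes=3):
    """Overwrite a file's contents with random bytes several times,
    then remove it. Illustrative sketch only -- not a substitute for
    dedicated disk-wiping software."""
    size = os.path.getsize(path)
    with open(path, "r+b") as f:
        for _ in range(passes):
            f.seek(0)
            f.write(os.urandom(size))  # replace the old bytes with noise
            f.flush()
            os.fsync(f.fileno())       # push each pass out to the disk
    os.remove(path)
```

For the seven-pass recommendation above, you would call `overwrite_and_delete(path, passes=7)`.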

Once the hard drive no longer poses a risk, it can be reused. The goal is to promote a more sustainable way to use technology, so we can reduce our impact and drive change on a global scale. 

10:18 am pdt          Comments

Online Dangers - Three Principles Every Parent Should Instill

I believe strongly in the potential of the network - heck, I wrote a <a href="">book </a>about it - however, I also understand the same connections that can be used for good can also be used for bad. And the reality is they can be downright dangerous for our children, who can be bullied, stalked and targeted online.

How prevalent is it? The statistics are alarming. One in five teenagers in the US has received an unwanted sexual solicitation online, according to the <a href="">Crimes Against Children Research Center</a>. <a href="">Child pornography</a> is one of the fastest-growing businesses online. The National Crime Prevention Council suggests that more than half of American teens are exposed to some sort of <a href="">cyberbullying</a> and the Kids Helpline found as many as 70% were harassed online.

Unfortunately, these statistics became more personal for me when I learned of a recent incident in our local middle school. And if you are thinking, "Well that's there, it's not happening in our school district," you may want to check with your city's police or even just search your local news; you will find these crimes can and are taking place everywhere. So what can you do?

As a parent, it's natural to want to remove the threats and simply shut down your children's access to the Internet. But are you really prepared to not only cut off access to their computer, but also their cell phone, digital camera, iTouch, video game consoles (Wii or PlayStation), etc.? Let's face it, we live in a digital age and the network is embedded in almost everything we do; so rather than ban it, we need to teach our children how to use it safely and effectively.

I think the following three principles are a good start. Every parent should make sure their kids:

1. <strong>Do not share any personal information </strong>- Most obvious is name, age, school, hometown, etc.; less obvious, but no less telling for someone who is paying attention and motivated to figure it out are photos with a school jersey, the name of your local park, the location of your vet, the theater you are going to be at on Friday night, etc.  Don't reveal anything that could enable someone you don't know to figure out who you are and find you.
2. <strong>Remember that everyone is a stranger </strong>- Unless you actually know them, meaning they are a family member, a neighbor, someone you go to school with or know from clubs and extracurricular activities, they are not your "friends," they are strangers. You should not talk to them, take any gifts they may offer, or agree to do anything for them. Unlike the stranger in the mall, whom you can at least see, when you meet someone online you have NO IDEA who they really are. Don't engage.
3. <strong>Know there is no such thing as private </strong>- When you are online, the information you put out there can be found and accessed by almost everyone. This goes for texts, photos, videos, etc. Think before you post anything - is it something you want to see on the front page of a newspaper? If not, don't do it.

And of course, the most important thing that our children need to know is that they can come to us, no matter what, and we will help them. As in the physical world, there is no substitute for being involved in their lives and that goes for their online activities. Make sure they know you are there and that should anything uncomfortable or threatening arise, you will support them.

10:17 am pdt          Comments

F.C.C. Plans Have Potential to Accelerate the Roll Out of the Sustainable Network

Tomorrow, the F.C.C. is putting forth to Congress a 10-year plan focused on developing high-speed Internet access as the dominant communications network. Up for debate are a recommendation for a subsidy for Internet providers to wire rural parts of the country, an auction of broadcast spectrum for wireless use (the goal is to free up roughly 500 megahertz of spectrum, much of which would come from TV broadcasters, for future mobile broadband uses), and the development of a new universal set-top box that connects to the Internet and cable service.

The proposal includes reforms to the Universal Service Fund to focus on broadband access and affordability. It also calls for a "digital literacy corps" to help unwired Americans learn online skills, and a recommendation for $12 billion to $16 billion for a nationwide public safety network that would connect police, fire departments and other first responders.

It strives to put a stake in the ground for standard broadband speeds, with the promise that the F.C.C. will begin assessing the speeds and costs of consumer broadband service. In conjunction, consumers will be encouraged to test the speed of their home Internet access through a new suite of online and mobile phone applications that will be released by the F.C.C. to see if they are getting the promised speeds for which they are paying.

This move by the F.C.C. comes on the heels of Google's announcement that it would offer ultrahigh-speed Internet access in a few communities to showcase what's possible with faster broadband networks. Google's move was seen as a prod in the direction now being taken by the F.C.C. to make sure that high-speed networks are truly available nationwide.

What this will do to the industry of network providers who are currently trying to carve out their place and create business models that will enable them to justify the investments needed to create this high-speed network reality is yet to be determined. But it is clear that this move by the F.C.C. will have an effect on public policy for years to come and definitely puts pressure on the network offerings of existing providers. Stay tuned. It is going to be an interesting journey; one that has the potential to bring the best platform we have for sustainable progress, change and action to us all.

10:16 am pdt          Comments

Reflections on RSA - Security is Really a Control and Data Management Problem

This week, I spent some time at <a href="">RSA</a>, an event where security vendors and professionals connect. As I have mentioned in past <a href="">blogs</a>, security is paramount to the sustainability of the network. If we are to leverage the network as a powerful tool for change, we need to be able to trust that the information and resources on it are secure.

As recent headlines have demonstrated, attacks on the network are ever-present; 2009 saw <a href="">malware and social networking attacks surge</a> (spam carrying malware was averaging 3 billion messages each day by the end of the year) and <a href="">increasingly sophisticated mobile attacks </a>emerge. Just as in the physical world, there are individuals motivated by greed, power and personal gain (the <a href=",289142,sid14_gci1389667,00.html">rise </a>and <a href="">co-opting </a>of the <a href="">Zeus attacks</a>, which originally targeted financial institutions, is just one example - to date it has infected about 74,000 PCs, and that's just one attack), and there are those who are looking to achieve <a href="">political</a> or ideological ends.  

But, as the show floor and conference discussions demonstrated, there are a lot of technologies out there designed to help organizations combat and mitigate all these attacks. There are literally thousands of companies, focused on everything from user and data authentication to spyware and cloud security. So why is it that even though there is an answer or feature out there for almost every threat or need, organizations are still struggling to protect the network? I think it's because security is more of a control and data management problem than a feature-set issue.

I heard <a href="">Palo Alto Networks </a> talk about controlling exactly what should and should not be allowed on the network, based on the user and their role, the application, and exactly what they are trying to do. This approach makes sense because with a focus on control, you can eliminate a lot of the risks right off the bat. You can restrict peer-to-peer traffic and file-sharing applications that can be used by attackers to gain access to the network (through malware/trojans) and all its resources. The key is to have this level of control over every aspect of your network, from the edge to the core and within the hosts themselves, and then, for what is allowed, look for threats and mitigate attacks within that "allowed" traffic.
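To make the positive-control idea concrete, here is a minimal sketch of a default-deny policy keyed on user role, application and action. The roles, application names and rule format are all invented for illustration; this is not Palo Alto Networks' actual policy model:

```python
# Hypothetical positive-control policy: traffic is allowed only if an
# explicit (role, application, action) rule matches; all else is denied.
ALLOW_RULES = {
    ("engineer", "ssh", "connect"),
    ("finance", "erp", "read"),
    ("finance", "erp", "write"),
    ("any", "web-browsing", "read"),  # "any" acts as a wildcard role
}

def is_allowed(role, application, action):
    """Default-deny check: permit only explicitly allowed combinations."""
    return (role, application, action) in ALLOW_RULES or \
           ("any", application, action) in ALLOW_RULES

# Peer-to-peer file sharing never appears in the allow list,
# so it is blocked for everyone by default.
assert not is_allowed("engineer", "bittorrent", "upload")
assert is_allowed("intern", "web-browsing", "read")
```

The point of the default-deny structure is that risky traffic never needs its own blocking rule; anything not explicitly allowed simply never gets on the network.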

This gets us to the data management problem; a typical network's security infrastructure contains multiple different devices, each with different management consoles, each producing a lot of logs that can contain thousands of pieces of information. Linking all this data and making sense of it all requires a lot of manpower and expertise. Oh, and don't forget that physical security measures, which can also provide clues and contain indicators of risks, are kept almost entirely separate from the network security activities (typically they are run by two different groups with very little connection, though I did see a <a href="">company</a> that was trying to bridge that gap).

I think it is telling that it took Google and a host of other companies targeted by attackers originating in China <a href="">MONTHS</a> to figure out exactly what happened (in fact, I believe the investigation is still going on now). So, under cover of the data deluge produced by all these different security devices, attackers can infiltrate a network and operate undetected. 

All of the calls to better manage business information and increase the value derived from insights and analysis of that information (take a look at last week's Economist's special report) need to be applied to network security. Organizations need a singular, meaningful view into the network that helps them identify in real time what is going on and any threats to that network. To date, I haven't seen big advances on this front; sure, there are the large, generic platforms offered by the likes of HP and IBM, and security-specific management platforms from folks such as ArcSight. I would love to hear from you if you have seen promise in this area. Right now, I think we need more innovation; we need truly comprehensive visibility and the ability to easily and actively control and manage the network. The security and ultimate sustainability of the network as a platform for change is reliant on it.

10:16 am pdt          Comments

Are Books Dead - What Happens When Too Much Information Isn't Enough?

Information is more accessible than ever, and more content is being created on a daily basis than existed in the world 100 years ago. In fact, three years ago, IBM predicted that by 2010 the amount of digital information on the Internet would be doubling every 11 hours! I am not sure if we are there yet, but that milestone is likely not that far off.

As recently as the middle of last century it was reasonable to assume that a scientist or doctor was generally knowledgeable about any type of science or medicine; they could stay apprised of new discoveries, theories or applications in all different fields of study through regular reading of scientific or medical journals. Now, due to the sheer volume of information and advances occurring around the world, scientists and doctors are only able to keep up with their area of study or specialization, and it is unreasonable to think they would have a level of depth and greater understanding in all areas outside of their particular field.


So, how do we navigate this digital information world? How do we try to maintain a real-time understanding of all the things that are important to us? Well, this is where the services and news feeds offered by the Twitters, and Facebooks, and Googles of the world come into play. Through short bursts of information, we are able to stay up to date on our friends and family, local and global communities, activities of interest, etc. Through innovative use of technology and the myriad of applications and services that are delivered by the network, we are constantly finding new and useful ways to search, synthesize and package information, distribute it to interested parties and foster a dialogue that can be global in scope. 

But is this enough? As we struggle to stay on top of everything that crosses our paths, are we missing opportunities to get more out of the information? Are we becoming too much of a "right now" society? Are we able to delve into an issue at length or stick with a topic that doesn't have a quick pithy answer?

My fear is that in our quest for quick information, we may be losing a vital tool in books that have helped us for generations formulate new thoughts, prod and poke at existing conventions, think through the universe's toughest questions and open our eyes to the possibilities. The book is one of the few written word formats that enables topics to be explored and expanded upon in hundreds of pages. As the journalist Edward P. Morgan said, "A book is the only place in which you can examine a fragile thought without breaking it, or explore an explosive idea without fear it will go off in your face.  It is one of the few havens remaining where a man's mind can get both provocation and privacy."

But, it seems its value in this Digital Age is diminished, as the reading of books has been on a steady decline for decades.  Back in 1998, <a href="">surveys in the U.K.</a> showed that "more than one in seven adults had not read a book in the last year and more than one in three has never visited their local library." A survey in 2007 found <a href="">one in four people read no books during a year</a>. Folks like Steve Jobs have even been <a href="">quoted</a> as saying "people don't read anymore." 

But, as I previously noted, the amount of content that is created and consumed on a daily basis online continues to grow at an astronomical rate. So perhaps it is the format that is dead? Perhaps the book as we know it is too antiquated, explaining why it's not drawing our attention as it once did. There are simply too many types of information competing for our time. (Fortune had an interesting cover article on the Future of Reading that's worth checking out.)

This is one reason why the iPad has captured my attention - it could be a road back to the written word of books. As an author, I am interested in the idea that extra features or updates to <a href="">my book</a> could keep the content current and readers engaged on the topic. It can go beyond the search and bookmarking features (which are very cool, by the way) offered for smaller form factors, such as the iPhone, and really start to create a more dynamic and interactive book-reading experience.

We have seen news sites incorporate video and other rich media applications into their reporting and <a href="">students embed video clips</a> into their college application submissions, so it's not a stretch to think that we will soon be seeing commentary from the author or <a href="">upcoming webcasts or talks</a> on topics that relate to the book. I can imagine playing games or taking polls or viewing movie clips for the stories we are reading (though in my mind the movie version is often a shadow of what the imagination can conjure).

I am for anything that will help us reinvent books and inspire a love of reading. Because without books, there is the fear that everyone will know just a little about everything, and have a good understanding about nothing; we will be experts in our lives, but leave thoughts, opinions and worlds outside our immediate needs unexplored. 

With new technologies, such as the iPad, I see that the future of books can be relevant and interactive, helping us once again get lost in a good story or cut through all the quick snippets of information to delve into something in a meaningful way. It has to be, because the content of books is what will help us sustain the deep thinking and in-depth analysis that is required to achieve those "aha" moments that revolutionize the way we live and are needed to solve our biggest problems.

10:14 am pdt          Comments

Super Bowl in the Digital Age

The revelry and rituals of Super Bowl Sunday seem to grow each year. The game takes on a life of its own, bringing unlikely viewers together on the couch to eat, commiserate and cheer for several hours.

It's because the Super Bowl is more than a game; even if you are not a sports fan, there's the pregame show, the national anthem, the halftime show and, let's not forget, the advertisements that keep people watching. This year, a record number of people - Nielsen Co. estimated 106.5 million - tuned in to watch the game from around the world. There are a lot of theories as to why it made viewership history (you can check out the <a href="">Wall Street Journal's take</a>), but I would like to suggest the expanded reach and interest in the game is due, in part, to the many ways in which it is integrated into our digital lives.

Technology is playing a critical role in sports, both improving the experience and extending the life of any particular event. In football (American), the players, teams and league use a broad array of technology to enhance the game. Fans can connect with their favorite teams through their online communities; they can play digital games as their favorite players and participate in Fantasy Football leagues with people from around the globe. All of which serve to increase the interest and affinity viewers have for the game, creating ties to players, organizations and the league that fuel multibillion dollar apparel and merchandising industries. 
<div style="border-top: thin gray solid; border-bottom:  thin gray solid; padding: 20px; margin: 20px 2px; width: 46em;"><a href=""><img style="float: left; border: none;padding-right: 10px;" src="" /></a>Sarah Sorensen is the author of <a href=""><strong>The Sustainable Network: The Accidental Answer for a Troubled Planet</strong></a>.<br /><br /><em>The Sustainable Network</em> demonstrates how we can tackle challenges, ranging from energy conservation to economic and social innovation, using the global network -- of which the public Internet is just one piece. This book demystifies the power of the network, and issues a strong call to action.<br clear="left"></div><br />
In addition, technology can be found throughout football's operations, from the scouting teams to the post-game analysis. Just think of the wealth of information these players and coaches have at their fingertips that can be linked and analyzed a hundred different ways to try to increase competitiveness and gain a mental edge in the game. There are even sensors embedded in the helmets that wirelessly transmit impact data on hits to the head (up to 2000 a year for some players!) to the sidelines to help team doctors monitor the players as they run up and down the field. The list goes on...

Then there is the Super Bowl - the crown jewel of the season - which dominates all types of conversations for weeks if you count all the before- and after-game analysis, and the reality is that many of those dialogues are taking place online. The rich media experiences that are now an integral part of the event create opportunities for businesses and brands to connect and develop relationships with their target audiences. It's the online chatter and buzz, with friends and fans sharing the information and resources that are most relevant to their groups, that are driving sustainable revenue opportunities and mindshare.

In case you missed anything during the game, you can easily go online and get play-by-play coverage, as well as play-by-play commentary. You can watch and review virtually everything to do with the game, from the amazing catches to the halftime show. You can <a href="">vote for your favorite commercials</a>, as fan favorites get a viral marketing life that helps support the business case for spending millions on a 30-second TV spot.

Some advertisers are <a href="">skipping the TV</a> altogether, going straight for interactive social media campaigns. This year, Pepsi, traditionally a stalwart Super Bowl advertiser (spending $142 million on 10 Super Bowl spots over the last 10 years), opted out in favor of using Facebook, Twitter, Ustream and iPhone apps to reach out and try to engage customers with its <a href="">"Refresh Everything"</a> campaign. A strategy that seems to be working - Nielsen Co. reported that PepsiCo got 21.6 percent of the chatter about Super Bowl advertisers over the last two months - way more than its rival, Coca-Cola, received.

And don't forget the money games around the big game - namely the <a href="">betting industry</a> that pulls in big bucks by enticing people to bet on virtually anything, and I do mean anything, related to the game. What influence will technology have? Well, soon, if <a href="">Cantor Gaming</a> has its way, gamblers won't be relegated to sitting at the sports book to place bets; they will be able to do it from anywhere on the casino's premises and will have access to real-time odds. (Actually, if they had their way, you would be able to do it from your mobile phone!)

There is also the money around merchandising for the big game, which has taken on many new dimensions, as retailers scour blogs, chat rooms and Google searches to try to identify where fan loyalties lie and then use the Internet to reach out to those fans to sell them team merchandise and memorabilia (<a href="">check out an interesting article in the New York Times</a>), filling a gap and extending the reach of typically regional retail coverage.

So, while I watched the game yesterday, I was also watching all the activity around the game and thinking about what the future will bring. CBS didn't get its way and the NFL didn't allow <a href="">the game to be streamed live in its entirety</a> online, but it is inevitable. And when that happens, it will add yet another dimension to the game. In short, we are just starting to tap into the opportunities presented by the big game, and we can expect entertainment events, such as the Super Bowl, in the digital age to get bigger and reach broader audiences year after year.

9:54 am pdt          Comments

Why is Simple Soooo Not Simple?

I have been disconnected, without a working computer for a day and a half! You are probably wondering, "how did that happen?" "how did you survive?" "what did you do?" and honestly, I hardly know. It's been a blur. But one thing is crystal clear - a simple upgrade is ANYTHING but simple!

Based on the recommendation of a couple of friends, who had just gotten new computers and were talking up some of the usability features of the Windows 7 operating system, I sat down at my computer and decided I would do the upgrade from XP. The upgrade package had been sitting on my desk for the last couple weeks, and I decided it was time to commit.

Little did I know what I was committing to! Like many a blind date, where you hold out hope for Mr. Right, but open the door to a guy wearing too tight pants and smelling slightly of dirty socks, I found myself facing a situation fraught with mind-numbing discourse and disappointment. I had tried to do everything right - I had backed up all my files, I had all the software ready to load, I had all the product keys in hand - I was feeling good, maybe even a little cocky! Then I opened the DVD drive, and just like opening the door for that blind date, it was all downhill from there.

Time stood still - only it didn't, and I lost a day and a half of productivity! That's a lot for anyone. The Strategy Group conducted a study a couple years ago in which more than 32% of respondents (representing companies with 100 or more employees) stated they had zero tolerance for network downtime. They estimated the average cost incurred when something went wrong with the network was $3 million per day, with 10% of the group estimating it would likely cost them more than $10 million in damages and lost revenue per day. Infonetics Research estimated that large businesses lose an average of 3.6 percent of annual revenue to network downtime each year.

On my own small scale, I could relate - I felt the pain. If Windows 7 buys me an extra 10 minutes a day of productivity, due to its ease of use, I am still going to need 72 business days to get that time back! So what did I do wrong?

I consider myself a reasonably intelligent person. I am fairly technically conversant - I have even passed a few IT/networking certification courses. I can follow instructions and have basic common sense. (I feel a need to include these last attributes to ease the minds of the support folks who asked me questions like "are you sure it's turned on?" or "are any of the lights blinking?") So, why couldn't I get my computer, applications and network up and running in a reasonable amount of time?

I am not trying to shift blame, but I don't think it is me. And I don't think it's specific to any one particular OS. I think it is the overarching complexity associated with all the software and hardware that we increasingly rely on to run our lives, businesses and governments. Think of all the different vendors that make up our extended technology ecosystem - oh, and don't forget the open source folks. Then think of all the different products each one offers and all the different versions of each of those products that exist out there. One change to any of those things is enough to throw everything else out of whack. It's enough to make your head spin and start some serious finger pointing.

<blockquote>Specifically, I heard, "sorry, it's not the hardware, that's a software issue," "those applications are compatible, but not those versions," "yes, we sold you that package and it did include that application, but we can't do anything (unless you want to pay us $$$), so you will have to talk to the individual application vendor to get a specific solution..."</blockquote>

Each individual application or service is working on being "simple to use," but when you put them all together they don't always play nice. Anyone in IT will tell you that while everything is "interoperable," it doesn't mean it's going to work together, at least not right away. Which explains why 70% of IT's time is spent on simply keeping things going: keeping up with the changes that occur during the regular course of business, along with necessary patches and security upgrades, to make sure everything is working. There has to be an easier way!

Is it a pie in the sky dream to wish that vendors would come together and truly provide solutions with a simple evolution path that makes it easy for anyone, including me, to upgrade my system? Are there simply too many vendors? Or is it that things are changing too quickly? Will it be something else entirely that will bring us simplicity? Should we be focused on using hosted or managed services in the cloud to take much of the complexity out of the hands of end users? What are your thoughts? I would love to know.

I have faith that simplicity is on the horizon because it has to be... It's the only way we will get what we need from our technological resources to sustain innovation, efficiencies and meaningful change on a worldwide scale. It has to be simple for everyone, so everyone can use the resources and take part. The alternatives, like Mr. Wrong, are just not palatable.

9:53 am pdt          Comments

Obama's Year in Technology

As President Obama prepares to deliver his State of the Union speech after a year in office, I thought it would be a good time to look back on the Administration's technology agenda. As I mention in my <a href="">book</a>, Presidential Candidate Obama was really the first to leverage technology in a meaningful way during his campaign, giving us glimpses into how the political process can be engaged and enabled by a savvy social media and online strategy.  So, when the Obama Administration took office, it was natural to assume that it would be bringing the White House into the Digital Age. 

After all, Obama was a candidate who got it - he understood that the foundation for improving the prospects of our children and strengthening our long term economic prosperity lay in our access to and use of technology. As he said in a <a href="">campaign speech</a>:

"Let us be the generation that reshapes our economy to compete in the digital age. Let's set high standards for our schools and give them the resources they need to succeed. Let's recruit a new army of teachers, and give them better pay and more support in exchange for more accountability. Let's make college more affordable, and let's invest in scientific research, and let's lay down broadband lines through the heart of inner cities and rural towns all across America."

However, we saw glimmers of how difficult a transition into the Digital Age could be. Right off the bat, there were discussions around whether a <a href=";txt">U.S. President could use a BlackBerry</a> to stay in touch. This singular issue was a clear indicator of how far behind the White House actually was in its use of technology (and how vulnerable our mobile devices and digital infrastructure are).

I think the extent of the task was captured in a <a href="">Washington Post</a> article that described what it was like for the Obama Administration when they moved into their offices in the White House - can you imagine walking into your office and having to try to connect your landline? So, considering the starting point, I think the Administration can feel confident it has made significant progress.

There have been some monumental firsts, such as the first U.S. Chief Technology Officer (CTO) - Aneesh Chopra - and the first U.S. Chief Information Officer (CIO) - Vivek Kundra. There was the <a href="">First Presidential Online Chat</a> and the first foray into greater transparency with a <a href="">U.S. Federal IT dashboard</a>, which started to provide visibility into where the money in the government's budget goes. (Note: this dashboard was launched in just six weeks, showing that even big government can get things done, particularly when using technology well!) Government agencies started using social media sites, <a href="">such as Twitter</a>, to help people stay up to date on events and emergency situations.

There have been investments designed to extend broadband access to more people and places. <a href="">A total of $7.2 billion pledged through the Recovery Act broadband program </a>will enable more people to connect to the resources and information of the network to improve their opportunities and participate in the global economy.

But there have also been some snafus. For instance, we have seen how hard it is to walk the line of security and transparency.  Remember the <a href="">TSA Security Breach </a>that posted all the airport screening procedures, otherwise known as a good "how to" manual for terrorists?

And there have been some downright scares that remind us of the vulnerabilities of our networks. A <a href="">denial of service attack</a> took down the U.S. government's Department of Homeland Security, Federal Trade Commission, and Treasury Department web sites; and, of course, there is the <a href="">recent hacker activity on Google</a> and other prominent companies. These incidents serve as a reminder that the Administration needs to balance preserving individual rights in the digital world with increasing the overall security of the connections. We have seen U.S. Secretary of State Hillary Clinton speak out against online censorship and can assume the just-appointed Cybersecurity Coordinator, Howard Schmidt, will be leading the Administration's stand on cybersecurity.

It's important to remember that some of the activities the Administration has tackled this year are purely housekeeping, laying the fundamental groundwork that will help the government move forward more effectively in the future. For instance, there are the mundane but very important projects of ensuring <a href="">White House e-mails</a> are appropriately catalogued, archived and backed up. (The goal is also to ensure there is an auditable record of all e-mail activity, with measures in place so that only authorized individuals can access the database and alerts are raised when someone tries to delete anything.) Or <a href="">developing a plan</a> that will help standardize and provide a common information technology infrastructure for government, which can reduce costs and ensure greater consistency, visibility and security long term.

But it has been encouraging to see the government innovate and try new things, such as moving into the <a href="">cloud</a>.  If the <a href="">lumbering Census process </a>can benefit from the efficiencies of the Cloud, chances are there are many other applications and benefits. 

The use of all these technologies can foster opportunities, innovation, and long-term economic viability; it can pave the way for more effective service delivery and greater transparency to increase the dialogue and strengthen the relationships citizens have with their government. I think the Administration, while it has a long way to go, is definitely on the right track when it comes to technology.

9:53 am pdt          Comments

Google's Fall Out With China - Making a Stand for Free Speech

"A man who has committed a mistake and doesn't correct it, is committing another mistake." Confucius, Chinese teacher, philosopher and political theorist, 551-479 BC

Time and time again, China has tested the digital world, trying to stifle its free information flow and control the resources that are open to its people. There is a long list of methods China has employed to clamp down on access. It has used a variety of technological tricks, some of which we know about and many of which we never will, and some good old-fashioned coercion measures (from fines to imprisonment) designed to pressure content owners to keep content in line with what the government deems acceptable. For example, in 2008, the year the summer Olympics took place in China, it was discovered that China had been monitoring Skype communications, and a handful of bloggers whose commentary was unfavorable to China during the Olympics were detained. (Probably not so coincidentally, they were released and their blog postings removed only a little while later.)

China employs thousands of government workers in these efforts and, to date, has been fairly successful in achieving the results it desires. It seems that, when faced with the potential enticements of the Chinese market, businesses have found themselves in some precarious positions and made some, in my opinion, dubious calls in efforts to comply with Chinese requirements.

For instance, when Google opened up shop in China, it agreed to censor some of its search results. Yahoo was questioned by Congress, in 2007, for turning over e-mails that led to the imprisonment of Chinese dissidents. In 2008, <a href="">Cisco Systems was also questioned by Congress</a> after it was suggested, due to a Cisco sales presentation that surfaced, that the company was potentially helping the Chinese government modify its networking equipment to block and censor Internet traffic (it should be noted this was an accusation Cisco vehemently denied). YouTube has found its service shut down several times, presumably to avoid any glimpses of content that China deemed unacceptable. (Probably not surprisingly, the last shutdown lasted through the anniversary of the Tiananmen Square massacre, along with the blocking of Twitter.)

In 2009, the Chinese government issued a directive that would have required the installation of filtering software, nicknamed <a href="">"Green Dam,"</a> on every personal computer (PC) sold in the Chinese market. Almost comically, the government proposed this requirement under the auspices of protecting children from harmful Internet content. It was sharply criticized by governments around the world on a variety of fronts, from free-speech impingement to potential security compromises to free-trade violations, because, if loaded onto every PC, it would give the Chinese government unprecedented control over an individual's personal computing use. While <a href="">China backed off its deadline</a> (July 1, 2009) for implementation, in the face of pressure from Chinese computer users, computer manufacturers, and governments, it's evident the government has not rethought its overall objective - to control its citizens' online access.

But it seems the proverbial straw that broke the camel's back occurred last week for Google. According to the <a href="">blog</a> of Google SVP, Corporate Development and Chief Legal Officer, David Drummond, they had identified a "highly sophisticated and targeted attack on our corporate infrastructure originating from China that resulted in the theft of intellectual property from Google." After further investigation, they found it was part of a wider attack designed to access the Gmail accounts of Chinese human rights activists. (A good dissection of the attacks can be found <a href="">here</a>.) They have since "discovered that the accounts of dozens of US-, China- and Europe-based Gmail users who are advocates of human rights in China appear to have been routinely accessed by third parties," which goes to the heart of a much bigger global debate about freedom of speech.

As of right now, there is no international standard, nor universal agreement, on what is acceptable or not in terms of free speech in the digital world; we are all treading in uncharted waters. There's the United Nations Universal Declaration of Human Rights, which was drafted in 1948 and provides a basic framework, but little practical guidance in this Digital Information Age. And declarations such as <a href="">The Global Network Initiative (GNI)</a>, while noble in intent, have provided very few specifics and virtually no repercussions for abuses.

But the threat to freedom of speech in the digital world is very real. As I have mentioned in <a href="">previous blogs</a>, questionable restrictions on the network can lead to potential fettering of its possibilities and major encroachments on individual personal freedoms. It's a very slippery slope.

So, I want to applaud Google for making a stand and drawing a line. They announced, "We have decided we are no longer willing to continue censoring our results on, and so over the next few weeks we will be discussing with the Chinese government the basis on which we could operate an unfiltered search engine within the law, if at all. We recognize that this may well mean having to shut down, and potentially our offices in China."

We still have to see what will come of this proclamation, but the fact that they have said they are willing to walk away represents a clear departure from trying to conduct business as usual. The Obama administration has since issued a statement of support for Google and reiterated Internet freedom as a priority. So, while we may not see a huge sea change right away, this represents a step in the right direction and has reignited a much-needed debate around personal freedoms. It sends a message that it is not okay to simply work within the confines of China's increasingly restrictive rules, and hopefully it will improve the willingness of China and other governments to work with foreign companies and governments on these issues.

Everyone should be able to participate and be heard; the right of free speech is an ideal we need to fight for in the digital world, and it starts with everyone having the right to freely connect to the unfettered information of the network. This latest attack should serve as a wake up call for companies, policy makers and governments around the world to be more bold and work to protect and improve the rights and opportunities of citizens everywhere.

9:52 am pdt          Comments

2018.01.01 | 2013.04.01 | 2010.04.01 | 2009.12.01 | 2009.10.01 | 2009.09.01 | 2009.07.01



Sorensen Consulting | Phone: (650) 305-0598
