Monthly Archives: October 2015

Optimism Bias: How Your Half Full Glass Leaves You Vulnerable

[Photo: a woman holding a large happy face sign in front of her face.]

There is a cognitive phenomenon called the optimism bias that leads humans to think that the likelihood of a negative event is lower than it really is. This is great when we are battling the day-to-day stresses of our lives but not so good when trying to plan for unexpected risks. In this blog post I will explore how the optimism bias can affect risk management.

Tigger or Eeyore

In a recent article for the Nonprofit Risk Management Center, Erin Gloeckner describes different personality types as Tiggers, people who are always positive and bouncy, and Eeyores, those who are always down and negative. In reality, most of us fall somewhere in between but tend to have an optimism bias. As I have mentioned in previous posts, I tend to be an uber-Tigger, and that can get me into trouble when determining the likelihood of failure or disaster. I was once asked to develop potential disaster scenarios for a project so that I could mitigate any risks associated with those scenarios. Try as I might, I could not come up with any realistic failure scenarios. Recognizing my own bias toward optimism, I asked for help from a project member I knew had a negative bias. That person was able to develop many different disaster scenarios, and we created risk mitigation plans to counter each of them. True to form, none of those scenarios ever happened, but we were prepared nonetheless.

Business Planning

When managing information, it's important to have a realistic sense of your security risks. Start by preparing honest answers to these questions:

  • What are the chances of a security breach that leads to leaked confidential information?
  • What are the chances of a natural disaster that affects the operations of my organization?
  • What are the chances that I will lose a key person in my organization, at least temporarily?

It is important to have plans in place to counter the various threats that can arise in the course of doing business. Storms don't stay away forever, and key people don't stay in one position their whole lives. We can lessen the impact of these events by planning for them.
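One way to keep an optimism bias from skewing these answers is to put rough numbers on them. Below is a minimal sketch of a risk register in Python, assuming a simple likelihood-times-impact scoring model; the events, probabilities, and dollar figures are hypothetical placeholders, not real estimates.

```python
# Minimal risk register: score each threat by likelihood x impact,
# then rank the results to decide which mitigation plans to write first.
# All events and numbers below are hypothetical placeholders.

risks = [
    # (event, estimated annual likelihood, estimated impact in dollars)
    ("Security breach leaks confidential data", 0.05, 500_000),
    ("Natural disaster disrupts operations",    0.02, 750_000),
    ("Key person leaves, at least temporarily", 0.20, 100_000),
]

# Expected annual loss = likelihood * impact.
scored = sorted(
    ((event, p * impact) for event, p, impact in risks),
    key=lambda pair: pair[1],
    reverse=True,
)

for event, expected_loss in scored:
    print(f"{event}: expected annual loss ${expected_loss:,.0f}")
```

Ranking threats by expected loss is one simple counter to the optimism bias: the arithmetic does not care how bouncy you feel.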

Personal Planning

I have talked about this in past blog posts but I think it is also important to evaluate potential risks in our personal lives. Ask yourself:

  • What are the chances that I could lose my current job?
  • What are the chances that I could suffer health problems?
  • What are the chances that a natural disaster could affect me or my family?

While it is not good to dwell on these scenarios to the point of distraction, knowing that you have planned to mitigate these risks will give you peace of mind. Mitigation strategies might include keeping your skills and education up to date, exercising to fend off avoidable health problems, and setting aside money to cushion unforeseen financial setbacks. Just as you plan for business disruptions, you can also plan for personal issues. These plans can help you sleep at night and be a Tigger all day.

Thoughts

If you are interested in learning more about the optimism bias, there is an excellent 2012 TED talk by Tali Sharot that covers the topic. Whether you tend to be an Eeyore or a Tigger, it is important to recognize your biases as you make plans for your business and your life. Do you already know your personal biases? Let me know your thoughts.

About Kelly Brown

Kelly Brown is an IT professional and assistant professor of practice for the UO Applied Information Management Master’s Degree Program. He writes about IT and business topics that keep him up at night.

Will Computer Science Displace Classic Education?

[Photo: four elementary school children typing at desktop computers.]

I believe that technology is now a routine part of our lives and I have been thinking lately about how much effort we should spend educating young students about computers. I read an article that highlighted a push to make computer science mandatory in German schools. My question is, has technology become so commonplace that we treat it like running water and electricity, or can it still provide a competitive advantage for a community or a nation?

Keeping Up on Technology

One concern of German lawmakers, shared by officials in other countries, is that their students will fall behind and be unable to fill future technology jobs. According to the head of the German digital industry group Bitkom:

“IT skills are now as important as the basics. Digitisation determines our everyday lives more and more, for leisure time as well as for work. Schools must teach about media literacy beyond the classroom and give students a firm grasp of IT technologies.”

Suddenly, the tech kids are the cool ones in school. This follows the recent emphasis in schools on science, technology, engineering, and math (STEM). The theory is that, partly because of the proliferation of technology, the best and most advanced jobs will go to those trained in these fields.

Code.org

In a blog post last year I highlighted Code.org, an organization that believes "every student in every school should have the opportunity to learn computer science." They are working to increase access to computer science curriculum, particularly for women and students of color. Just as the lawmakers in Germany are advocating, Code.org believes that computer science should be part of the core curriculum in schools, alongside biology, algebra, and chemistry. While I agree that computer science is important as part of a STEM curriculum, I wonder which classes we should drop to make room for it.

Curriculum Replacement

A recent PBS article highlighted a similar push to introduce coding courses in schools in Australia. Computer science curriculum, according to the article, will replace geography and history courses. I am sure that the change will generate a lot of debate around the virtues of a classic education versus a more modern education. It leaves the door open for ongoing conversations around curriculum mix and what students actually need to succeed in the future.

Thoughts

To circle back to my original question: is it necessary to add specific computer science curriculum to schools? Or has technology become so pervasive that everyone knows how to use it, but only a few need to be able to create new and unique applications? In the same vein, should we introduce mandatory physics courses as well, to better understand the underlying hardware? Finally, which courses would you replace? As you look back on your education and career, which classes have shaped you the most, and why? Let me know your thoughts.

About Kelly Brown

Kelly Brown is an IT professional and assistant professor of practice for the UO Applied Information Management Master’s Degree Program. He writes about IT and business topics that keep him up at night.

Trends in Higher Education

[Stock photo: a hand about to click Go when searching with the term University.]

The Boston Consulting Group recently published an article that highlighted emerging trends in higher education. I want to examine the convergence of several of those trends and how I think technology will play a part in shaping that future.

Funding

State colleges and universities have long relied on government subsidies to keep tuition at a manageable rate and fund all of the research and activities associated with the school. In recent years the amount of funding coming from the states has dropped as they struggle to balance their own budgets. The shortfall is made up through increased tuition and grants as well as targeted campaigns aimed at private and corporate donors. Increased tuition is problematic due to the large debt graduates are accumulating. A recent article in U.S. News & World Report detailed how some graduates are carrying student loan debt into their forties, which means they cannot help their children start academic careers. The result is that the children are assuming their own debt, which continues the cycle. Generating alternative funding sources or containing operational costs could help break that cycle.

Competition

There are more education options available to students. Schools across the country, and even some international schools, are offering attractive incentives to reel in young scholars who might otherwise attend their state university. There’s also been a spike in online curriculum and for-profit schools. In this competitive environment universities must target the right prospective students and then lure them in. With the drop in state funding mentioned above, many universities are pursuing more international students, who pay a higher tuition. All of this requires a targeted, intelligent marketing campaign.

Increased Research

Partnerships with private industry are helping universities increase their research efforts. These partners provide funds for sophisticated research, the results of which can be licensed back to the partner or sold outright. Top-notch students and faculty are drawn to such projects, industry gains new business ideas and opportunities, and students and potential employers are brought together.

Thoughts

Colleges and universities are facing pressure from increased competition, uncertain funding, and the push to accelerate and capitalize on research. Here are ways that I think technology can help alleviate that pressure:

  • Social Media. Universities are increasing their use of social media to reach a tech-savvy generation from around the globe. Advances in web and media technologies, as well as analytics, help schools target the right audiences and markets.
  • Big Data and Business Analytics. The ability to quickly analyze large amounts of prospective student data helps colleges narrow their search for potential students. By identifying and targeting particular demographics, schools can reduce marketing costs and increase the efficiency of their search campaigns (see the sketch after this list).
  • Collaboration Software. Research partnerships are no longer just with the company down the street. Partners can be thousands of miles away, so it is important that schools and private enterprises can communicate, catalog, and analyze research results in a systematic and predictable way. Collaboration applications can help keep researchers informed and successful.
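As a small illustration of the big data point above, here is a Python sketch that narrows a pool of prospective students to a target demographic. The file name, column names, and criteria are hypothetical placeholders invented for this example, not a real admissions dataset.

```python
# Hypothetical sketch: narrow a pool of prospective students to a
# target demographic before running a marketing campaign.
# File name, columns, and thresholds are invented for illustration.
import pandas as pd

prospects = pd.read_csv("prospective_students.csv")

# Example target: international applicants interested in STEM majors
# who have engaged with the school's social media channels.
target = prospects[
    (prospects["country"] != "US")
    & (prospects["intended_major"].isin(["CS", "Engineering", "Math"]))
    & (prospects["social_media_engagement"] >= 0.5)
]

print(f"Campaign audience: {len(target)} of {len(prospects)} prospects")
```

Even a filter this simple shows the appeal: a campaign aimed at a few thousand well-matched prospects costs far less than one broadcast to everyone.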

While colleges and universities are facing funding and competition pressures, there are technologies that can help lessen those concerns and lead to new knowledge and discoveries. I hope this post spurs your thoughts on other ways that technology can help, or is already helping, higher education.

About Kelly Brown

Kelly Brown is an IT professional and assistant professor of practice for the UO Applied Information Management Master’s Degree Program. He writes about IT and business topics that keep him up at night.

The Double-Edged Sword of Information Availability

[Photo: a man using a smartphone in front of a computer.]

I recently came across the Harvard Genome Project. For the project, a team of Harvard researchers is collecting personal genome information to share with researchers who hope to create breakthroughs in disease eradication and prevention. It struck me that with our ability to share information and make it available to different groups, either intentionally or unintentionally, we have created a double-edged sword. On the one hand, technology has greatly expanded research opportunities and created the infrastructure to track down long-lost relatives. On the other hand, our privacy may be jeopardized if that research information falls into the wrong hands or if a long-lost relative prefers to stay lost. Is the genie out of the bottle, or are we still in control of the exabytes of information in the cloud, some of it personal?

Research for a Brighter Tomorrow

The Internet that we know today was born as the ARPANET, under contract to the United States Advanced Research Projects Agency. Its original intent was to connect research facilities to share information. In December 1969, the Stanford Research Institute, the University of California Santa Barbara, the University of California Los Angeles, and the University of Utah were connected to collaborate and advance research. By 1971, several other prominent universities, private research firms, and government agencies had joined ARPANET, extending its geographical reach well beyond the western U.S. The original Internet was intended to further scientific research, not to share cat videos. In that vein, the Harvard project exemplifies the positive aspects of information sharing.

Technology and Democracy

Before we were all connected by technology, there was radio and television, which are "one-to-many" media. One broadcast, such as the nightly news or a presidential fireside chat, went out to those who chose to listen or watch. There was no way to give feedback or to refute what might be misinformation. Now people around the world can share real-time information on developing stories; we no longer have to wait until the five o'clock news or place complete trust in the newscaster.

We can also take on the role of broadcaster. We can participate more deeply in the democratic process by speaking out on issues of the day and joining with others to have an impact on legislation that affects our lives. Whether we live in the safety of the U.S. or in a war-ravaged country, we have a voice, and it can be heard, thanks to technology.

The downside is the ability to spread misinformation. It is important that we choose carefully the news sources we trust. The Onion has made a sport of parodying trending news, but its articles are sometimes cited as fact. It is up to each one of us to distinguish truth from fiction.

The Privacy Issue

I wrote a blog post in July highlighting the breach of private information submitted to the website Ashley Madison. Users expected their personal information to remain private, but hackers who broke into the site published it. This is where I wonder whether the genie is out of the bottle and whether any information we choose to share, be it our genome data, private photos, our current location, or politically sensitive information, should be considered potentially public. Would we conduct ourselves online differently if we expected our information to go public? Would we be more careful?

Thoughts

Technology advances have allowed us to share research, information, product reviews, and political news, and even to find each other. I believe, though, that with this new power and connectivity comes a responsibility that we sometimes take lightly. We need to approach this new world with eyes wide open. Let me know your thoughts.

About Kelly Brown

Kelly Brown is an IT professional and assistant professor of practice for the UO Applied Information Management Master’s Degree Program. He writes about IT and business topics that keep him up at night.