Tag Archives: IT

The Changing Role of the CIO

In our Information Systems and Management course we talk about the future of the office of chief information officer (CIO). Some argue that the title and office will disappear completely within the next five years. Others argue that as IT becomes embedded in every function of an organization, the CIO will become its most important person. Still others argue that the CIO and CEO roles will merge. This question has been a moving target for years, but we can look at history and trends to gauge where the role is headed.

History

In the early days of computing, one of the primary functions was to process and analyze financial data. Therefore, it was logical that the head technology person report to the controller or chief financial officer. Unfortunately, this arrangement continued long after the technology function diversified into almost all areas of the organization. The CIO/CFO relationship has surged again in recent years, with a Gartner survey last month reporting that 39 percent of surveyed IT organizations again report up through the CFO.

Trends

Technology has become pervasive throughout modern organizations, and the IT function has gained visibility and responsibility in all corners of the operation. The CIO is challenged to work not only with line-of-business executives but also with the chief financial officer, the chief executive officer, and even the chief marketing officer. In a recent interview, the current CIO of Clorox and the former CIO of Pabst talked about their relationships with other organizations and how outside organizations are driving much of the IT spend and project mix.

As IT hardware and software become increasingly user friendly and data is pushed more and more to the cloud, IT organizations will have to reinvent themselves. They don’t “own” as many things, but their influence is broader than it has ever been. The need to rise above the technology and help create business solutions is more critical now than ever before. Is your IT organization mature enough to fill that role in the future?

Thoughts

Last month AlixPartners’ blog post reiterated research first presented in the 2011 Harvard Business Review (HBR) blog on the future of the CIO.

In the HBR blog, the author contends that the CIO role will be split into four unequal parts:

  • chief integration officer
  • chief innovation officer
  • chief infrastructure officer
  • chief intelligence officer

For my money, I believe the role will bifurcate into just two parts: operations (chief operations officer?) and technology engineering (R&D?). The bulk of the resources will go toward operations, but that percentage will shift as we move more toward distributed cloud computing and software as a service. The CIO (or whatever the new title is) and the IT organization will become even more valuable as they work on system and business function integration at a higher level.

What do you think the future holds for your CIO? Will there even be a designated role for this, or will it be dispersed among other titles? Will the role be more important or less important in the organization? Let me know your thoughts.

 

About Kelly Brown

Kelly Brown is an IT professional, adjunct faculty for the University of Oregon, and academic director of the UO Applied Information Management Master’s Degree Program. He writes about IT topics that keep him up at night.

The Consumerization of IT

In a recent post on the Trend Micro blog, Cesare Garlati likens the IT consumerization trend to an iceberg. The visible evidence of personal devices being brought to work (e.g., tablets and smartphones) is only 10 percent of the problem. The other 90 percent lies under the surface and represents the hidden problems of company data leaving the company and potential viruses coming into the environment. The lines between consumer devices and work devices have blurred significantly over the last ten years, but as IT professionals we often have not kept up with the problem of security. That security extends to our infrastructure and our networks.

History

In the early days of computing, there were no personal computers, except for maybe the do-it-yourself Heathkit. Once personal computers came into fashion, there was minimal networking available, so the PC was a stand-alone device that transferred data back and forth on disks. As networking matured, we worked our way through dial-up modems, LAN cables, and finally wireless networks, which are fast becoming ubiquitous. The differences between a consumer device and a work device are quickly disappearing. Is your organization ready for this new reality?

Devices

As mentioned above, devices have become smaller and much more sophisticated over the last thirty to forty years, with the pace accelerating in the last ten. Often, employees are asked to carry a device for work so that they can check on work status or keep in contact with customers and vendors. Increasingly, these are handheld devices, often a smartphone. Where is the line between a company device and a personal device? Applications increasingly have web interfaces, so why can’t a person use their personal smartphone to access customer data and then download the latest version of Angry Birds? In the future, as devices continue to shrink, even an astute IT worker won’t be able to tell when a consumer device comes in the door.

Networks

Networks today are becoming ubiquitous and increasingly user friendly. With the advent of 4G networks and widespread wi-fi, many of us are connected 24/7, no matter where we go. According to a recent article, a partnership between Google and Raven Industries is set to launch helium balloons equipped with network equipment to provide connectivity to rural areas in the US and, particularly, in developing countries. The combination of smaller consumer devices and ubiquitous Internet connectivity is destroying the old command-and-control mentality of IT departments. No longer do they have the luxury of denying access to a particular device or class of devices. The prudent IT group will work to mitigate the risks of unsecured devices and to educate employees.

Thoughts

Some organizations are now giving employees a stipend to purchase their own computer. This of course makes it harder to maintain patch images for every make and model under the sun, but, if executed correctly, IT still has a say in the security components that are installed.

How does your organization handle consumer devices in the work place? Do you embrace them, tolerate them, or fear them? Let me know your thoughts.

 

About Kelly Brown

Kelly Brown is an IT professional, adjunct faculty for the University of Oregon, and academic director of the UO Applied Information Management Master’s Degree Program. He writes about IT topics that keep him up at night.

The Marriage of Art, Copy, and Code

In our current Information Design and Communication course we are talking about infographics and how they convey information differently than pure print or pure graphics. They take the best of both worlds and hopefully reach a mixed audience of people who are visually oriented or linear-sequential (left to right, top to bottom). I have been thinking lately about how infographics can become animated or even interactive. This is already starting to happen in terms of self-directed information graphics. I have also been thinking about how this will creep into advertising and how we can create more personalized advertising. I recently viewed a video at redsharknews.com that gave me a glimpse into the future: the marriage of art, copy, and code.

Art

It used to be that art was very static and very tangible. Whether it was a fine painting or a sculpture, it was permanent and meant to be viewed by many people many times. Art is now becoming more digital and more dynamic. With increasing screen resolution, images can be more vibrant than those on a static canvas. Digital can also mean temporary, whether by design or by accident (a forgotten backup). This new medium is increasingly being used in print and dynamic advertising and is very effective in communicating the message.

Copy

Someone still has to write copy for all of the advertising. In the age of social media, people are looking for concise information and advertising that breaks through the chatter and informs. Consumers are becoming more sophisticated and in many cases, more jaded. It does and will take a very talented copywriter to craft the script for future advertising. The same advertisement may be seen on a television, a computer, a handheld device, or other devices. How do you craft a story for all of those potential viewers, or do they each get their own custom version?

Code

Here is where it gets interesting. Because of the dynamic nature of art and copy, and a newly sophisticated audience, it takes a skilled software person to knit it all together and make it personal, relevant, and timely. As in the example I shared above, the ad needs to be about you, where you live, what interests you have, and what possible connection you might have to the advertised product. It’s about me, here, and now.

Thoughts

In the future, will the same person possess all of these skills or will it continue to be a team effort? Is it possible to have art skills, copy skills, and coding skills in one package? Are we training upcoming professionals in all of these areas or at least to be aware of the other professionals that they will be working with? It will take some skillful teamwork to pull this off but, with the right collaboration, it can be real magic.

 

About Kelly Brown

Kelly Brown is an IT professional, adjunct faculty for the University of Oregon, and academic director of the UO Applied Information Management Master’s Degree Program. He writes about IT topics that keep him up at night.

The Dark Side of IT

There has been a lot in the news lately about spying and the technologies used to aid it. Because of a leak by a contractor, it has been revealed that the US National Security Agency (NSA) has used a number of different technologies, including e-mail and phone surveillance, to spy on enemies of the state as well as ordinary citizens identified as potential terrorists.

Technologies

In a recent New York Times post, author Vikas Bajaj suggests that “consumers have traded convenience for privacy”. We already have the technology to track an individual’s Internet activity, including e-mail archives and digital phone records, even conversations. With the advent of digital consumer technology, storing 1s and 0s is easy and increasingly affordable thanks to efficient data storage. The tools around big data make it easier to sort through it all and pinpoint a particular thread. It is easy to capture, easy to store, and easy to sort. As Internet consumers, is there more we should know about these tools in order to be informed about our privacy and dealings?

Responsibility

When it comes to digital surveillance, what is our responsibility as consumers? What is our responsibility as IT practitioners? As a consumer of all things digital, I think it is our responsibility to understand the extent to which our presence is being tracked and to understand that our activity on the Internet is not as private as we think. Think before you share all of your deepest, darkest secrets on Facebook. The old adage applies: “never do anything you wouldn’t want your mom to read about in the morning paper.” As IT practitioners, we may be called upon to gather data or turn over records to comply with a subpoena or court order. It is our responsibility to understand to what extent our customers and employees are protected in terms of privacy. Do you understand your company’s privacy policies? Are your customers and their records adequately protected?

Solutions

The first solution is mentioned above: be a smart consumer. Understand your presence on the Internet. Understand which sites provide a basic level of security and how your information moves about the Internet. The second is to understand and employ encryption techniques. This is especially important when handling customers’ personally identifiable information (PII). Make sure that this data is encrypted within your systems and while traveling across the network. Keep your own personal information secure and encrypted as well. Also, as an IT professional and a citizen of the cloud, you need to understand some of the techniques for protecting data, such as private networks and private cloud computing.
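As a minimal sketch of what “encrypted within your systems” can look like, the snippet below uses the Python cryptography package’s Fernet recipe (symmetric, authenticated encryption) to protect a PII field before it is written to storage. The field name and the key handling are illustrative assumptions; in practice the key would live in a key management system, never alongside the data.

```python
# Minimal sketch of encrypting a PII field at rest, assuming the third-party
# "cryptography" package is installed (pip install cryptography).
from cryptography.fernet import Fernet

# Illustrative only: in production the key comes from a key management system
# or hardware module, not from the same place as the data it protects.
key = Fernet.generate_key()
cipher = Fernet(key)

ssn = "123-45-6789"  # hypothetical PII field
token = cipher.encrypt(ssn.encode("utf-8"))  # store this ciphertext, not the raw value

# Later, an authorized system decrypts it with the same key.
assert cipher.decrypt(token).decode("utf-8") == ssn
```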

Thoughts

Be aware before you share. Of course, all of the technology in the world is not going to stop your information from being extracted via a court order and, hopefully, you are never in that situation. For us upstanding citizens, it is imperative that we know how we are protected and how private and confidential our conversations and data really are or are not.

Do you stop to think about your privacy? Let me know your thoughts.

 

About Kelly Brown

Kelly Brown is an IT professional, adjunct faculty for the University of Oregon, and academic director of the UO Applied Information Management Master’s Degree Program. He writes about IT topics that keep him up at night.

Bridging the Technology Gender Gap

There has been a big emphasis over the last few years on STEM (science, technology, engineering, and math). Education initiatives are pushing STEM in elementary, middle, and high schools. With this emphasis we should see more young men and women entering college degree programs and careers in these fields. Will this push help to reverse the decline of women entering the information technology field? Time will tell, but I have a few ideas for narrowing this gap.

Statistics

According to a recent study by the National Center for Women &amp; Information Technology, 57 percent of professional jobs in the US are held by women, but women hold only 26 percent of professional computing occupations. According to the same survey, only 18 percent of computer science and information science undergraduate degree recipients were women. The trend for women in technology appears to be getting worse, not better.

Ideas

In a recent blog post, Jaleh Bisharat, vice president of marketing at oDesk, suggests three things that may invite more women into the technology and communications field:

  1. Make computer programming a requirement for graduating from high school.
  2. Aggressively combat the stereotypes of computer scientists.
  3. Expose the creativity involved in advanced math and science.

Her premise is that if we demystify information technology by exposing young people, male and female, to areas such as programming then they will begin to understand that tech jobs can be rewarding. The tech industry needs to shed its “nerdy” image in order to be considered a viable option for young women. As Ms. Bisharat points out, programming can be poetry and it is very much a creative field.

Thoughts

Here are some things I have been thinking about to attract more young women to STEM and keep them interested enough to pursue a degree and a career in technology or engineering:

  1. Bring more girls in contact with technology professionals, even as early as elementary school.
  2. Create better marketing by the technology industry to attract more young women to the industry.
  3. Make math hip by highlighting top-of-the-line applications!

If we are successful in introducing young people to technology and information professionals, they will understand that these are the people who help bring new devices and applications to life. In turn, the professionals can help reinforce the notion that math and science are cool and are not limited to one gender. Finally, we need to do a much better job of marketing the technology industry. We have the Beef Council; why not a technology council, complete with a tagline, a jingle, and a captivating app? Come and join us and help us invent the future! All of these efforts could help narrow the current gender gap in technology jobs and help us employ the talents of creative men AND women.

Do you have other ideas for attracting talented women into the technology field? What do you think are our biggest barriers? Let me know your thoughts.

 

About Kelly Brown

Kelly Brown is an IT professional, adjunct faculty for the University of Oregon, and academic director of the UO Applied Information Management Master’s Degree Program. He writes about IT topics that keep him up at night.

Looking Through the Google Glass: Trends in IT

I have been thinking lately about trends in IT and specifically about Google Glass. The prototypes are out now, with a full introduction expected in 2014. I think the introduction of Google Glass and other alternative computing platforms and applications points to a world of much smaller computing devices and the separation of the client from processing and data storage.

History

Throughout history, each new computer model has generally been smaller and more powerful, and has come with a friendlier, more intuitive interface. Compare today’s smartphone with the room-sized ENIAC computer of the mid-1940s. It is smaller, friendlier, and far more powerful. We went from a room-sized computer to a computer you can hold in your hand in a little over sixty years. How far can we take this paradigm? What does the future hold? How much smaller can we go?

Clients

With desktop computers, we have by and large mimicked the typewriter, which was commercialized in the 1870s. The typewriter, in turn, was just a portable printing press, which was developed in the 1400s. When we needed a portable version of the computer, we came up with the laptop, the tablet, and the smartphone, each with the same kind of display and often still a QWERTY keyboard. So, in effect, we are still modeling 600-year-old technology! With Glass and other similar technologies, I feel designers are finally trying to break that cycle. Glass is voice activated, and the heads-up display is integrated into the product itself. If you did need to create a document using Glass, I am not sure how you would do it (voice recognition?), but I am ever hopeful we can finally break our dependence on a keyboard design from the 1870s.

Processing and Storage

Because client devices are becoming smaller, they cannot maintain the level of on-board processing and storage that we enjoyed in earlier generations. This is where the cloud comes in. It is almost as if smaller clients and the cloud were made for each other. Remove the computing power from the client and move it to the cloud. Now all you need is a client that can stay connected to the cloud and find your setup and your storage. With mega data centers hosting everything in the cloud and increasingly reliable network connections, computing suddenly becomes much more efficient and clean. Small client, large storage and processing.

Trends For The Future

What does it take to break our dependence on the “product that came before”? How can we break out and truly reinvent how we communicate with each other and with our world? Can we find clues about our future in current and historical science fiction? We have been discussing a utopian “paper-free” world for at least thirty years. With new trends in IT, will we finally realize that utopia? Can we finally break free? These are things that I wrestle with and ponder as I envision the future of IT. How do you see the future unfolding? Are you hopeful or skeptical? For more on IT and internet trends, see the report from a recent All Things Digital conference.

 

About Kelly Brown

Kelly Brown is an IT professional, adjunct faculty for the University of Oregon, and academic director of the UO Applied Information Management Master’s Degree Program. He writes about IT topics that keep him up at night.

 

A Path to a Greener IT

Ever since the early days of the Hollerith tabulating machine, computing has relied on electricity. Computers in the 1940s and 1950s were based on vacuum tubes, which used a lot of electricity and gave off enormous amounts of heat, thus requiring even more electricity to cool them down. As transistors and integrated circuits came into use, the amount of electricity needed went down, but the amount of data and associated computing went up. IT continued to be a large power consumer.

History

As computing moved through the 1960s, 1970s, and 1980s, computers transformed from mainframes to minicomputers, to workstations, and finally to personal computers. We went from one computer in a room to several computers per rack. What we saved in size, we made up for in volume. We were able to consolidate the number of computing centers we built, but we increased the power and cooling needs of each computer room and data center. IT continued to be a decidedly “non-green” industry.

Current Trends

Through consolidation, we now build and operate mega data centers. According to the Data Center Journal, “Mega data centers sprawl over hundreds of thousands of square feet and can exceed 10 megawatts of power, with some approaching a million square feet or 100 megawatts” (http://www.datacenterjournal.com/dcj-magazine/the-rise-of-mega-data-centers/). These data centers provide computing and data storage for small and large companies as well as individuals through services such as Dropbox. Many of these data centers are placed in areas with cooler temperatures, thus reducing the cooling and power requirements. Many are also placed close to inexpensive, clean hydroelectric and wind power. An increasing number of companies are reducing their data center exposure in areas served by coal power, partly to save costs and partly to reduce their environmental footprint.

Future

The current trend is toward mobile computing and away from desktop computing. This moves our client computing away from large, fan-cooled systems and toward more efficient laptops, tablets, and smartphones. These battery-powered computers still require electricity but are much more efficient than their desktop counterparts.

On the other hand, because of the diminished storage capacity of mobile systems, they rely on cloud computing and mega data centers for their processing and storage needs. The key to a greener IT future lies in maximizing the efficiency of data centers. Computer manufacturers such as IBM and Hewlett-Packard are innovating ways to cool computers through increased airflow and even liquid cooling. Data center operators such as Google and Amazon are aggressively pursuing techniques such as virtualization to reduce their physical computing footprint while increasing the amount of data they can house and process. A gallery of Google data center technology (http://www.google.com/about/datacenters/gallery/) shows the physical infrastructure that Google maintains. Businesses are trying to save money and reduce their computing and environmental footprint by consolidating their computing needs into cloud computing solutions. Data center providers are trying to save money and reduce their environmental footprint by reducing their power consumption. Together, we can all move toward greener, more sustainable computing.
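To see why consolidation and virtualization matter, here is a back-of-the-envelope sketch. Every figure in it is an illustrative assumption, not a measurement from any real data center.

```python
# Back-of-the-envelope consolidation estimate. All figures below are
# illustrative assumptions, not measurements from a real facility.
physical_servers = 100       # lightly used machines before consolidation
watts_per_server = 400       # assumed average draw per physical server
vms_per_host = 10            # assumed consolidation ratio after virtualization
cooling_overhead = 0.5       # assume cooling adds roughly 50% on top of IT load

def facility_kw(servers: int) -> float:
    """Total facility power in kilowatts: IT load plus cooling overhead."""
    return servers * watts_per_server * (1 + cooling_overhead) / 1000

before_kw = facility_kw(physical_servers)
after_kw = facility_kw(physical_servers // vms_per_host)

print(f"Before consolidation: {before_kw:.0f} kW")
print(f"After consolidation:  {after_kw:.0f} kW")
print(f"Estimated reduction:  {1 - after_kw / before_kw:.0%}")
```

Under these assumptions the same workload drops from 60 kW to 6 kW of facility power; real ratios vary widely, but the direction of the savings is the point.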

What have you done lately to improve your computing impact on the environment?

 

About Kelly Brown

Kelly Brown is an IT professional, adjunct faculty for the University of Oregon, and academic director of the UO Applied Information Management Master’s Degree Program. He writes about IT topics that keep him up at night.

Sick of All This Data!

It appears that there is a gap between the information technology available within healthcare and the adoption of that technology. What is behind this gap? Are healthcare professionals simply too busy to take advantage of new technology, or are current healthcare privacy laws preventing us from using networked information tools to their fullest?

History

We have been applying technology to healthcare and disease prevention for centuries, but it is only in the last fifty years that we have applied technology to collecting and disseminating healthcare information. The pace of introduction and adoption is accelerating, and that is causing problems for healthcare professionals and healthcare IT professionals alike. On the one hand, the introduction of sophisticated healthcare record management applications brings welcome relief to an industry facing increasing privacy and record management regulations; on the other, it comes on top of an already full workload. How is a healthcare professional supposed to find the time to learn and master the new systems? What is the role of the healthcare IT professional? Are we doing all we can to simplify systems and interfaces in order to accelerate adoption?

Electronic Health Records

According to the Health Information and Management Systems Society, “The Electronic Health Record (EHR) is a longitudinal electronic record of patient health information generated by one or more encounters in any care delivery setting.” This includes information on past interactions with healthcare providers as well as current and past medication history. The aim is to make this information available through an electronic interface to any healthcare provider, whether a patient is seeing their primary provider or whether they become ill while vacationing in a foreign land. With great information, however, comes great responsibility, and thus legislation such as the Health Insurance Portability and Accountability Act of 1996 (HIPAA). This creates the tension of providing available medical records through a secure and responsible infrastructure to strained healthcare providers who don’t have additional bandwidth to learn new systems and interfaces.
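To make “a longitudinal electronic record generated by one or more encounters” concrete, here is a toy sketch of how such a record might be modeled in code. The field names are purely illustrative; they do not follow HL7, FHIR, or any vendor schema.

```python
# Toy model of a longitudinal patient record built from encounters.
# Field names are illustrative; they do not follow any real EHR standard.
from dataclasses import dataclass, field
from datetime import date

@dataclass
class Encounter:
    when: date
    provider: str
    notes: str
    medications: list

@dataclass
class PatientRecord:
    patient_id: str
    encounters: list = field(default_factory=list)

    def medication_history(self) -> list:
        """Flatten medications across every encounter, oldest first."""
        ordered = sorted(self.encounters, key=lambda e: e.when)
        return [m for e in ordered for m in e.medications]

record = PatientRecord("p-001")
record.encounters.append(Encounter(date(2012, 5, 1), "Dr. Smith", "Annual exam", ["lisinopril"]))
record.encounters.append(Encounter(date(2013, 2, 14), "Urgent care", "Flu symptoms", ["oseltamivir"]))
print(record.medication_history())  # ['lisinopril', 'oseltamivir']
```

The hard part, of course, is not modeling one record but getting every system that touches the patient to agree on a shared, secure representation, which is exactly the interoperability problem discussed next.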

Interoperability of Health Care Records

According to a new Health Affairs report, health IT will not achieve the predicted savings and efficiency until the technology is more widespread and readily adopted. Part of the adoption problem has to do with the interoperability of health records. Right now there is no single standard for sharing health information, and vendors have little incentive to create one. If we couple difficult-to-use technology with the fact that a provider cannot see the full patient history across various health interactions, it is no wonder that healthcare professionals are reluctant to jump on board and embrace this exciting yet uncertain future.

The question then becomes: what can we do to accelerate the adoption rate of new healthcare technology and systems in order to make record keeping and retrieval easier for everyone?

 

About Kelly Brown

Kelly Brown is an IT professional, adjunct faculty for the University of Oregon, and academic director of the UO Applied Information Management Master’s Degree Program. He writes about IT topics that keep him up at night.

Am I in Heaven Yet?

Cloud computing has been a buzzword for a number of years now, perhaps because it is such a nebulous, ethereal term (“cloud”?) that it has been used to describe a number of different configurations and scenarios. You are most likely using some sort of cloud computing already, but it is worth asking the hard questions to make sure you have the basics covered.

History

Cloud computing refers simply to the fact that your application or data is no longer on a computer that you can touch. It is hosted in a remote computer room in another city, another state, or another country. In the “cloud.” What brought about this change, and why haven’t we always done it this way? One of the big reasons is the rising abundance and speed of networking. It used to be that your computer or terminal was tied directly to the machine in the computer room. Through better networking technology, the machine in the computer room and the computer in your hands grew further and further apart, until it was no longer necessary to have a dedicated room in every building. Better network security schemes have also widened this geographic gap.

Is cloud computing all tea and roses or are there still some lingering concerns? Think about these issues when creating or expanding your cloud computing strategy:

Security

If you contract with a large service provider such as Google, Amazon, or IBM to host your application or data, your confidential information will be sitting in the same data center as another customer’s, or perhaps even your competitor’s. Is the “wall” around your data secure enough to keep your information confidential? When your information is traveling to and from the data center over the network, is it secure? Has it been encrypted for the trip? Do you trust all of your information to the cloud, or just the non-critical pieces?
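One way to answer “has it been encrypted for the trip?” is to confirm that every connection to the provider negotiates TLS with a verified certificate before any data leaves your network. The sketch below uses Python’s standard ssl module; the endpoint name is a hypothetical placeholder, not a real provider address.

```python
# Minimal sketch: verify that a (hypothetical) provider endpoint speaks TLS
# with a valid certificate before sending anything sensitive over the wire.
import socket
import ssl

HOST = "storage.example.com"  # placeholder; substitute your provider's endpoint

context = ssl.create_default_context()  # verifies certificate chain and hostname

with socket.create_connection((HOST, 443), timeout=10) as raw_sock:
    with context.wrap_socket(raw_sock, server_hostname=HOST) as tls_sock:
        print("Negotiated", tls_sock.version())               # e.g., TLSv1.2 or TLSv1.3
        print("Certificate subject:", tls_sock.getpeercert().get("subject"))
```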

Scale

Is your application and data usage large enough to warrant cloud computing? If you are a small company or non-profit agency, the setup for hosting your applications and data may swamp your entire IT budget. Some application service providers only cater to large customers with millions of transactions per month. If you don’t fall into that category then perhaps your IT person is just what you need. At the other end of the scale, some small companies or agencies use free services such as Dropbox or Google Docs. If this is the case, then check your assumptions about security.

Applications

Some applications such as customer relationship management (CRM) or simple e-mail or backups may be easily offloaded to another provider. Other applications may be complex or proprietary to the point where it makes more sense to keep them closer to the vest. They might still be a candidate in the future as you peel back the layers of legacy and move toward standard applications.

These are all questions to consider when formulating your cloud computing strategy. Offloading your computing to another provider can yield real cost savings, but without careful consideration it can become a complexity you did not bargain for. What keeps you up at night in terms of your cloud computing strategy?

 

About Kelly Brown

Kelly Brown is an IT professional, adjunct faculty for the University of Oregon, and academic director of the UO Applied Information Management Master’s Degree Program. He writes about IT topics that keep him up at night.

To BYOD or not to BYOD

Bring Your Own Device, or BYOD, is a hot topic these days, but what’s the big deal? It seems that everyone has their own smartphone/pocket computer. We learned to deal with the BlackBerry years ago. Why not blur the lines between consumer technology and business technology? Can’t we all just get along? While it may seem that your IT department is the very embodiment of Dilbert’s Mordac, the Preventer of Information Services, there is a very good reason why they are cautious, and you should be too.

Security

The device belongs to the employee, but the data belongs to the company. Mobile devices are great for extending our workflow and our workday, and for keeping us in constant contact. In the midst of all of this work, wherever it may happen, an employee will most likely pass company data through their mobile device, whether for viewing, editing, or storing. Company confidential information is worrisome enough, but what about personally identifiable information (PII) belonging to your customers? Is every mobile device protected by a PIN? Is data encrypted on the device while at rest? Is data always encrypted while transiting the network? How are employees sharing data? Over the cloud? Whose cloud? There is a lot to think about when deciding on a BYOD policy and whether to allow personal devices to access your network. Bill Ho, president of Biscom, has created a list of security items to consider when creating a BYOD security policy.
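As a small illustration of how those questions can be turned into an enforceable checklist, the sketch below models the kind of device-compliance check that mobile device management tools automate. The policy fields and thresholds are hypothetical and are not drawn from any particular product.

```python
# Hypothetical BYOD compliance check: fields and thresholds are illustrative,
# not taken from any real MDM product or corporate policy.
from dataclasses import dataclass

@dataclass
class Device:
    owner: str
    has_pin: bool
    storage_encrypted: bool
    os_version: tuple  # e.g., (4, 2); compared against a hypothetical minimum

MINIMUM_OS = (4, 1)

def is_compliant(device: Device) -> bool:
    """Return True only if the device meets every baseline requirement."""
    return (
        device.has_pin
        and device.storage_encrypted
        and device.os_version >= MINIMUM_OS
    )

personal_phone = Device(owner="jdoe", has_pin=True, storage_encrypted=False, os_version=(4, 2))
print(is_compliant(personal_phone))  # False: storage is not encrypted at rest
```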

Platform

As the number of IT personnel has shrunk through cost cutting and rightsizing, the number of smart devices and platforms has exploded. BlackBerry used to be the only game in town, but now we have Apple iOS, Android, Windows Phone, webOS, and other platforms, with fun version names like Ice Cream Sandwich and Jelly Bean. Further up the stack, there are apps that have their own security issues. The sheer combinatorics of it all would cause any IT professional to run screaming for the network closet. To do justice to a solid BYOD policy, an organization would need at least one full-time person to monitor the platforms and applications that are accessing enterprise systems. Do you have that kind of manpower? Is there a middle ground that does not compromise the information security mentioned above?

Compatibility

Another consideration is the compatibility of all of these different devices and platforms and mobile applications and your corporate applications. Will X always talk to Y? Does it cause the IT department to scramble to get your unique permutation working for you? Is it worth the effort for your personal productivity?

Good News

There is a lot to consider when deciding to embrace BYOD. On the upside, it can extend the productivity of employees as long as security and compatibility concerns are adequately addressed. The good news is that there are tools available to help you manage mobile devices. You can find solutions from IT service providers such as IBM and Dell or from security providers such as Symantec and others. These applications can help you reach the right level of availability, convenience, and security in order for your employees to maximize their productivity and help you sleep at night.

Do you have a comprehensive BYOD plan? Is it working? What keeps you up at night?

 

About Kelly Brown

Kelly Brown is an IT professional, adjunct faculty for the University of Oregon, and academic director of the UO Applied Information Management Master’s Degree Program. He writes about IT topics that keep him up at night.