Digital Transformation: Path to Improving Your Business

This is an open letter to businesses and agencies working to transform their enterprises through digital technologies. Each organization is at a different point along the path of engaging customers, suppliers, and employees digitally in order to remain competitive and profitable. I would like to suggest some ways to accelerate that transformation.

New Technologies

Management consultants Bain & Company suggested in a recent article that there are six basic design rules that can accelerate a company’s digital transformation. These include breaking boundaries, being open, inducing insights, and being user-friendly. I would like to add a couple of others that I think will help move you down the road to your destination.

Internal Partnering

Many companies are reworking their internal and external processes to achieve efficiency and build a digital presence that they hope will draw customers. Even my local hardware store and ice cream shop have websites. They are, for the most part, static pages with information like location, phone number, and store hours, but at least that keeps me from having to dust off the yellow pages. They have taken the first steps toward moving to a digital world.

Whether you are moving back-end infrastructure, applications, and software to the cloud or experimenting with a web presence for the first time, it is important to partner with your technology department. As a business, you know WHAT you want to do, but the employees in your information management department know HOW to do it. Partner with them at every turn to combine business knowledge and technical knowledge. I suggest you even consider embedding some technical people in your business. This is a great way for them to learn more about your needs so that they can custom-design a solution for you. We used to worry about technology people “going native” if they were embedded in the business, but now I think that crossover is necessary and will result in better and more effective solutions.

External Partnering

One of the design rules from Bain is to focus on the user experience. What better way to do that than to ask the users themselves? Sometimes this requires getting out of the office and asking customers their opinion of a new mobile app, a change to your website, or even a new digital product you are considering. I will be the first to admit that traditional surveys leave me cold. Every time I get near the local Home Depot store, my smartphone asks me to rate my recent visit. I never comply. If a business I frequent were to put a device in my hand and ask me to try out a new digital product, I would be much more inclined to reply. There are different ways of partnering with and surveying customers, but doing so is essential to designing a user experience they will accept.


Wherever you are on the digital transformation continuum, I hope you will consider these ideas to make your journey smoother. They can help you in implementation and customer engagement. How is your digital transformation progressing? Let me know your thoughts.

About Kelly Brown

Kelly Brown is an IT professional and assistant professor of practice for the UO Applied Information Management Master’s Degree Program. He writes about IT and business topics that keep him up at night.

A Photonic Life—Our Rapidly Increasing Computer Processing Speed

I have been thinking this last week about current advances in technology and what they will mean for future computing systems and how we do computing. HP announced last week that they are working on a new machine dubbed simply “The Machine.” While the moniker is not very inspiring, the technology is groundbreaking. HP is building two different technologies into The Machine: memristors and silicon photonics. These technologies will need an entirely new computer and operating system wrapped around them. I think that will present some opportunities for forward-thinking information technology professionals willing to blaze some trails.


Memristors, or “memory resistors,” were first proposed by Leon Chua in 1971. A memristor is basically a resistor that remembers its state when the electricity is turned off. The first silicon-based memristor was announced in 2012, though there is still much work to do to make them commercially viable. Their value is that they can serve as storage sitting on the same board, or even the same chip, as the processing unit, replacing offline disk storage. The ability to access information in such close physical proximity to the processor will boost access speed dramatically. Instead of a dual-core, quad-core, or eight-core processor, you could have a many-core processor.
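The defining property described above is that a memristor’s resistance depends on the charge that has flowed through it, so its state survives a power-off. As an illustration only (this sketch is not from HP or the article, and the parameter values are invented for demonstration), the widely cited linear ion-drift model can be simulated in a few lines:

```python
import math

# Illustrative parameters for the linear ion-drift memristor model.
# These values are assumptions chosen for a readable demo, not device data.
R_ON, R_OFF = 100.0, 16_000.0   # fully doped / undoped resistance (ohms)
D = 10e-9                        # device thickness (m)
MU_V = 1e-14                     # dopant mobility (m^2 / (V*s))

def simulate(voltage, t_end=1.0, steps=10_000):
    """Integrate the doped-region width w over time and record memristance."""
    dt = t_end / steps
    w = 0.5 * D                  # start half doped
    history = []
    for k in range(steps):
        v = voltage(k * dt)
        # Memristance interpolates between R_ON and R_OFF based on state w.
        m = R_ON * (w / D) + R_OFF * (1 - w / D)
        i = v / m                                  # instantaneous current
        w += MU_V * (R_ON / D) * i * dt            # linear ion drift
        w = min(max(w, 0.0), D)                    # clamp to physical range
        history.append(m)
    return history

# Drive the device with one cycle of a sine wave: the resistance drifts as
# charge flows one way, then drifts back as the charge flows back.
ms = simulate(lambda t: math.sin(2 * math.pi * t))
```

The point of the sketch is the state variable `w`: unlike an ordinary resistor, the device’s resistance at any moment depends on its history of current, which is exactly what makes it usable as nonvolatile storage next to the processor.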


Photonics, or the process of transferring information via light, is not a new process, but it is shrinking. Fiber-optic cable allows us to easily transfer information and voice across the ocean and is increasingly used within buildings as well. It is faster than copper and requires less energy. What is new is the scale of application: photonics is being shrunk to the point of transferring information across a blade server, and even between blade servers in the same rack across the backplane. Miniaturization at that level takes some very creative nanoscale engineering to create the paths for those light pulses. This, combined with the new memristors, yields data access rates much faster than ever before. An added benefit is increased energy efficiency: signals on copper paths lose strength and need repeaters to refresh the data, and those repeaters add to the overall heat given off and energy consumed.


Part of the push to create faster computers and faster networking is that we are so data rich right now that we cannot process it all fast enough. We became data rich in the first place by building fast, low-cost computers and storage that allowed us to collect statistics on anything and everything. I wonder now whether the dog is wagging the tail or the tail is wagging the dog. In all of this, change brings opportunity. Today’s programs and operating systems are constrained by current hardware. If that hardware changes to the point that there is little or no delay in data access or processing, we will need new software, new applications, and a new operating system. I believe that infrastructure will need to be built from the ground up to maximize the capabilities of the new hardware. Is there anyone out there up for the challenge? Let me know your thoughts. We only have a few years before the future is here.

About Kelly Brown

Kelly Brown is an IT professional, adjunct faculty for the University of Oregon, and academic director of the UO Applied Information Management Master’s Degree Program. He writes about IT and business topics that keep him up at night.


Simplicity on the Other Side of Complexity—Quality Does Matter

I have been thinking recently about software and product quality. There is a software quality conference this fall here in the Pacific Northwest, and I recently read an article on the top ten software blunders of the last decade. As we rush products to market, are we compromising quality? What negative effect does that have on our products? Is it worth it? Is it acceptable? Is it the price we pay for doing business in a hypercompetitive world?

Continuous Exploits

In late April, yet another hole was discovered in Internet Explorer that allowed hackers to plant malicious code on individual computers via infected websites. This is just one example of applications and operating systems with bugs waiting to be exploited. My question is this: are product developers and quality assurance teams releasing inferior, not-yet-ready-for-prime-time products, or are the products so complicated that developers do not understand all of the implications until after they have been tested by consumers? If it is the former, then the answer is to wait until all of the bugs are detected and corrected and release a superior product. If it is the latter, then that means that you and I are paying for the privilege of being product testers. Personally, I can think of better things to do with my time and money.

A Simplistic View

I will admit that I may be taking a simplistic view. My experience runs toward hardware products and support, although there are quality issues in that arena as well. According to Microsoft, Windows XP, which was released in 2001 and recently lost support, was compiled from forty-five million lines of code. Thirteen years later we have Windows 8.1. How many lines of code are in this operating system? Is the complexity sustainable, or are we building products that we cannot manage? With this increasing complexity, have we resigned ourselves to a certain number of acceptable bugs? What is our tolerance level? One percent of nonfunctioning or potentially compromising code? Is that acceptable?


Nineteenth-century writer Oliver Wendell Holmes once said, “I would not give a fig for the simplicity on this side of complexity, but I would give my life for the simplicity on the other side of complexity.” I believe that we are stuck in the middle of that complexity right now. While our products are sophisticated, they lack that elegance on the other side of complexity. We have learned to write incredibly complex code, which is understood in part by individual coders but in its entirety by no one. This is the very thing that makes that code vulnerable to exploits and security breaches. If we could somehow find that simplicity or elegance on the other side of complexity, then we could enjoy robust, secure, and usable products.

Do you have or use a product or application that you think has broken through that complexity curtain? Share your find with me.

About Kelly Brown

Kelly Brown is an IT professional, adjunct faculty for the University of Oregon, and academic director of the UO Applied Information Management Master’s Degree Program. He writes about IT and business topics that keep him up at night.