Tag Archives: networking

Leadership in a Connected World

In the AIM Program’s Information Systems and Management course, we talk about leadership and management. Are they the same thing? Can someone be a good leader but a terrible manager, or vice versa? These are good questions. I have been studying how leadership has changed in the last 100 years as we shifted from leaders who oversee one or more factories in a region to leaders who command large global enterprises. In the past, a manager could walk down to the factory floor and talk with each employee, but modern telecommunications have allowed us to create large businesses and to manage those businesses from a distance. Which principles of leadership remain the same, and which have changed?

The Personal Touch

I have read autobiographies of Sam Walton of Wal-Mart, Harland Sanders, also known as Colonel Sanders, and Dave Packard of Hewlett-Packard. As they recount the early days of their businesses, they all talk about knowing and interacting with their employees. Part of their leadership style was personal contact, which allowed them to adjust the business model based on employee feedback. According to the Wal-Mart website, the company now employs 2.2 million associates worldwide. How does a leader manage so many people in a geographically dispersed firm?

Networking

One of the answers is focused networking through the use of technology. Large organizations still use traditional organizational charts, and it takes a long time for a complaint to make it through 10–12 layers of management to be heard and acted upon. This is the explicit organization as depicted on the chart. In reality, there is often a parallel, implicit organization that everyone knows about but that is seldom written down or diagrammed. There are touchstones in the organization who “know the right people” and can bypass the traditional structure to get things done. Author Malcolm Gladwell refers to these people as “connectors.” Employees quickly identify touchstones and rally their support in championing new ideas or settling a grievance. Think about how long it takes to disseminate information in your organization, and how long it takes to make a low-level decision that for some reason requires multiple signatures. Could you employ this implicit structure to share information or collect feedback more quickly?
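
To make the idea concrete, here is a minimal sketch in Python of how you might map an implicit network and surface likely connectors. The names and relationships are made up for illustration; in practice the links might come from e-mail traffic, meeting invitations, or a simple survey.

```python
# Illustrative sketch: model the implicit organization as a simple graph and
# surface the "connectors" -- the people with the most cross-team links.
# All names and links below are hypothetical.
from collections import defaultdict

links = [
    ("Ana", "Raj"), ("Ana", "Mei"), ("Ana", "Tom"), ("Ana", "Lee"),
    ("Raj", "Mei"), ("Tom", "Lee"), ("Lee", "Sam"),
]

degree = defaultdict(int)
for a, b in links:
    degree[a] += 1
    degree[b] += 1

# The most-connected people are candidate connectors.
connectors = sorted(degree, key=degree.get, reverse=True)[:3]
print(connectors)  # e.g., ['Ana', 'Lee', ...]
```

Even a rough map like this can show whose support is worth enlisting when you need information to travel faster than the org chart allows.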

Leadership in the 21st Century

I believe it is important to recognize this alternate organization and use it for disseminating information. We can always make a one-to-many announcement, but it is not always effective, nor is it always well received. Touchstones are likely to relay messages more quickly. Marrying this network approach with social media channels allows us to remain effective leaders even though we are now steering an ocean liner instead of a bicycle. Such methods are not meant to subvert the traditional organizational structure but to provide a quicker and more effective means of communication through modern technology and networking. Effective leaders recognize that it is not enough to have a large number of connections; they also need to be linked to the right people to institute change and move the organization forward.

Thoughts

Do you know the connectors in your organization? Are they in your network? Are you someone others turn to? It is important for leaders to make use of the implicit network just as we work the traditional structure. It is getting harder to effectively lead thousands, if not millions, of employees, and we need all the advantages we can get. Let me know your thoughts.

About Kelly Brown

Kelly Brown is an IT professional and assistant professor of practice for the UO Applied Information Management Master’s Degree Program. He writes about IT and business topics that keep him up at night.

The Consumerization of IT

It used to be that information technology was the domain of specialists. In the last 10 years, the adoption of new technology has shifted from the enterprise to the consumer. As a result, employees who were accustomed to using technology at home pushed for its adoption in the workplace. This left IT groups scrambling, not always willingly, to adapt their policies and applications to work with consumer devices and software.

This consumerization of technology inspired the popularity of bring your own device (BYOD) to work. The two main concerns over this trend are, first and foremost, security, and second, compatibility with corporate applications. While it is desirable to access data and applications anytime, anywhere, and on any device, it is not always easy or safe. In this blog post I will look at the history and future trends of IT consumerization. Will we continue as we have, or will the enterprise once again take the lead in new technology adoption?

History

Computers were originally used in government and business for things such as bomb trajectory calculations in World War II, tabulating voters’ ballots for presidential elections, and organizing corporate accounting activities. Operators and programmers were in charge of running the computers, and any tasks or requests had to be fed through them. Query results came as printouts rather than on a desktop screen. Even as late as the mid-1980s I remember working in a large computer room where we printed stacks of paper that were set outside the room to be retrieved. Only computer operators and technicians were allowed inside. Access to the computers was through dumb terminals for input, with the generated paper results as output.

Personal Computers

Apple and other companies sold computers to hobbyists in the late 1970s. While these were technically consumer products, they served a niche market. When IBM introduced the personal computer in 1981, it was targeting the corporate employee, not individual consumers. When user-friendly word processing and spreadsheet software became available, consumers began buying computers for home use.

Networking

Without connecting the home computer to the outside world, people were still left with the same problem of input and output. Input came through the keyboard or from a disk, and output went to a printer, a screen, or another disk. The disks had limited capacity, so to share a program or data, one had to have multiple disks that were hopefully labeled correctly. With early dial-up modems, people could finally share information with each other (not graphics; that would take forever). As consumer networks improved, so did our desire to connect and share things with each other, and the lines between work and home began to blur.

The Tipping Point

The tipping point for the consumerization of IT came with smartphones and tablets. Laptops were certainly more mobile and could go back and forth from home to work, but the smartphone and tablet made it even easier to live in both worlds. IT departments initially rejected tablets as not robust and secure enough for the enterprise. The smartphone was an even bigger concern because it was so portable. BlackBerry was one of the pioneers in bridging the gap between corporate e-mail and information systems and consumer devices. Salespeople and executives could receive information while they were with a client instead of waiting for a computer operator to process a request. It was a whole new world that continues to evolve.

Today

In my Information Systems class we talk about Bring Your Own Device (BYOD) and the tools we need to deploy, such as Mobile Device Management (MDM), in order to integrate consumer devices into the workplace. The key for technology departments is adaptability. The lines are blurred and the genie is not going back in the bottle, so we need to make sure our data and enterprise systems are secure while working with these devices.
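
To give a feel for what an MDM tool does behind the scenes, here is a minimal sketch of a BYOD compliance check in Python. The device attributes and policy thresholds are hypothetical and do not reflect any particular vendor’s API.

```python
# Hypothetical BYOD compliance check an MDM tool might run before letting a
# personal device reach corporate e-mail or data. Fields and thresholds are
# illustrative only.
from dataclasses import dataclass

@dataclass
class Device:
    os_version: tuple   # e.g., (17, 2)
    encrypted: bool     # storage encryption enabled
    passcode_set: bool  # screen lock configured
    jailbroken: bool    # rooted/jailbroken devices are rejected

MIN_OS = (16, 0)  # assumed minimum supported OS version

def is_compliant(device: Device) -> bool:
    """Return True if the device meets the (hypothetical) BYOD policy."""
    return (
        device.os_version >= MIN_OS
        and device.encrypted
        and device.passcode_set
        and not device.jailbroken
    )

phone = Device(os_version=(17, 2), encrypted=True, passcode_set=True, jailbroken=False)
print(is_compliant(phone))  # True -> allow access to corporate resources
```

The real work, of course, is deciding on the policy and keeping it current as devices and threats change.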

In a possible reversal of trends, Deloitte predicts what it calls the re-enterprization of IT in the next few years. It points to current technologies such as wearables, 3D printing, and drones being embraced by the enterprise as evidence of that reversal. I am skeptical that the consumer trend is changing just yet, but I will keep my eyes open.

Thoughts

Has the consumerization of IT helped you in your work or has it caused you pain as you deal with the consequences? I don’t miss the days of wearing a separate pager and I love being able to access data from any device at any time. I also realize the work that goes into the back end to make this access seamless and I appreciate the efforts of technologists who build bridges between consumer devices and the enterprise. Let me know your thoughts.

About Kelly Brown

Kelly Brown is an IT professional and assistant professor of practice for the UO Applied Information Management Master’s Degree Program. He writes about IT and business topics that keep him up at night.

Robotic-Assisted Surgical Technology

Last week my wife and I were driving down the road and came across a billboard advertising “robot-assisted bariatric surgery” at our local hospital. While we weren’t interested in that particular surgery, it did spur one of our many philosophical debates. Our topic of conversation was this: would you rather be operated on by a robot or by a human? I would take the robot any day of the week. It is steadier, more consistent, and has nothing on its mind but my surgery. The downside is that it is not as creative. My wife would prefer the human. She fears the robot would crash in the middle of the surgery, or that the pinwheel of doom would appear and stop the surgery prematurely. But this is robot-assisted surgery, so it is a marriage of consistency, accuracy, and creativity. In any case, I had not heard of robot-assisted surgery before, so I had to learn more.

Blessed by the FDA

The Food and Drug Administration (FDA) approved robot-assisted surgery in 2000 with the introduction of the da Vinci Surgical System. This tool allows a doctor to sit at a console near the patient and control robotic arms. These arms enter the body through tiny incisions and have a camera, a light, or a selection of wristed instruments on the end. The surgeon can then see a magnified, high-definition picture of the area and can guide the tools through the body using a joystick-like interface. The doctor’s movements are translated into much smaller motions by the instruments, which have humanlike wrist movement (a simple sketch of this motion scaling follows the list below). The upsides are:

  1. Small incisions vs. open surgery.
  2. Quicker recovery.
  3. Less chance of infection.
  4. Potentially greater accuracy.
  5. Less invasive procedure may mean faster healing.
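
The actual control algorithms are proprietary, but the basic idea of motion scaling is easy to illustrate. The Python sketch below smooths a surgeon’s hand movements with a simple moving average to damp tremor and then scales them down; the 5:1 ratio and window size are made-up values, not the da Vinci system’s actual parameters.

```python
# Illustrative motion-scaling sketch: hand displacements (in mm) are smoothed
# with a moving average to damp tremor, then scaled down before being applied
# to the instrument tip. The constants are hypothetical.

SCALE = 1 / 5   # 5:1 motion scaling (assumed)
WINDOW = 3      # samples used for tremor smoothing (assumed)

def instrument_motion(hand_samples):
    """Map raw hand displacement samples to instrument displacements."""
    smoothed = []
    for i in range(len(hand_samples)):
        recent = hand_samples[max(0, i - WINDOW + 1): i + 1]
        smoothed.append(sum(recent) / len(recent))
    return [s * SCALE for s in smoothed]

# A roughly 10 mm hand movement with a slight tremor becomes about 2 mm
# of steadier movement at the instrument.
print(instrument_motion([10.0, 10.4, 9.8, 10.1]))
```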

Since the system was approved, surgeons have used these robots to perform thousands of surgeries in areas such as gastroenterology and cardiac surgery. This is laparoscopic surgery merged with robotic technology to provide even more accuracy and finesse.

The Future

This is an incredible use of technology to assist skilled doctors in performing critical and delicate surgeries. One future improvement is telerobotic surgery, where the surgeon is not even in the room and could control the robot from anywhere. This requires rock-solid networking. Another potential development is completely robotic surgery, which would require preprogramming and very accurate vision and recognition systems.

Thoughts

I am excited about this use of technology and the future possibilities for advancement in this field. It will require new technical skills and new training to ensure that all systems are functioning and that the infrastructure supplying them is foolproof. How would you feel about being operated on by a robot directed by a skilled surgeon? Do you trust the two working together? Let me know your thoughts.

About Kelly Brown

Kelly Brown is an IT professional and assistant professor of practice for the UO Applied Information Management Master’s Degree Program. He writes about IT and business topics that keep him up at night.

Am I in Heaven Yet?

Cloud computing has been a buzzword for a number of years now, perhaps because it is such a nebulous, ethereal term (cloud?) that has been used to describe a number of different configurations and scenarios. You are most likely using some sort of cloud computing already, but it is worth asking the hard questions to make sure you have the basics covered.

History

Cloud computing refers simply to the fact that your application or data is no longer on a computer that you can touch. It is hosted in a remote computer room in another city, another state, or another country: in the “cloud.” What brought about this change, and why haven’t we always done it this way? One of the big reasons is the rising abundance and speed of networking. It used to be that your computer or terminal was tied directly to the computer in the computer room. Through better networking technology, the machine in the computer room and the computer in your hands became further and further separated until it was no longer necessary to have a dedicated room in every building. Better network security schemes have also widened this geographic gap.

Is cloud computing all sunshine and roses, or are there still some lingering concerns? Think about these issues when creating or expanding your cloud computing strategy:

Security

If you contract with a large service provider such as Google, Amazon, or IBM to host your application or data, your confidential information will be sitting in the same data center as another customer’s, or perhaps even your competitor’s. Is the “wall” around your data secure enough to keep your information confidential? When your information is traveling to and from the data center over the network, is it secure? Has it been encrypted for the trip? Do you trust all of your information to the cloud, or just the non-critical pieces?
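
One way to hedge your bets on both questions is to encrypt sensitive data yourself before it ever leaves your building, in addition to relying on the provider’s transport security. Here is a minimal sketch in Python, assuming the third-party cryptography package; the data is a stand-in for a real file.

```python
# Minimal sketch of client-side encryption before data travels to the cloud.
# Assumes the third-party "cryptography" package (pip install cryptography).
from cryptography.fernet import Fernet

key = Fernet.generate_key()   # keep this key on-premises, never with the provider
cipher = Fernet(key)

plaintext = b"customer list and pricing"   # stand-in for a sensitive file
ciphertext = cipher.encrypt(plaintext)     # what actually crosses the network

# Later, after retrieving the object from the provider:
restored = cipher.decrypt(ciphertext)
assert restored == plaintext
```

Even with the provider’s own safeguards in place, holding your own keys means a breach on their side exposes only ciphertext.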

Scale

Is your application and data usage large enough to warrant cloud computing? If you are a small company or non-profit agency, the setup for hosting your applications and data may swamp your entire IT budget. Some application service providers cater only to large customers with millions of transactions per month. If you don’t fall into that category, then perhaps your in-house IT person is just what you need. At the other end of the scale, some small companies or agencies use free services such as Dropbox or Google Docs. If this is the case, then check your assumptions about security.

Applications

Some applications, such as customer relationship management (CRM), simple e-mail, or backups, may be easily offloaded to another provider. Other applications may be complex or proprietary to the point where it makes more sense to keep them close to the vest. They might still be candidates in the future as you peel back the layers of legacy and move toward standard applications.

These are all questions to consider when formulating your cloud computing strategy. Offloading your computing to another provider can yield real cost savings, but without careful consideration it can become a complexity you did not bargain for. What keeps you up at night in terms of your cloud computing strategy?

 

About Kelly Brown

Kelly Brown is an IT professional, adjunct faculty for the University of Oregon, and academic director of the UO Applied Information Management Master’s Degree Program. He writes about IT topics that keep him up at night.