The Rise of #GivingTuesday

In a recent blog post I predicted the end of Cyber Monday, but it turns out I was wrong, at least this year. Online retailers had their best sales day ever while Black Friday sales slumped. Perhaps my forecast will improve next year. This year I noticed something new in the holiday season: Giving Tuesday. According to givingtuesday.org, the movement was started by the 92nd Street Y in New York City in 2012. The concept has grown through social media and has been adopted by charities and other aid organizations as a reminder to give back to the community.

My new prediction is the continued growth of Giving Tuesday, and I truly hope that I am right this time. There is an adage, “Where much is given, much is expected.” I know that I have been given much, and I try to return that favor on Giving Tuesday and throughout the rest of the year. In this blog post, I want to encourage you to think of all that you have and how you can help others during the holiday season and beyond.

Early Lessons

One of my first lessons in giving came through a fifth-grade class project. Our class assembled a large fruit basket for elderly residents of a downtown rest home. To select which class members would deliver the basket, the teacher had us each choose a number between one and one thousand. I chose the first number that came into my head, 365, the number of days in a year. Apparently the teacher and I were on the same wavelength, because I nailed it exactly. Four of us took the basket downtown, and while I was nervous about visiting a rest home for the first time in my life, I noticed that our presence meant even more to the residents than the basket did. The gift was symbolic, but they loved having us talk with them and spend time getting to know them. It was then I realized that giving of our time and talents often means as much as, or more than, a gift.

Giving Back

Currently I give a lot of my extra time to youth organizations, teaching leadership and life skills. I am hoping to influence my own future by preparing these young people to lead well as they take over, which will afford me more time on the golf course. In this sense my motives are selfish, but my heart is in the right place. I also serve on a nonprofit board of directors, helping to provide oversight to a wonderful organization that contributes much to my community. My monetary gifts often go to medical research or directly to an individual who is struggling with health issues. I am careful about directing my money to where I think it can do the most good.

Whether you give time or money, it is important to remember that giving is about helping people and building connections.

Thoughts

I wholeheartedly support Giving Tuesday and the change its organizers are trying to bring about. It helps to focus on giving at least one day out of the year, and hopefully that will inspire giving throughout the year. I would love to hear your stories about how you give back to your community. What causes are you most passionate about? How are you effecting change in your community and the world beyond? Let me know.

About Kelly Brown

Kelly Brown is an IT professional and assistant professor of practice for the UO Applied Information Management Master’s Degree Program. He writes about IT and business topics that keep him up at night.


Trends in Higher Education: Certificates and Customization

I recently leafed through a course catalog from the local community college and was surprised by the breadth of certificate courses. These classes lead to a professional certificate in fields such as psychology, information technology, construction, and mechanical systems. Programs may consist of one course or many and are taken in addition to, or in lieu of, a traditional degree program. This is specialized instruction that leads to a specific skill. These certificates show a current or potential employer that you have mastered that skill and are ready to hit the ground running. I think that certificates will become an important tool for differentiating job seekers, so I set out to learn just how popular and diverse these programs are.

Certificates vs. MOOCs

Certificate programs can be taken at the community college, undergraduate, or even graduate level. They often lead to licensure, as in the case of specialty teaching or nursing, or may serve as preparation for a certification exam, such as those in information technology or engineering. The programs may stand alone without an accompanying degree, or they may be taken in conjunction with an undergraduate or graduate degree. For example, law students may study technology or business to broaden their experience and enhance their skills. In the same vein, medical students may study bioinformatics to understand and conduct genetic analysis as part of their practice. These are examples of certifications that might give job seekers an edge over other candidates.

Massive Open Online Courses, or MOOCs, are generally free and do not lead to licensure or certification. Some MOOCs, however, offer a paid option that leads to a certificate. While these certificates are not generally recognized in the workplace, that could change in the future.

Options Beyond Certificates

Some universities are modifying their traditional degree requirements to meet the changing needs of students. Many students are returning to school or enrolling later in life after already establishing a career. These students may need more flexibility in the course schedule or in the completion time. Some universities, such as Worcester Polytechnic Institute, are layering traditional degree programs with experience-based specialties. The school offers a one-year master of management degree for young graduates, who then have the option of returning after at least two years of industry experience to add an MBA. Offering degrees in stages serves both young graduates looking for management education and returning students looking to add to their previous investment. The key to certificates and specialty degree programs is flexibility and the availability of relevant curriculum.

Other schools are moving toward interdisciplinary studies degrees. This may be a combination of business, communications, and information management, such as the UO AIM Program, or a traditional management, engineering, health care, or law degree that allows students to explore adjacent paths in cybersecurity, business analytics, or telemedicine. Whether these paths lead to a certificate or a degree, they all provide students with particular skills that are needed in the workplace.

Thoughts

Certificate and customizable degree programs allow students to combine the value of a traditional curriculum with the specialized skills that are in demand. I think that this customization will only increase in the future as students seek innovative educational experiences. Let me know your thoughts.

About Kelly Brown

Kelly Brown is an IT professional and assistant professor of practice for the UO Applied Information Management Master’s Degree Program. He writes about IT and business topics that keep him up at night.


The Consumerization of IT

It used to be that information technology was the domain of specialists. In the last 10 years, however, the adoption of new technology has shifted from the enterprise to the consumer. As a result, employees who were accustomed to using technology at home pushed for its adoption in the workplace. This left IT groups scrambling, not always willingly, to adapt their policies and applications to work with consumer devices and software.

This consumerization of technology inspired the popular practice of bringing your own device (BYOD) to work. The two main concerns over this trend are, first and foremost, security and, second, compatibility with corporate applications. While it is desirable to access data and applications anytime, anywhere, and on any device, it is not always easy or safe. In this blog post I will look at the history and future trends of IT consumerization. Will we continue as we have, or will the enterprise once again take the lead in new technology adoption?

History

Computers were originally used in government and business for tasks such as calculating bomb trajectories in World War II, tabulating ballots for presidential elections, and organizing corporate accounting. Operators and programmers were in charge of running the computers, and any tasks or requests had to be fed through them. Query results came back as printouts, not on a desktop screen. Even as late as the mid-1980s, I remember working in a large computer room where we printed stacks of paper that were set outside the room to be retrieved. Only computer operators and technicians were allowed inside. Access to the computers was through dumb terminals for input and printed results for output.

Personal Computers

Apple and other companies sold computers to hobbyists in the late 1970s. While these were technically consumer products, they served a niche market. When IBM introduced the personal computer in 1981, it was targeting the corporate employee, not the individual consumer. When user-friendly word processing and spreadsheet software became available, consumers began buying computers for home use.

Networking

Without a connection between the home computer and the outside world, people were still left with the same problem of input and output. Input came through the keyboard or from a disk, and output went to a printer, a screen, or another disk. The disks had limited capacity, so to share a program or data, one had to have multiple disks that were, hopefully, labeled correctly. With early dial-up modems, people could finally share information with each other (not graphics; that would take forever). As consumer networks improved, so did our desire to connect and share things with each other, and the lines between work and home began to blur.

The Tipping Point

The tipping point for the consumerization of IT came with smartphones and tablets. Laptops were certainly more mobile and could go back and forth between home and work, but the smartphone and tablet made it even easier to live in both worlds. IT departments initially rejected tablets as not being robust and secure enough for the enterprise. The smartphone was even more worrisome because it was so portable. BlackBerry was one of the pioneers in bridging the gap between corporate e-mail and information systems on one side and consumer devices on the other. Salespeople and executives could receive information while they were with a client instead of waiting for a computer operator to process a request. It was a whole new world, and it continues to evolve.

Today

In my Information Systems class we talk about Bring Your Own Device (BYOD) and the tools that we need to deploy, such as Mobile Device Management (MDM), in order to integrate consumer devices into the workplace. The key for technology departments is adaptability. The lines are blurred and the genie is not going back in the bottle, so we need to make sure our data and enterprise systems are secure while working with these devices.

In a possible reversal of trends, Deloitte predicts what they call the re-enterprization of IT in the next few years. They point to current technologies such as wearables, 3D printing, and drones being embraced by the enterprise as evidence of that reversal. I am skeptical that the consumer trend is changing just yet but I will keep my eyes open.

Thoughts

Has the consumerization of IT helped you in your work or has it caused you pain as you deal with the consequences? I don’t miss the days of wearing a separate pager and I love being able to access data from any device at any time. I also realize the work that goes into the back end to make this access seamless and I appreciate the efforts of technologists who build bridges between consumer devices and the enterprise. Let me know your thoughts.

About Kelly Brown

Kelly Brown is an IT professional and assistant professor of practice for the UO Applied Information Management Master’s Degree Program. He writes about IT and business topics that keep him up at night.


The End of Cyber Monday

As I write this blog entry, we are still two weeks away from Thanksgiving, Black Friday, and Cyber Monday. While I believe the first two will continue into the future, I think the term Cyber Monday has become irrelevant, largely due to technology changes, and will end this year. In this post I will lay out my reasoning for predicting its demise and invite you to give me feedback as to whether you believe Cyber Monday is doomed.

History

The term Cyber Monday was coined in 2005 by Shop.org, the digital arm of the National Retail Federation. Shop.org also runs the website Cybermonday.com, where it invites participating retailers to share their Cyber Monday shopping deals. The term refers to the Monday after Black Friday, supposedly the day when the most online Christmas shopping is done. That was not actually the case in 2005, but it was by 2010. Now it is only one of many large online shopping days stretching back into mid-October.

Technology

I believe that the biggest threat to Cyber Monday is technology. The theory was that people would go to work on the Monday after the long Thanksgiving weekend and purchase all of their remaining Christmas items online using the faster company internet connection. That is now irrelevant for two reasons:

  1. Home internet connections are now fast enough to stream digital content such as movies so they are more than adequate for shopping.
  2. More people are shopping now from a mobile device such as a smartphone or tablet so they do not need to be tied to the home or office internet connection.

The term “showrooming” was coined to describe the practice of visiting a store to view merchandise before ordering it online at a lower price. Best Buy has been referred to as the showroom for Amazon. In theory, you could even stand in a brick-and-mortar store and order the same product online through your smartphone. I think this practice will decline as we get closer to price parity between online and traditional retailers.

Websites and apps such as Buyvia.com and Dealnews.com have taken the steam out of Cyber Monday by advertising a wide range of retail deals 365 days a year. I can define my product search and get alerts about the best price and retailer, regardless of whether it is Thanksgiving weekend.

Timing Is Everything

Retailers are creating shopping events earlier and earlier. I can already see “leaked” Black Friday ads from several retailers even though Thanksgiving is still several days away and Christmas is more than a month away. Soon we could have our Christmas shopping done in September, eliminating the whole holiday rush of late November and early December.

Thoughts

I realize that retailers will continue to roll out special deals on certain days like Black Friday and Cyber Monday, but I think that technology advances and the way that we choose to do business will make these exclusive days less of a bargain.

Am I just being a Scrooge or am I on to something? Is technology changing how and when we shop? Has Cyber Monday become irrelevant? Let me know your thoughts.

About Kelly Brown

Kelly Brown is an IT professional and assistant professor of practice for the UO Applied Information Management Master’s Degree Program. He writes about IT and business topics that keep him up at night.


The Future of Financial Technology: Small Business Transactions

I have wondered about the technology that powers the economy through all of the electronic transactions we make every day. It stays transparent and behind the scenes until something goes wrong or a system goes offline, and then it becomes obvious. It has to be robust to correctly handle the millions of transactions made daily. I know that the way I work with financial institutions is very different from what it was even just a few years ago. What exactly is powering all of this?

Crowdfunding Payments

In the new world of crowdfunding there are millions of online credit card transactions, some for as little as $5. A number of startups have stepped in to process these transactions using affordable technology. WePay works with small businesses and crowdfunding companies, acting as the payment processor between individual buyers and sellers. They charge a 2.9% transaction fee, which is competitive, and have turned a nice profit.
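
To put that 2.9% rate in perspective, here is a minimal Python sketch of the fee arithmetic on a single small pledge. It assumes a flat percentage fee rounded half up to the nearest cent and ignores any fixed per-transaction charge that a real processor might add, so treat it as an illustration of the math rather than a description of WePay's actual billing:

    from decimal import Decimal, ROUND_HALF_UP

    def processor_fee(amount: Decimal, rate: Decimal = Decimal("0.029")):
        """Return (fee, net payout) for a pledge, assuming a flat percentage fee."""
        fee = (amount * rate).quantize(Decimal("0.01"), rounding=ROUND_HALF_UP)
        return fee, amount - fee

    # A $5.00 crowdfunding pledge at an assumed flat 2.9% rate
    fee, net = processor_fee(Decimal("5.00"))
    print(f"fee: ${fee}, seller receives: ${net}")  # fee: $0.15, seller receives: $4.85

On a $5 pledge the processor keeps only about 15 cents, which is why handling huge volumes of tiny transactions cheaply and reliably is the real challenge in crowdfunding payments.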

Mobile Payments

Square has made mobile payments easier through a device that connects to a cellphone and processes credit card payments without a traditional hard-wired connection to the bank. I first saw these devices at craft fairs and farmers’ markets but am now starting to see them in stores, especially small shops, in place of traditional credit card terminals. This technology allows small businesses and individuals to process payments anywhere, securely and affordably.

The New Lenders

Small business owners are sometimes locked out of conventional loans because large banks cannot always verify their assets or income sources. Online lenders like Kabbage verify income through a business’s electronic accounts, such as PayPal or QuickBooks. The loans, which are generally small, can be used to purchase inventory, meet payroll, or expand a business, and the owner can skip the traditional loan paperwork because assets are verified through existing electronic accounts.

Thoughts

Technology has provided great tools for small businesses and individuals who are often locked out of the standard transaction or loan process, either because they are too small or because they are not profitable for a bank. As more people start small web-based businesses, others are there to help them. Whether their goal is to grow and expand or to continue selling quality products to a few individuals, the infrastructure and processes are already in place to help them pursue their dream.

If you are a small business owner and you have used any electronic services to process payments for sales to individuals, let me know about your experience. Is technology powering future commerce? Will traditional banks and lenders join in on this innovation or will it pass them by? Let me know your thoughts.

About Kelly Brown

Kelly Brown is an IT professional and assistant professor of practice for the UO Applied Information Management Master’s Degree Program. He writes about IT and business topics that keep him up at night.


People of Ability

In my volunteer positions I have worked with youth of different abilities, often called disabilities. These youth may be mentally, physically, or emotionally disabled. Despite these disabilities, they contribute greatly in various ways and have taught me many lessons that I carry with me. In this blog post I will highlight some stories of people with disabilities who contribute to the field of information technology.

Meticulon Consulting

I have worked with people on the autism spectrum who are excellent programmers. They are methodical, meticulous, and often very creative. They like routine work and excel at logic challenges such as coding. In a recent article, Meticulon Consulting, a Canadian technology firm, was highlighted for hiring people with autism. Their experience has been the same as mine: the people they hired were meticulous, hardworking, and loyal to the firm. Meticulon co-founder Garth Johnson makes the point that he is not hiring people with disabilities out of sympathy but because it makes good business sense. Johnson said, “I’m not interested in this as a charity. If we can’t prove business value, then I don’t view it as sustainable for our employees, either our typically enabled or our people with autism.” Other companies cited in the article are coming to the same realization: it makes good business sense to hire people with disabilities.

SAP

The German software giant SAP shares that experience. Its goal is to have one percent of its workforce come from the autism community by 2020. This goal came out of a project with the Autism Society of India after SAP programmers created software to help children with autism communicate better. The project was successful, so employees proposed a pilot project to hire adults with autism. SAP recognized that these new employees bring a different perspective and a fresh set of eyes. Jose Velasco, an SAP executive and head of the Autism at Work program, said, “There is a skill set people on the spectrum are bringing that has business value.”

Physical Disabilities

In our AIM Program course, Information Systems and Management, we talk about the stereotype of technology workers who are more comfortable with computers than with people. Whether the stereotype is valid or not, it has nothing to do with physical abilities. I have worked with people with hearing or vision impairments or other disabilities who love technology as much as I do. An employer may need to make some accommodations for them, but in my experience it is worth the effort; they bring a rich skill set and a unique perspective to a project or an organization.

Thoughts

I believe that we need contributions from people of all abilities in order to build a strong and complete team. We all bring different skills and experiences to our work, so the fact that we don’t all think alike or move the same way should not make a difference. I would like to hear about your experiences working with people of different abilities. Are there benefits or drawbacks? Let me know.

About Kelly Brown

Kelly Brown is an IT professional and assistant professor of practice for the UO Applied Information Management Master’s Degree Program. He writes about IT and business topics that keep him up at night.


Back to the Future: Hits and Misses

Photo by Erin Cadigan

In the film “Back to the Future Part II,” the characters use a time machine to travel 30 years into the future, from 1985 to October 21, 2015. I am a fan of the trilogy, so I have been thinking about how accurately the filmmakers portrayed our current year. There were some hits and some misses.

Flying Cars

While we do not yet have commercially available flying cars, we do have working autonomous vehicles. Google has prototypes driving on city streets in California. Toyota, Volvo, Audi, Mercedes-Benz, Apple, and Tesla are also developing self-driving cars. Apparently it is easier to develop a self-driving car than a flying car. Now, if only we could develop an autonomous flying car, that would be really cool.

Hoverboards

In the movie, the main character rides a levitating skateboard, which he calls a hoverboard. We do have those, although they are not in mass production. Lexus recently demonstrated a hoverboard, timed partly to coincide with the date in the movie, and it may be a first step toward the company’s goal of developing a levitating car. If it succeeds, we could someday have flying cars, but they would not fly high in the air like those in “Back to the Future” or “The Jetsons.”

Fax Machines in Every Room

In one scene of the movie, a home has multiple fax machines. I think we have moved past this technology. Fax machines are still available as standalone machines or integrated into scanner/printers, but faxing has largely been replaced by other electronic communication methods. Now we have screens in every room and in every hand.

Large Screen Advertising

When the main character arrives in the future, outdoor advertising is everywhere on large screens, almost to the point of distraction. I think we have this one covered. I can drive down the highway now and see full-color video on billboards. In 1985, who would have thought we would have 60-inch high-definition televisions in our homes? In terms of screen size, we subscribe to the “bigger is better” philosophy. The largest current sports arena screen is the Jumbotron in Houston, which measures 52 feet high and 277 feet wide. We have definitely figured out how to make large displays.

Thoughts

Some things, such as flying cars, have been anticipated since the 1950s, but we haven’t quite perfected them. Other predictions are already old school. I wonder whether movie scripts mimic our ingenuity and development, or whether it is the other way around. If we were to make a movie today portraying 2045, what would it look like? Will we all still be walking around looking at six-inch screens, or will we have integrated our viewing into wearables such as glasses and holographic projections? What do you predict for the future? Let me know your thoughts, and I will circle back in 2045 to see if you are right.

About Kelly Brown

Kelly Brown is an IT professional and assistant professor of practice for the UO Applied Information Management Master’s Degree Program. He writes about IT and business topics that keep him up at night.


Optimism Bias: How Your Half-Full Glass Leaves You Vulnerable

There is a cognitive phenomenon called the optimism bias that leads humans to think that the likelihood of a negative event is lower than it really is. This is great when we are battling the day-to-day stresses of our lives but not so good when trying to plan for unexpected risks. In this blog post I will explore how the optimism bias can affect risk management.

Tigger or Eeyore

In a recent article for the Nonprofit Risk Management Center, Erin Gloeckner describes different personality types as Tiggers, people who are always positive and bouncy, and Eeyores, those who are always down and negative. In reality, most of us fall somewhere in between but tend to have an optimism bias. As I have mentioned in previous posts, I tend to be an uber-Tigger, and that can get me into trouble when determining the likelihood of failure or disaster. I was once asked to develop potential disaster scenarios for a project so that I could mitigate any risks associated with those scenarios. Try as I might, I could not come up with any realistic scenarios that involved failures. I recognized my own bias toward optimism and asked for help from a project member I knew had a negative bias. That person was able to develop many different disaster scenarios, and we created risk mitigation plans to counter each of them. True to my optimistic worldview, none of those scenarios ever happened, but we were prepared nonetheless.

Business Planning

When managing information, it is important to have a realistic sense of your security risks. Start by preparing honest answers to these questions:

  • What are the chances of a security breach that leads to leaked confidential information?
  • What are the chances of a natural disaster that affects the operations of my organization?
  • What are the chances that I will lose a key person in my organization, at least temporarily?

It is important to have plans in place to counter the various threats that can arise in the course of doing business. Storms don’t stay away forever, and key people don’t stay in one position their whole lives. We can lessen the impact of these events by planning for them.

Personal Planning

I have talked about this in past blog posts, but I think it is also important to evaluate potential risks in our personal lives. Ask yourself:

  • What are the chances that I could lose my current job?
  • What are the chances that I could suffer health problems?
  • What are the chances that a natural disaster could affect me or my family?

While it is not good to dwell on these scenarios to the point of distraction, knowing that you have planned to mitigate these risks will give you peace of mind. Mitigation strategies should include keeping your skills and education up to date, exercising to fend off avoidable health problems, and setting aside money to cover unforeseen financial problems. Just as you plan for business disruptions, you can also plan for personal setbacks. These plans can help you sleep at night and be a Tigger all day.

Thoughts

If you are interested in learning more about the optimism bias, there is an excellent 2012 TED talk by Tali Sharot that covers the topic. Whether you tend to be an Eeyore or a Tigger, it is important to recognize your biases as you make plans for your business and your life. Do you already know your personal biases? Let me know your thoughts.

About Kelly Brown

Kelly Brown is an IT professional and assistant professor of practice for the UO Applied Information Management Master’s Degree Program. He writes about IT and business topics that keep him up at night.


Will Computer Science Displace Classic Education?

I believe that technology is now a routine part of our lives, and I have been thinking lately about how much effort we should spend educating young students about computers. I read an article that highlighted a push to make computer science mandatory in German schools. My question is, has technology become so commonplace that we treat it like running water and electricity, or can it still provide a competitive advantage for a community or a nation?

Keeping up on Technology

One concern of German lawmakers, shared by officials in other countries, is that their students will fall behind and not be able to fill future technology jobs. According to the head of the German digital industry group Bitkom:

“IT skills are now as important as the basics. Digitisation determines our everyday lives more and more, for leisure time as well as for work. Schools must teach about media literacy beyond the classroom and give students a firm grasp of IT technologies.”

Suddenly, the tech kids are the cool ones in school. This follows the recent emphasis in schools on science, technology, engineering, and math (STEM). The theory is that, partly because of the proliferation of technology, the best and most advanced jobs will go to those who are trained in these areas.

Code.org

In a blog post last year I highlighted Code.org, an organization that believes “every student in every school should have the opportunity to learn computer science.” It is working to increase access to computer science curriculum, particularly for women and students of color. Just as the lawmakers in Germany are advocating, Code.org believes that computer science should be part of the core curriculum in schools, alongside biology, algebra, and chemistry. While I agree that computer science is important as part of a STEM curriculum, I wonder which classes we should drop to make room for it.

Curriculum Replacement

A recent PBS article highlighted a similar push to introduce coding courses in schools in Australia. Computer science curriculum, according to the article, will replace geography and history courses. I am sure that the change will generate a lot of debate around the virtues of a classic education versus a more modern education. It leaves the door open for ongoing conversations around curriculum mix and what students actually need to succeed in the future.

Thoughts

To circle back to my original question, is it necessary to add specific computer science curriculum to schools? Or has technology become so pervasive that everyone knows how to use it, but only a few need to be able to create new and unique applications? In the same vein, should we introduce mandatory physics courses as well, to better understand the underlying hardware? Finally, which courses would you replace? As you look back on your education and career, which classes have shaped you the most, and why? Let me know your thoughts.

About Kelly Brown

Kelly Brown is an IT professional and assistant professor of practice for the UO Applied Information Management Master’s Degree Program. He writes about IT and business topics that keep him up at night.


Trends in Higher Education

The Boston Consulting Group recently published an article that did a good job highlighting emerging trends in higher education. I want to examine the convergence of several of those trends and how I think technology will play a part in shaping that future.

Funding

State colleges and universities have long relied on government subsidies to keep tuition at a manageable rate and to fund the research and activities associated with the school. In recent years the amount of funding coming from the states has dropped as they struggle to balance their own budgets. The shortfall is made up through increased tuition and grants, as well as targeted campaigns aimed at private and corporate donors. Increased tuition is problematic because of the large debt graduates are accumulating. A recent article in U.S. News & World Report detailed how some graduates are carrying student loan debt into their forties, which means they cannot help their children start their own academic careers. The result is that the children assume their own debt, which continues the cycle. Generating alternative funding sources or containing operational costs could help break that cycle.

Competition

There are more education options available to students. Schools across the country, and even some international schools, are offering attractive incentives to reel in young scholars who might otherwise attend their state university. There’s also been a spike in online curriculum and for-profit schools. In this competitive environment universities must target the right prospective students and then lure them in. With the drop in state funding mentioned above, many universities are pursuing more international students, who pay a higher tuition. All of this requires a targeted, intelligent marketing campaign.

Increased Research

Partnerships with private industry are helping universities increase their research efforts. These partners provide funds for sophisticated research, the results of which can be licensed back to the partner or sold outright. Top-notch students and faculty are drawn to such projects, industry gains new business ideas and opportunities, and students and potential employers are brought together.

Thoughts

Colleges and universities are facing pressure from increased competition, uncertain funding, and the push to accelerate and capitalize on research. Here are ways that I think technology can help alleviate that pressure:

  • Social Media. Universities are increasing their use of social media to reach a tech-savvy generation from around the globe. Advances in web and media technologies, as well as analytics, help schools target the right audiences and markets.
  • Big Data and Business Analytics. The ability to quickly analyze large amounts of prospective student data helps colleges narrow their search for potential students. By identifying and targeting particular demographics, schools can reduce marketing costs and increase the efficiency of their search campaigns.
  • Collaboration Software. Research partnerships are no longer just with the company down the street. Partners can be thousands of miles away, so it is important that schools and private enterprises can communicate, catalog, and analyze research results in a systematic and predictable way. Collaboration applications can help keep researchers informed and successful.

While colleges and universities are facing funding and competition pressures, there are technologies that can help lessen those concerns and lead to new knowledge and discoveries. I hope this post spurs your thoughts on other ways that technology can help, or is already helping, higher education.

About Kelly Brown

Kelly Brown is an IT professional and assistant professor of practice for the UO Applied Information Management Master’s Degree Program. He writes about IT and business topics that keep him up at night.
