The End of Cyber Monday

As I write this blog entry, we are still two weeks away from Thanksgiving, Black Friday, and Cyber Monday. While I believe the first two will continue into the future, I think the term Cyber Monday has become irrelevant, largely due to technology changes, and will end this year. In this post I will lay out my reasoning for predicting its demise and invite you to give me feedback as to whether you believe Cyber Monday is doomed.


The term Cyber Monday was coined in 2005 by Shop.org, the digital arm of the National Retail Federation. Shop.org also runs CyberMonday.com, a website where participating retailers share their Cyber Monday shopping deals. The term refers to the Monday after Black Friday, supposedly the day when the most online Christmas shopping is done. That was not true in 2005, but it was by 2010. Now it is only one of many large online shopping days reaching back into mid-October.


I believe that the biggest threat to Cyber Monday is technology. The theory was that people would go to work on the Monday after the long Thanksgiving weekend and purchase all of their remaining Christmas items online using the faster company internet connection. That reasoning is now irrelevant for two reasons:

  1. Home internet connections are now fast enough to stream digital content such as movies, so they are more than adequate for shopping (see the rough comparison after this list).
  2. More people now shop from a mobile device such as a smartphone or tablet, so they are not tied to a home or office internet connection.
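
To put rough numbers behind the first point, here is a quick back-of-the-envelope comparison in Python; the connection speed, streaming rate, and page size are assumptions for illustration only:

    # Rough bandwidth comparison: HD streaming vs. loading a shopping page.
    # The 5 Mbps streaming figure is a common rule of thumb; the page size
    # and home connection speed are assumed for illustration.
    stream_mbps = 5.0    # typical rate for an HD video stream
    page_mb = 2.0        # assumed size of a product page
    home_mbps = 25.0     # a modest home broadband connection

    seconds = page_mb * 8 / home_mbps  # megabytes -> megabits, then divide
    print(f"A {page_mb} MB product page loads in about {seconds:.1f} s at {home_mbps} Mbps")
    print(f"HD streaming needs about {stream_mbps} Mbps, well within the same link")

If a home line can carry a movie, it can certainly carry a shopping cart, which is the whole argument in miniature.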

The term “showrooming” was coined to define the practice of visiting a store to view merchandise before ordering it online at a lower price. Best Buy has been referred to as the showroom for Amazon. In theory, you could even stand in a brick-and-mortar store and order the same product online through your smartphone. I think this practice will decline as we get closer to price parity between online and traditional retailers.

Deal-tracking websites and apps have taken the steam out of Cyber Monday by advertising a wide range of retail deals 365 days a year. I can define my product search and get alerts on the best price and retailer, regardless of whether it is Thanksgiving weekend.
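
As a minimal sketch of how such a price watch might work, consider the Python below; fetch_price() is a hypothetical stand-in for whatever retailer or deal-site API you actually have access to, and the prices are invented:

    # Hypothetical price-watch sketch. fetch_price() stands in for a real
    # retailer or deal-site API; the sample prices are invented.
    SAMPLE_PRICES = {"RetailerA": 249.99, "RetailerB": 229.00, "RetailerC": 259.95}

    def fetch_price(retailer, product):
        return SAMPLE_PRICES[retailer]  # replace with a real API call

    def best_deal(product, target_price):
        quotes = {r: fetch_price(r, product) for r in SAMPLE_PRICES}
        retailer, price = min(quotes.items(), key=lambda kv: kv[1])
        if price <= target_price:
            print(f"Alert: {product} at {retailer} for ${price:.2f}")
        return retailer, price

    best_deal("noise-cancelling headphones", 235.00)

Run daily, a loop like this makes every day a potential Cyber Monday, which is exactly why the original Monday matters less.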

Timing is everything

Retailers are creating shopping events earlier and earlier. I can already see “leaked” Black Friday ads from several retailers even though Thanksgiving is still several days away and Christmas is more than a month away. Soon we could have our Christmas shopping done in September, eliminating the whole holiday rush of late November and early December.


I realize that retailers will continue to roll out special deals on certain days like Black Friday and Cyber Monday, but I think that technology advances and the way that we choose to do business will make these exclusive days less of a bargain.

Am I just being a Scrooge or am I on to something? Is technology changing how and when we shop? Has Cyber Monday become irrelevant? Let me know your thoughts.

About Kelly Brown

Kelly Brown is an IT professional and assistant professor of practice for the UO Applied Information Management Master’s Degree Program. He writes about IT and business topics that keep him up at night.


The Future of Financial Technology: Small Business Transactions

I have wondered about the technology that powers the economy through all of the electronic transactions we make every day. It seems to be transparent and behind the scenes until something goes wrong or a system goes offline, and then it is obvious. It has to be robust to correctly handle the millions of daily transactions. I know that the way I work with financial institutions is very different from what it was even just a few years ago. What exactly is powering all of this?

Crowdfunding Payments

In the new world of crowdfunding, there are millions of online credit card transactions involved, some for as little as $5. A number of startups have stepped in to process these transactions using affordable technology. WePay works with small businesses and crowdfunding companies, acting as the payment processor between individual buyers and sellers. They charge a 2.9% transaction fee, which is competitive, and have turned a nice profit.
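
For a sense of what that rate means at crowdfunding scale, here is a quick calculation in Python; note that many processors also add a flat per-transaction fee on top of the percentage, which I leave out here:

    # What a 2.9% processing fee means for pledges of various sizes.
    # Illustrative only; a flat per-transaction fee, common in the
    # industry, is omitted here.
    FEE_RATE = 0.029

    for pledge in (5.00, 25.00, 100.00):
        fee = round(pledge * FEE_RATE, 2)
        print(f"${pledge:.2f} pledge -> ${fee:.2f} fee, ${pledge - fee:.2f} to the seller")

On a $5 pledge the processor's cut is pennies, so the business only works if the technology keeps the cost of each transaction near zero.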

Mobile Payments

Square has made mobile payments easier through a device that connects to a cellphone and processes credit card payments without a traditional hard-wired connection to the bank. I first saw these devices at craft fairs and farmers’ markets but am now starting to see them in stores, especially small shops, in place of traditional credit card terminals. This technology allows small businesses and individuals to process payments on the go, securely and affordably.

The New Lenders

Small business owners are sometimes locked out of conventional loans because large banks cannot always verify their assets or income sources. Online lenders like Kabbage verify income through a business’s electronic accounts, such as PayPal or QuickBooks. The loan can be used to purchase inventory, meet payroll, or expand a business. The loans are generally small, and the business owner can skip the traditional loan paperwork by verifying assets through existing electronic accounts.
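
To illustrate the idea, here is a minimal sketch of how income verification from electronic accounts might work; the numbers, thresholds, and approval rule are all invented for illustration, since real underwriting models are proprietary and far more involved:

    # Hypothetical sketch of verifying income from electronic accounts.
    # Thresholds and the approval rule are invented for illustration.
    monthly_deposits = [4200, 3900, 4600, 4100, 4800, 4400]  # e.g., from PayPal

    avg = sum(monthly_deposits) / len(monthly_deposits)
    steady = min(monthly_deposits) >= 0.7 * avg  # no month far below average

    if steady and avg >= 4000:
        credit_line = round(avg * 0.5, -2)  # offer roughly half a month's revenue
        print(f"Pre-approved for a ${credit_line:,.0f} line of credit")
    else:
        print("Not enough verifiable income history yet")

The appeal is that the data already exists in the business's own accounts, so the "paperwork" is a query rather than a stack of forms.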


Technology has provided great tools for small businesses and individuals who are often locked out of the standard transaction or loan process either because they are too small or they are not profitable for a bank. As more people start small web-based businesses, others are there to help them. Whether their goal is to grow and expand or to continue selling quality products to a few individuals, the infrastructure and processes are already in place to help them pursue their dream.

If you are a small business owner and you have used any electronic services to process payments for sales to individuals, let me know about your experience. Is technology powering future commerce? Will traditional banks and lenders join in on this innovation or will it pass them by? Let me know your thoughts.

About Kelly Brown

Kelly Brown is an IT professional and assistant professor of practice for the UO Applied Information Management Master’s Degree Program. He writes about IT and business topics that keep him up at night.


People of Ability

In my volunteer positions I have worked with youth of different abilities, often called disabilities. These youth may be mentally, physically, or emotionally disabled. Despite these disabilities, they contribute greatly in various ways and have taught me many lessons that I carry with me. In this blog post I will highlight some stories of people with disabilities who contribute to the field of information technology.

Meticulon Consulting

I have worked with people on the autism spectrum who are excellent programmers. They are methodical, meticulous, and often very creative. They like routine work and excel at logic challenges such as coding. In a recent article, Meticulon Consulting, a Canadian technology firm, was highlighted for hiring people with autism. Their experiences were the same as mine. The people they hired were meticulous, hardworking, and loyal to the firm. Meticulon co-founder Garth Johnson makes the point that he is not hiring people with disabilities out of sympathy but because it makes good business sense. Johnson said, “I’m not interested in this as a charity. If we can’t prove business value, then I don’t view it as sustainable for our employees, either our typically enabled or our people with autism.” Other companies cited in the article are coming to the same realization. It makes good business sense to hire people with disabilities.


The German software giant SAP shares that experience. Their goal is to have one percent of their workforce come from the autism community by 2020. This goal came out of a project with the Autism Society of India, after SAP programmers created software to help children with autism communicate better. The project was successful, so the employees proposed a pilot project to hire adults with autism. SAP recognized that these new employees come with a different perspective and a fresh set of eyes. Jose Velasco, an SAP executive and head of the Autism at Work program, said, “There is a skill set people on the spectrum are bringing that has business value.”

Physical Disabilities

In our AIM Program course, Information Systems and Management, we talk about the stereotype of technology workers who are more comfortable with computers than with people. Whether the stereotype is valid or not, it has nothing to do with physical abilities. I have worked with people with hearing or vision impairments or other disabilities who love technology as much as I do. An employer may need to make some accommodations for them, but in my experience it is worth the effort; they bring a rich skillset and unique perspective to a project or an organization.


I believe that we need contributions from people of all abilities in order to make a strong and complete team. We all bring different skills and experiences to our work so the fact that we don’t all think alike or move the same should not make a difference. I would like to hear about your experiences working with people with different abilities. Are there benefits or drawbacks? Let me know.

About Kelly Brown

Kelly Brown is an IT professional and assistant professor of practice for the UO Applied Information Management Master’s Degree Program. He writes about IT and business topics that keep him up at night.


Back to the Future: Hits and Misses

Photo by Erin Cadigan

In the 1989 film “Back to the Future Part II,” the characters travel 30 years into the future, to October 21, 2015. I am a fan of the trilogy, so I have been thinking about how accurately the film portrayed our current year. There were some hits and some misses.

Flying Cars

While we do not yet have commercially available flying cars, we do have working autonomous vehicles. Google has prototypes driving on city streets in California. Toyota, Volvo, Audi, Mercedes-Benz, Apple, and Tesla are also developing self-driving cars. Apparently it is easier to develop a self-driving car than a flying car. Now, if only we could develop an autonomous flying car, that would be really cool.


In the movie, the main character rides a levitating skateboard, which he calls a hoverboard. We do have those, although they are not in mass production. Lexus recently demonstrated a hoverboard, partly to coincide with the date in the movie, which may be the first step toward their goal of developing a levitating car. If they succeed, we could someday have flying cars, but they wouldn’t fly high up in the air like in “Back to the Future” or “The Jetsons.”

Fax Machines in Every Room

In one scene of the movie, the film shows a home with multiple fax machines. I think we have moved past this technology. Fax machines are still available as standalone machines or integrated into scanner/printers, but faxing has largely been replaced by other electronic communication methods. Now we have screens in every room and in every hand.

Large Screen Advertising

When the main character arrives in the future, there is outdoor advertising everywhere on large screens, almost to distraction. I think we have this one covered. I can drive down the highway now and see full-color video on billboards. In 1985 who would have thought we would have 60-inch high definition televisions in our homes? In terms of screen size, we subscribe to the “bigger is better” philosophy. Among the largest stadium screens are the boards in Houston, which measure 52 feet high and 277 feet wide. We have definitely figured out how to make large displays.
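
For a sense of scale, here is a quick calculation of how many 60-inch living-room TVs it would take to cover that Houston board by area; the TV dimensions are derived from a standard 16:9 aspect ratio:

    import math

    # How many 60-inch TVs would it take, by area, to cover the Houston board?
    diag_in = 60.0
    w_in = diag_in * 16 / math.hypot(16, 9)  # about 52.3 inches wide
    h_in = diag_in * 9 / math.hypot(16, 9)   # about 29.4 inches tall
    tv_sqft = (w_in * h_in) / 144            # square inches -> square feet

    board_sqft = 52 * 277                    # 14,404 square feet
    print(f"One 60-inch TV covers about {tv_sqft:.1f} sq ft")
    print(f"The board covers {board_sqft:,} sq ft, about {board_sqft / tv_sqft:,.0f} TVs")

That works out to well over a thousand living-room sets in a single display.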


Some things, such as flying cars, have been anticipated since the 1950s, but we haven’t quite perfected them. Other predictions are already old school. I wonder if movie scripts mimic our ingenuity and development, or is it the other way around? If we were to make a movie today portraying 2045, what would it look like? Will we all still be walking around looking at six-inch screens or will we have integrated our viewing into wearables such as glasses and holographic projections? What do you predict for the future? Let me know your thoughts and I will circle back in 2045 to see if you are right.

About Kelly Brown

Kelly Brown is an IT professional and assistant professor of practice for the UO Applied Information Management Master’s Degree Program. He writes about IT and business topics that keep him up at night.


Optimism Bias: How Your Half-Full Glass Leaves You Vulnerable

There is a cognitive phenomenon called the optimism bias that leads humans to think that the likelihood of a negative event is lower than it really is. This is great when we are battling the day-to-day stresses of our lives but not so good when trying to plan for unexpected risks. In this blog post I will explore how the optimism bias can affect risk management.

Tigger or Eeyore

In a recent article for the Nonprofit Risk Management Center, Erin Gloeckner describes different personality types as Tiggers, or people who are always positive and bouncy, and Eeyores, those who are always down and negative. In reality, most of us fall somewhere in between but tend to have an optimism bias. As I have mentioned in previous posts, I tend to be an uber-Tigger, and that can get me into trouble when determining the likelihood of failure or disaster. I was once asked to develop potential disaster scenarios for a project so that I could mitigate any risks associated with those scenarios. Try as I might, I could not come up with any realistic scenarios that involved failures. I realized my own bias toward optimism and asked for help from a project member I knew to have a negative bias. That person was able to develop many different disaster scenarios, and we created risk mitigation plans to counter each of them. True to form, none of those scenarios ever happened, but we were prepared nonetheless.

Business Planning

When managing information, it is important to have a realistic, rather than optimistic, sense of the risks. Start by preparing honest answers to these questions:

  • What are the chances of a security breach that leads to leaked confidential information?
  • What are the chances of a natural disaster that affects the operations of my organization?
  • What are the chances that I will lose a key person in my organization, at least temporarily?

It is important to have plans in place to counter the various threats that can arise in the course of doing business. Storms don’t stay away forever, and key people don’t stay in one position their whole lives. We can lessen the impact of these events by planning for them.
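
One simple way to act on those questions is to score each threat by rough likelihood and impact and rank the results. Here is a toy example in Python; the probabilities and dollar figures are placeholders to show the mechanics, not real estimates:

    # Toy risk register: rank threats by likelihood x impact.
    # Probabilities and dollar impacts are placeholders, not real estimates.
    risks = [
        ("Security breach leaks confidential data", 0.10, 500_000),
        ("Natural disaster halts operations",       0.02, 2_000_000),
        ("Key person leaves, at least temporarily", 0.25, 150_000),
    ]

    for name, likelihood, impact in sorted(risks, key=lambda r: r[1] * r[2], reverse=True):
        print(f"{name}: expected annual loss ${likelihood * impact:,.0f}")

Even placeholder numbers force a Tigger to write down what could go wrong, which is most of the battle.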

Personal Planning

I have talked about this in past blog posts but I think it is also important to evaluate potential risks in our personal lives. Ask yourself:

  • What are the chances that I could lose my current job?
  • What are the chances that I could suffer health problems?
  • What are the chances that a natural disaster could affect me or my family?

While it is not good to dwell on these scenarios to the point of distraction, it will give you peace of mind to know that you have planned to mitigate risks. These mitigation strategies should include making sure your skills and education are up to date and that you are exercising in order to fend off avoidable health problems. Set aside money to counter any unforeseen financial problems. Just as you plan for business disruptions, you can also plan for personal issues. These plans can help you sleep at night and be a Tigger all day.


If you are interested in learning more about the optimism bias, there is an excellent 2012 TED talk by Tali Sharot that covers the topic. Whether you tend to be an Eeyore or a Tigger, it is important to recognize your biases as you make plans for your business and your life. Do you already know your personal biases? Let me know your thoughts.

About Kelly Brown

Kelly Brown is an IT professional and assistant professor of practice for the UO Applied Information Management Master’s Degree Program. He writes about IT and business topics that keep him up at night.


Will Computer Science Displace Classic Education?

I believe that technology is now a routine part of our lives and I have been thinking lately about how much effort we should spend educating young students about computers. I read an article that highlighted a push to make computer science mandatory in German schools. My question is, has technology become so commonplace that we treat it like running water and electricity, or can it still provide a competitive advantage for a community or a nation?

Keeping up on Technology

One of the concerns of German lawmakers, which is shared by officials from other countries, is that their students will fall behind and not be able to fill future technology jobs. According to the head of German digital industry group Bitkom:

“IT skills are now as important as the basics. Digitisation determines our everyday lives more and more, for leisure time as well as for work. Schools must teach about media literacy beyond the classroom and give students a firm grasp of IT technologies.”

Suddenly, the tech kids are the cool ones in school. This follows the recent emphasis in schools on science, technology, engineering, and math (STEM). The theory is that, partly because of the proliferation of technology, the best and most advanced jobs will go to those who are trained in those areas.

In a blog post last year I highlighted Code.org, an organization that believes “every student in every school should have the opportunity to learn computer science.” They are working to increase access to computer science curriculum, particularly for women and students of color. Just as the lawmakers in Germany are advocating, Code.org believes that computer science should be part of the core curriculum in schools, alongside biology, algebra, and chemistry. While I agree that computer science is important as part of a STEM curriculum, I wonder which classes we should drop to make room for it.

Curriculum Replacement

A recent PBS article highlighted a similar push to introduce coding courses in schools in Australia. Computer science curriculum, according to the article, will replace geography and history courses. I am sure that the change will generate a lot of debate around the virtues of a classic education versus a more modern education. It leaves the door open for ongoing conversations around curriculum mix and what students actually need to succeed in the future.


To circle back to my original question, is it necessary to add specific computer science curriculum to schools? Or has technology become so pervasive that everyone knows how to use it, but only a few need to be able to create new and unique applications? In the same vein, should we introduce mandatory physics courses as well, to better understand the underlying hardware? Finally, which courses would you replace? As you look back on your education and career, which classes have shaped you the most and why? Let me know your thoughts.

About Kelly Brown

Kelly Brown is an IT professional and assistant professor of practice for the UO Applied Information Management Master’s Degree Program. He writes about IT and business topics that keep him up at night.


Trends in Higher Education

The Boston Consulting Group published an article recently that highlighted trends in higher education. The piece did a good job of covering the trends that are emerging. I want to examine the convergence of several of them and how I think technology will play a part in shaping that future.


State colleges and universities have long relied on government subsidies to keep tuition at a manageable rate and fund all of the research and activities associated with the school. In recent years the amount of funding coming from the states has dropped as they struggle to balance their own budgets. The shortfall is made up through increased tuition and grants as well as targeted campaigns aimed at private and corporate donors. Increased tuition is problematic due to the large debt graduates are accumulating. A recent article in U.S. News & World Report detailed how some graduates are carrying student loan debt into their forties, which means they cannot help their children start academic careers. The result is that the children are assuming their own debt, which continues the cycle. Generating alternative funding sources or containing operational costs could help break that cycle.


There are more education options available to students. Schools across the country, and even some international schools, are offering attractive incentives to reel in young scholars who might otherwise attend their state university. There’s also been a spike in online curriculum and for-profit schools. In this competitive environment universities must target the right prospective students and then lure them in. With the drop in state funding mentioned above, many universities are pursuing more international students, who pay a higher tuition. All of this requires a targeted, intelligent marketing campaign.

Increased Research

Partnerships with private industry are helping universities increase their research efforts. These partners provide funds for sophisticated research, the results of which can be licensed back to the partner or sold outright. Top-notch students and faculty are drawn to such projects, industry gains new business ideas and opportunities, and students and potential employers are brought together.


Colleges and universities are facing pressure from increased competition, uncertain funding, and the push to accelerate and capitalize on research. Here are ways that I think technology can help alleviate that pressure:

  • Social Media. Universities are increasing their use of social media to reach a tech savvy generation from around the globe. Advances in web and media technologies as well as analytics help schools target the right audiences and markets.
  • Big Data and Business Analytics. The ability to quickly analyze large amounts of prospective student data helps colleges narrow their search for potential students. By identifying and targeting particular demographics, schools can reduce marketing costs and increase the efficiency of their search campaigns (a toy example follows this list).
  • Collaboration Software. Research partnerships are no longer just with the company down the street. Partners can be thousands of miles away, so it is important that schools and private enterprises can communicate, catalog, and analyze research results in a systematic and predictable way. Collaboration applications can help keep researchers informed and successful.
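
As a toy illustration of the big data point above, here is a sketch of scoring prospective students for a recruiting campaign; the fields, weights, and priorities are all invented for illustration:

    # Toy scoring of prospective students for a recruiting campaign.
    # Fields, weights, and priorities are invented for illustration.
    prospects = [
        {"name": "A", "gpa": 3.8, "major": "computer science", "region": "international"},
        {"name": "B", "gpa": 3.2, "major": "history",          "region": "in-state"},
        {"name": "C", "gpa": 3.6, "major": "computer science", "region": "out-of-state"},
    ]

    def fit_score(p):
        score = p["gpa"]
        if p["major"] == "computer science":
            score += 0.5   # say the campaign promotes a new CS program
        if p["region"] == "international":
            score += 0.3   # say international tuition is a revenue priority
        return score

    for p in sorted(prospects, key=fit_score, reverse=True):
        print(f"{p['name']}: {fit_score(p):.2f}")

Real campaigns run far richer models over far more data, but the principle is the same: rank, target, and spend marketing dollars where they are most likely to pay off.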

While colleges and universities are facing funding and competition pressures, there are technologies that can help lessen those concerns and lead to new knowledge and discoveries. I am hoping this post spurs your thoughts on other ways that technology can or is helping higher education.

About Kelly Brown

Kelly Brown is an IT professional and assistant professor of practice for the UO Applied Information Management Master’s Degree Program. He writes about IT and business topics that keep him up at night.


The Double-Edged Sword of Information Availability

I recently came across the Harvard Genome Project, for which a team of Harvard researchers is collecting personal genome information to share with researchers who hope to create breakthroughs in disease eradication and prevention. It struck me that with our ability to share information and make it available to different groups, either intentionally or unintentionally, we have created a double-edged sword. On the one hand, technology has greatly expanded research opportunities and created the infrastructure to track down long-lost relatives. On the other hand, our privacy may be jeopardized if that research information falls into the wrong hands or if a long-lost relative prefers to stay lost. Is the genie out of the bottle, or are we still in control of the exabytes of information in the cloud, some of it personal?

Research for a Brighter Tomorrow

The Internet that we know today was born as the ARPANET, under contract to the U.S. Advanced Research Projects Agency (ARPA). Its original intent was to connect research facilities to share information. In December 1969, the Stanford Research Institute, the University of California Santa Barbara, the University of California Los Angeles, and the University of Utah were connected to collaborate and advance research. By 1971, several other prominent universities, private research firms, and government agencies had joined ARPANET, extending its geographical reach well beyond the southwestern U.S. The original Internet was intended to further scientific research, not to share cat videos. In that vein, the Harvard project exemplifies the positive aspects of information sharing.

Technology and Democracy

Before we were all connected by technology, there was radio and television, which are “one to many” media. One broadcast, such as the nightly news or a presidential fireside chat, went out to those who chose to listen or watch. There was no way to give feedback or to refute what might be misinformation. Now people around the world can share real time information on developing stories; we no longer have to wait until the five o’clock news or place complete trust in the newscaster.

We can also take on the role of broadcaster. We can participate more deeply in the democratic process by speaking out on the issues of the day and joining with others to have an impact on legislation that affects our lives. Whether we live in the safety of the U.S. or in a war-ravaged country, we have a voice, and it can be heard, thanks to technology.

The downside is the ability to spread misinformation. It is important that we choose carefully the news sources that we trust. The Onion has made a sport of parodying trending news, but its articles are sometimes quoted as fact. It is up to each one of us to distinguish truth from fiction.

The Privacy Issue

I wrote a blog post in July highlighting the breach of private information submitted to the website Ashley Madison. Users expected their personal information to remain private, but hackers who broke into the site published that information. This is where I wonder if the genie is out of the bottle and any information we choose to share, be it our genome data, private photos, our current location, or politically sensitive information, should be considered potentially public. Would we conduct ourselves online differently if we expected our information to go public? Would we be more careful?


Technology advances have allowed us to share research, information, product reviews, political news, or even to find each other. I believe though that with this new power and connectivity comes responsibility that we sometimes take lightly. We need to approach this new world with eyes wide open. Let me know your thoughts.

About Kelly Brown

Kelly Brown is an IT professional and assistant professor of practice for the UO Applied Information Management Master’s Degree Program. He writes about IT and business topics that keep him up at night.


Technology Refresh or Addiction?

Apple recently introduced the iPhone 6s and 6s Plus. At the same time they introduced a payment plan that includes automatic replacement when a newer version of the iPhone comes out, presumably once a year, and insurance should the phone break before the new model emerges. According to the Apple website:

“Getting a new iPhone every year is easy. After 12 installments, you can get a new iPhone and start a new iPhone Upgrade Program. No more waiting for your carrier contract to end. Just trade in your current iPhone for a new one, and your new program begins.”

The phone is paid off in 24 installments, which means that you always get a new phone before the old one is paid off. I have two questions. First, with Apple now financing unlocked phones, does this put them in the driver’s seat and push the carriers back to being simply “pipe” providers? Second, and more important to me, can Apple provide enough of a technology refresh and differentiation that people need a new phone every year?
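
The arithmetic behind that observation is simple. The sketch below uses the roughly $32.41 per month that Apple quoted for a base iPhone 6s at launch; treat that figure as an example rather than current pricing:

    # Sketch of the upgrade-program arithmetic. The $32.41/month figure
    # matches Apple's launch pricing for a base iPhone 6s; treat it as
    # an example rather than current pricing.
    monthly = 32.41
    total_cost = monthly * 24        # full cost over the 24-payment plan
    paid_at_upgrade = monthly * 12   # paid by the month-12 trade-in

    print(f"Full plan cost: ${total_cost:.2f}")
    print(f"Paid at trade-in: ${paid_at_upgrade:.2f} "
          f"({paid_at_upgrade / total_cost:.0%} of the phone)")

In other words, you hand the phone back when you are halfway through paying for it, and the cycle starts over.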

Apple vs. National Carriers

The four national carriers, AT&T, Verizon, T-Mobile, and Sprint, already offer a similar refresh deal by adding a fee to the normal monthly contract. However, those phones are sold and locked by the carrier, so you are bound into a contract with them. When you buy an unlocked phone from Apple, you are free to move around outside of a carrier contract. If other manufacturers follow suit, that drives the carriers away from being phone stores to being monthly service providers. In other words, it relegates them to the same status as the old Regional Bell Operating Companies (RBOCs) with landlines. Coincidentally, Verizon and AT&T have both grown out of the old RBOCs, so we could be right back where we started. I am watching with interest to see how cell providers respond to this challenge from Apple, or whether Apple at some point will make a bid to become its own cell provider, thus cutting out the carriers completely.

How Much is Too Much?

Now, the real question on my mind is this: does a person need a new phone every year, and does it really make their life better? I am interested in your opinion and hope that you will chime in. In full disclosure, I usually end up with a new phone every year, but that is because mine breaks. Apparently you are not supposed to take your phone kayaking or rock climbing. My replacement is usually a cheap $20 Android smartphone, so I never have the latest and greatest, but it does what I need it to do and it fits my frugal nature.

The latest iPhone touts a better screen, a better chipset, faster Wi-Fi, the new 3D Touch, and a better camera. Are the new features that much better than those of the iPhone 6 and 6 Plus introduced a year ago? For some sophisticated consumers the answer is obviously yes. In a recent study at the University of Missouri, researchers found that iPhone separation in some people resulted in anxiety and poorer cognitive performance on tasks requiring close attention. It appears that for some people smartphones have become such an integral part of their lives that they need them nearby in order to perform tasks that don’t even require a smartphone. Perhaps the latest and greatest features do help us live better lives.


Psychology author Michael Clarkson provided a counterargument to the constant technology refresh in a CNN iReport earlier this year, “Escaping Society and my Cell Phone.” In it, he chronicles his attempt to escape a technology-filled world by spending time in his backyard fort.

Whether we refresh our smartphone every year or two or five, technology is having a real impact on how we live and how we interact with others. I believe that we need to examine our own interactions with technology to determine how much is enough, how much is too much, and how often is too often. What is the right balance for you? Is technology a tool or has it become something more? Let me know your thoughts.

About Kelly Brown

Kelly Brown is an IT professional and assistant professor of practice for the UO Applied Information Management Master’s Degree Program. He writes about IT and business topics that keep him up at night.


Technology Trends in Law Enforcement

There have been a lot of technology updates in law enforcement just in the last five years. Some things such as body cameras are controversial due to privacy issues; others such as Tasers are controversial due to the potential for misuse, but can save lives when used instead of a gun to subdue a suspect. This week I will highlight a few of the newest technologies that are used on the beat and in the back room.

Body Cameras

First there were car-mounted cameras, and now more officers are being outfitted with body cameras. The theory is that officers will use greater discretion in their interactions with the public if they know that their actions are recorded, and ideally the public will behave better as well. Granted, the cameras only work if they are turned on, and that is still up to the wearer, but there are also back-end technology issues to deal with. The Los Angeles Police Department has approximately 9,000 officers, so if each officer recorded on average one hour a day, that would be 9,000 hours of video each day that would need to be stored and catalogued. Where is that kept? On a local server or in the cloud? Who is going to extract the exact footage when questions arise? Are the videos tagged so that a query can be run to compare best practices or patterns of abuse? The initial cost of the camera is only the beginning; there are many other considerations.
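
To make the storage question concrete, here is a rough sizing calculation; the 2 GB per hour of video is my assumption, since actual bitrates vary by camera and settings:

    # Rough storage sizing for 9,000 hours of body-camera video per day.
    # The 2 GB/hour rate is an assumption; actual bitrates vary by camera.
    hours_per_day = 9_000
    gb_per_hour = 2.0

    daily_tb = hours_per_day * gb_per_hour / 1_000
    yearly_pb = daily_tb * 365 / 1_000
    print(f"About {daily_tb:.0f} TB per day, about {yearly_pb:.1f} PB per year")

Even under these modest assumptions, a single large department would accumulate petabytes of video a year, which is why the back-end costs dwarf the price of the cameras themselves.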


Tasers

Electronic control devices used by officers today hearken back to the cattle prod, which was invented in the late 1800s. Officers actually used cattle prods in the 1960s to break up unruly crowds, so the device of today is a true technological advance. The modern Taser was patented in 1974 by NASA researcher Jack Cover for use by law enforcement. The original design used gunpowder to eject the electrodes; newer models use compressed air or nitrogen gas as a propulsion system. Studies show the voltage can cause cardiac arrest in some people, but the device has been used over the last forty years as an alternative to firearms. Concerns have been raised about inappropriate use of Tasers; however, when used appropriately they offer a non-lethal alternative to firearms.

License Plate Readers

Automatic license plate readers (ALPRs) have been in place for close to 10 years and are installed either on police vehicles or on stationary objects such as bridges and signs. These readers photograph license plates at the rate of one per second, on vehicles traveling up to 100 miles per hour. They commonly use infrared for night vision, and each image can be compared with a database to track the movement of a vehicle. They are frequently used at tollbooths, particularly during off hours. I received a notice last year that I owed a toll for crossing the George Washington Bridge into New York and realized that it was for a vehicle registered in my name that my son was driving. When the plate image was captured, it was quickly linked to me through the vehicle registration. While the readers are useful for such applications, there are concerns that the technology may be used to track innocent citizens. In a Wired magazine article earlier this year, the American Civil Liberties Union (ACLU) uncovered documents showing that the FBI temporarily halted purchases of these devices in 2012 due to privacy concerns. The worry is that agencies such as the FBI might use the devices, algorithms, and data analytics to track a person and even predict their future movements. This is big data analytics at work.
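
At its simplest, the back-end comparison is a set lookup of each plate read against a hotlist. Here is a minimal sketch, with invented plate numbers and locations:

    # Minimal sketch of an ALPR hotlist check; plate numbers are invented.
    hotlist = {"7ABC123", "4XYZ789"}  # e.g., stolen or wanted vehicles

    def check_read(plate, timestamp, location):
        if plate in hotlist:
            print(f"HIT: {plate} at {location} ({timestamp})")

    check_read("7ABC123", "2015-11-12T22:14:00", "I-5 NB milepost 191")
    check_read("5QRS456", "2015-11-12T22:14:01", "I-5 NB milepost 191")

The privacy concern arises when the non-hits are kept: a log of every read, with time and location, is exactly the raw material needed to reconstruct and predict a person's movements.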

Social Media

Law enforcement agencies are using social media to promote a public image and to engage the public in helping solve crimes and find missing persons. Agencies also use it to track felons who are thought to be in possession of firearms or other items that put them in violation of their parole or probation. Facebook announced in January that it would include AMBER Alerts in its news feed to widen the search for missing children.


New technologies enable law enforcement to do their jobs more efficiently and more effectively. Agencies are still sorting out the privacy issues, but the same is true for GoPro cameras and drones. We need to be deliberate in drawing the line between protecting personal privacy and allowing the use of potentially invasive tools to protect the public and officers.

What are your thoughts? Are there other cool tools that I missed? Are we doing a good job of balancing the use of technology for the greater good and the right to personal privacy? Let me know.

About Kelly Brown

Kelly Brown is an IT professional and assistant professor of practice for the UO Applied Information Management Master’s Degree Program. He writes about IT and business topics that keep him up at night.
