
Last Minute Tech Gift Ideas

If you have tech lovers on your gift list this year, here are a few ideas that are sure to bring them holiday cheer.

Storage

I wrote a blog last year about the capacity of one-terabyte disk drives, which are becoming standard in new personal computers. I thought that a terabyte of storage should be more than enough for a lifetime of computing, but I failed to take into account the rising popularity of personal networks that store not only computer files but also entertainment such as movies and music. You can now use your smart TV to access movies and shows stored on a drive attached to your network. To accommodate your growing storage needs, Western Digital offers My Cloud Mirror, a network-attached personal cloud. Your files are mirrored in case of disaster and are available from your TV, computer, or mobile device, so you can watch your stored movies and access your pictures and data files from anywhere. Capacities range from two all the way up to 12 terabytes, enough to keep your favorite tech person going for a long time.
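One point worth keeping in mind with mirrored storage: because every file is written to two drives, usable capacity is half the raw capacity. A quick back-of-the-envelope sketch, assuming a standard two-drive RAID 1 mirror (the numbers are illustrative, not Western Digital specifications):

```python
def usable_capacity_tb(raw_tb: float, mirrored: bool = True) -> float:
    """Usable space on a two-drive array; a RAID 1 mirror halves raw capacity."""
    return raw_tb / 2 if mirrored else raw_tb

# The 12-terabyte model, mirrored, exposes 6 TB of protected storage.
print(usable_capacity_tb(12))                  # 6.0
print(usable_capacity_tb(12, mirrored=False))  # 12.0
```

The trade-off is deliberate: you give up half the raw space in exchange for surviving a single-drive failure.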

Paper Airplane 2.0

PowerUp paper airplanes may be just the gift for that person who has everything. These are not the airplanes we made as kids; they are a combination of paper and technology. The basic kit comes with a small motor, but you still have to supply the paper and the navigation skills. Version 3.0 comes with a Bluetooth-enabled module that allows you to control the plane from your smartphone or tablet. This is a Kickstarter project that has gone into production with several different products. You can also pre-order the new PowerUp FPV kit, which gives you a first-person view of the flight through a Google Cardboard viewer. There is even a boat for the sailor on your list.

Gift For The Budding Techie

A Raspberry Pi computer is perfect for the budding techie in your life. Made by a UK educational foundation of the same name, it is essentially a complete low-cost computer on a single circuit board. It comes with HDMI and USB ports for connecting input and output devices and can be loaded with a special version of Linux as its operating system. There is no disk drive; everything is stored on SD cards. It represents a return to basic computing and experimentation. There is an ardent worldwide fan base for this product and no shortage of project ideas posted to the web, from robot controllers to music and video servers to Christmas light display hubs. The Raspberry Pi Zero starts at $5 and the Pi 2 Model B runs $40. I have written before about the maker movement, and this gift is a wonderful way to join in the fun.

It’s All In The Gesture

Gest is a wearable device that allows you to control your computer, tablet, or smartphone through hand movements. It is still in Kickstarter mode but has been successfully funded, so the device can be pre-ordered now. It is an attempt to get away from the traditional keyboard or touchpad. Personally, my fingers never seem precise enough when using my smartphone, so I am looking forward to trying one of these in the future. This could be the gift that I give to myself.

Thoughts

There are a lot of products available for your tech friends, from the inexpensive to the unaffordable. I have chosen just a few here that I think are reasonable, useful, and sometimes just plain fun. What gifts are you giving your friends this holiday season? Let me know.

About Kelly Brown

Kelly Brown is an IT professional and assistant professor of practice for the UO Applied Information Management Master’s Degree Program. He writes about IT and business topics that keep him up at night.

The Consumerization of IT

It used to be that information technology was the domain of specialists. In the last 10 years, however, the adoption of new technology has shifted from the enterprise to the consumer. As a result, employees accustomed to using technology at home pushed for its adoption in the workplace, leaving IT groups scrambling, not always willingly, to adapt their policies and applications to work with consumer devices and software.

This consumerization of technology inspired the popularity of bring your own device (BYOD) policies at work. The two main concerns with this trend are, first and foremost, security, and second, compatibility with corporate applications. While it is desirable to access data and applications anytime, anywhere, and on any device, it is not always easy or safe. In this blog post I will look at the history and future trends of IT consumerization. Will we continue as we have, or will the enterprise once again take the lead in adopting new technology?

History

Computers were originally used in government and business for tasks such as calculating bomb trajectories in World War II, tabulating voters’ ballots in presidential elections, and organizing corporate accounting activities. Operators and programmers were in charge of running the computers, and any tasks or requests had to be fed through them. Query results came back as printouts rather than on a desktop screen. Even as late as the mid-1980s, I remember working in a large computer room where we printed stacks of paper that were set outside the room to be retrieved; only computer operators and technicians were allowed inside. Access to the computers was through dumb terminals for input, with the printed results as output.

Personal Computers

Apple and other companies sold computers to hobbyists in the late 1970s. While these were technically consumer products, they served a niche market. When IBM introduced the personal computer in 1981, it was targeting the corporate employee, not the individual consumer. Only when user-friendly word processing and spreadsheet software became available did consumers begin buying computers for home use.

Networking

Without a connection to the outside world, home computer users were still left with the same input and output problem. Input came through the keyboard or from a disk; output went to a printer, a screen, or another disk. The disks had limited capacity, so sharing a program or data meant juggling multiple disks that were hopefully labeled correctly. With early dial-up modems, people could finally share information with each other (though not graphics; that would take forever). As consumer networks improved, so did our desire to connect and share, and the lines between work and home began to blur.

The Tipping Point

The tipping point for the consumerization of IT came with smartphones and tablets. Laptops were certainly more mobile and could go back and forth between home and work, but the smartphone and tablet made it even easier to live in both worlds. IT departments initially rejected tablets as not being robust and secure enough for the enterprise; the smartphone was considered even worse because it was so portable. BlackBerry was one of the pioneers in bridging the gap between corporate e-mail and information systems on one side and consumer devices on the other. Salespeople and executives could receive information while they were with a client instead of waiting for a computer operator to process a request. It was a whole new world, and it continues to evolve.

Today

In my Information Systems class we talk about BYOD and the tools, such as mobile device management (MDM), that we need to deploy in order to integrate consumer devices into the workplace. The key for technology departments is adaptability. The lines are blurred and the genie is not going back in the bottle, so we need to make sure our data and enterprise are secure while working with these devices.

In a possible reversal of the trend, Deloitte predicts what it calls the re-enterprization of IT in the next few years, pointing to current technologies such as wearables, 3D printing, and drones being embraced first by the enterprise as evidence. I am skeptical that the consumer-led trend is changing just yet, but I will keep my eyes open.

Thoughts

Has the consumerization of IT helped you in your work or has it caused you pain as you deal with the consequences? I don’t miss the days of wearing a separate pager and I love being able to access data from any device at any time. I also realize the work that goes into the back end to make this access seamless and I appreciate the efforts of technologists who build bridges between consumer devices and the enterprise. Let me know your thoughts.


The End of Cyber Monday

As I write this blog entry, we are still two weeks away from Thanksgiving, Black Friday, and Cyber Monday. While I believe the first two will continue into the future, I think the term Cyber Monday has become irrelevant, largely due to technology changes, and will end this year. In this post I will lay out my reasoning for predicting its demise and invite you to tell me whether you believe Cyber Monday is doomed.

History

The term Cyber Monday was coined in 2005 by Shop.org, the digital arm of the National Retail Federation. Shop.org also runs the website Cybermonday.com, where participating retailers are invited to share their Cyber Monday deals. The term refers to the Monday after Black Friday, supposedly the biggest online Christmas shopping day of the year. That was not true in 2005, but it was by 2010. Now it is only one of many large online shopping days reaching back into mid-October.

Technology

I believe that the biggest threat to Cyber Monday is technology itself. The theory was that people would return to work on the Monday after the long Thanksgiving weekend and purchase all of their remaining Christmas items online using the faster company internet connection. That premise is now irrelevant for two reasons:

  1. Home internet connections are now fast enough to stream digital content such as movies so they are more than adequate for shopping.
  2. More people are shopping now from a mobile device such as a smartphone or tablet so they do not need to be tied to the home or office internet connection.

The term “showrooming” was coined to describe the practice of visiting a store to view merchandise before ordering it online at a lower price; Best Buy has been referred to as the showroom for Amazon. In theory, you could even stand in a brick-and-mortar store and order the same product online through your smartphone. I think this practice will decline as we get closer to price parity between online and traditional retailers.

Websites and apps such as Buyvia.com and Dealnews.com have taken the steam out of Cyber Monday by advertising a wide range of retail deals 365 days a year. I can define a product search and get alerts for the best price and retailer, regardless of whether it is Thanksgiving weekend.

Timing Is Everything

Retailers are creating shopping events earlier and earlier. I can already see “leaked” Black Friday ads from several retailers even though Thanksgiving is still several days away and Christmas is more than a month away. Soon we could have our Christmas shopping done in September, eliminating the whole holiday rush of late November and early December.

Thoughts

I realize that retailers will continue to roll out special deals on certain days like Black Friday and Cyber Monday, but I think that technology advances and the way that we choose to do business will make these exclusive days less of a bargain.

Am I just being a Scrooge or am I on to something? Is technology changing how and when we shop? Has Cyber Monday become irrelevant? Let me know your thoughts.


People of Ability

In my volunteer positions I have worked with youth of different abilities, often labeled disabilities. These youth may have mental, physical, or emotional disabilities, yet they contribute greatly in various ways and have taught me many lessons that I carry with me. In this blog post I will highlight some stories of people with disabilities who contribute to the field of information technology.

Meticulon Consulting

I have worked with people on the autism spectrum who are excellent programmers. They are methodical, meticulous, and often very creative; they like routine work and excel at logic challenges such as coding. A recent article highlighted Meticulon Consulting, a Canadian technology firm, for hiring people with autism. Their experience matched mine: the people they hired were meticulous, hardworking, and loyal to the firm. Meticulon co-founder Garth Johnson makes the point that he is not hiring people with disabilities out of sympathy but because it makes good business sense. Johnson said, “I’m not interested in this as a charity. If we can’t prove business value, then I don’t view it as sustainable for our employees, either our typically enabled or our people with autism.” Other companies cited in the article are coming to the same realization: it makes good business sense to hire people with disabilities.

SAP

The German software giant SAP shares that experience. Its goal is to have one percent of its workforce come from the autism community by 2020. The goal grew out of a project with the Autism Society of India, after SAP programmers created software to help children with autism communicate better. The project was successful, so employees proposed a pilot project to hire adults with autism. SAP recognized that these new employees bring a different perspective and a fresh set of eyes. Jose Velasco, an SAP executive and head of the Autism at Work program, said, “There is a skill set people on the spectrum are bringing that has business value.”

Physical Disabilities

In our AIM Program course, Information Systems and Management, we talk about the stereotype of technology workers as more comfortable with computers than with people. Whether the stereotype is valid or not, it has nothing to do with physical ability. I have worked with people with hearing or vision impairments or other disabilities who love technology as much as I do. An employer may need to make some accommodations, but in my experience it is worth the effort; these employees bring a rich skill set and a unique perspective to a project or an organization.

Thoughts

I believe that we need contributions from people of all abilities in order to make a strong and complete team. We all bring different skills and experiences to our work so the fact that we don’t all think alike or move the same should not make a difference. I would like to hear about your experiences working with people with different abilities. Are there benefits or drawbacks? Let me know.


Back to the Future: Hits and Misses

Photo by Erin Cadigan

In the 1989 film “Back to the Future Part II,” the sequel to the 1985 original, the characters travel 30 years into the future, to October 21, 2015. I am a fan of the trilogy, so I have been thinking about how accurately it portrayed our current year. There were some hits and some misses.

Flying Cars

While we do not yet have commercially available flying cars, we do have working autonomous vehicles. Google has prototypes driving on city streets in California, and Toyota, Volvo, Audi, Mercedes-Benz, Apple, and Tesla are also developing self-driving cars. Apparently it is easier to build a self-driving car than a flying one. Now, if only we could develop an autonomous flying car; that would be really cool.

Hoverboards

In the movie, the main character rides a levitating skateboard, which he calls a hoverboard. We do have those today, although they are not in mass production. Lexus recently demonstrated a hoverboard, timed partly to coincide with the date in the movie, as a possible first step toward its goal of developing a levitating car. If it succeeds, we could some day have flying cars, though they wouldn’t fly high in the air as in “Back to the Future” or “The Jetsons.”

Fax Machines in Every Room

In one scene, the movie shows a home with multiple fax machines. I think we have moved past this technology. Fax machines are still available, standalone or integrated into scanner/printers, but faxing has largely been replaced by other electronic communication methods. Now we have screens in every room and in every hand.

Large Screen Advertising

When the main character arrives in the future, outdoor advertising is everywhere on large screens, almost to the point of distraction. I think we have this one covered. I can drive down the highway today and see full-color video on billboards. In 1985, who would have thought we would have 60-inch high-definition televisions in our homes? In terms of screen size, we subscribe to the “bigger is better” philosophy: the largest current sports arena screen, the Jumbotron in Houston, measures 52 feet high and 277 feet wide. We have definitely figured out how to make large displays.

Thoughts

Some things, such as flying cars, have been anticipated since the 1950s, but we haven’t quite perfected them. Other predictions are already old school. I wonder whether movie scripts mimic our ingenuity and development, or if it is the other way around. If we were to make a movie today portraying 2045, what would it look like? Will we all still be walking around looking at six-inch screens, or will we have integrated our viewing into wearables such as glasses and holographic projections? What do you predict for the future? Let me know your thoughts and I will circle back in 2045 to see if you are right.


Will Computer Science Displace Classic Education?

I believe that technology is now a routine part of our lives, and I have been thinking lately about how much effort we should spend educating young students about computers. I recently read an article that highlighted a push to make computer science mandatory in German schools. My question is, has technology become so commonplace that we treat it like running water and electricity, or can it still provide a competitive advantage for a community or a nation?

Keeping Up on Technology

One concern of German lawmakers, shared by officials in other countries, is that their students will fall behind and be unable to fill future technology jobs. According to the head of the German digital industry group Bitkom:

“IT skills are now as important as the basics. Digitisation determines our everyday lives more and more, for leisure time as well as for work. Schools must teach about media literacy beyond the classroom and give students a firm grasp of IT technologies.”

Suddenly, the tech kids are the cool ones in school. This push follows the recent emphasis in schools on science, technology, engineering, and math (STEM). The theory is that, partly because of the proliferation of technology, the best and most advanced jobs will go to those trained in these areas.

Code.org

In a blog post last year I highlighted Code.org, an organization that believes “every student in every school should have the opportunity to learn computer science.” It is working to increase access to computer science curriculum, particularly for women and students of color. Just as the lawmakers in Germany are advocating, Code.org believes that computer science should be part of the core curriculum in schools, alongside biology, algebra, and chemistry. While I agree that computer science is an important part of a STEM curriculum, I wonder which classes we should drop to make room for it.

Curriculum Replacement

A recent PBS article highlighted a similar push to introduce coding courses in Australian schools. Computer science curriculum, according to the article, will replace geography and history courses. I am sure the change will generate a lot of debate about the virtues of a classic education versus a more modern one, and it leaves the door open for ongoing conversations about curriculum mix and what students actually need to succeed in the future.

Thoughts

To circle back to my original question: is it necessary to add specific computer science curriculum to schools? Or has technology become so pervasive that everyone knows how to use it, and only a few need to be able to create new and unique applications? In the same vein, should we also introduce mandatory physics courses to better understand the underlying hardware? Finally, which courses would you replace? As you look back on your education and career, which classes have shaped you the most, and why? Let me know your thoughts.


Trends in Higher Education

The Boston Consulting Group recently published an article highlighting emerging trends in higher education. The piece covers them well; here I want to examine the convergence of several of those trends and how I think technology will play a part in shaping that future.

Funding

State colleges and universities have long relied on government subsidies to keep tuition manageable and to fund the research and activities associated with the school. In recent years the amount of funding coming from the states has dropped as they struggle to balance their own budgets. The shortfall is made up through increased tuition and grants, as well as targeted campaigns aimed at private and corporate donors. Increased tuition is problematic because of the large debt graduates are accumulating. A recent article in U.S. News & World Report detailed how some graduates carry student loan debt into their forties, which means they cannot help their own children start academic careers; the children then assume their own debt, and the cycle continues. Generating alternative funding sources or containing operational costs could help break that cycle.

Competition

Students have more education options than ever. Schools across the country, and even some international schools, offer attractive incentives to reel in young scholars who might otherwise attend their state university. There has also been a spike in online curriculum and for-profit schools. In this competitive environment, universities must identify the right prospective students and then lure them in. With the drop in state funding mentioned above, many universities are pursuing more international students, who pay higher tuition. All of this requires a targeted, intelligent marketing campaign.

Increased Research

Partnerships with private industry are helping universities increase their research efforts. These partners provide funds for sophisticated research, the results of which can be licensed back to the partner or sold outright. Top-notch students and faculty are drawn to such projects, industry gains new business ideas and opportunities, and students and potential employers are brought together.

Thoughts

Colleges and universities are facing pressure from increased competition, uncertain funding, and the push to accelerate and capitalize on research. Here are ways that I think technology can help alleviate that pressure:

  • Social Media. Universities are increasing their use of social media to reach a tech savvy generation from around the globe. Advances in web and media technologies as well as analytics help schools target the right audiences and markets.
  • Big Data and Business Analytics. The ability to quickly analyze large amounts of prospective student data helps colleges narrow their search for potential students. By identifying and targeting particular demographics, schools can reduce marketing costs and increase the efficiency of their search campaigns.
  • Collaboration Software. Research partnerships are no longer just with the company down the street. Partners can be thousands of miles away, so it is important that schools and private enterprises can communicate, catalog, and analyze research results in a systematic and predictable way. Collaboration applications can help keep researchers informed and successful.

While colleges and universities are facing funding and competition pressures, there are technologies that can help lessen those concerns and lead to new knowledge and discoveries. I am hoping this post spurs your thoughts on other ways that technology can or is helping higher education.


Technology Refresh or Addiction?

Apple recently introduced the iPhone 6s and 6s Plus. At the same time it introduced a payment plan that includes automatic replacement when a newer iPhone comes out, presumably once a year, and insurance should the phone break before the new model arrives. According to the Apple website:

“Getting a new iPhone every year is easy. After 12 installments, you can get a new iPhone and start a new iPhone Upgrade Program. No more waiting for your carrier contract to end. Just trade in your current iPhone for a new one, and your new program begins.”

The phone is paid off in 24 installments, which means you always get a new phone before the old one is paid off. I have two questions. First, with Apple now financing unlocked phones, does this put Apple in the driver’s seat and push the carriers back to being simply “pipe” providers? Second, and more important to me, can Apple provide enough of a technology refresh and differentiation that people need a new phone every year?
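The arithmetic behind that first observation is simple enough to sketch (the phone price below is hypothetical, not Apple’s actual plan pricing):

```python
def paid_at_upgrade(phone_price: float, total_installments: int = 24,
                    upgrade_after: int = 12) -> float:
    """Amount paid by the time the trade-in window opens."""
    monthly = phone_price / total_installments
    return monthly * upgrade_after

# On a hypothetical $744 phone spread over 24 months, only half
# the price has been paid when the 12-month upgrade arrives.
print(paid_at_upgrade(744.0))  # 372.0
```

In other words, the trade-in is what settles the remaining balance, which keeps the customer perpetually mid-contract with Apple rather than with a carrier.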

Apple vs. National Carriers

The four national carriers, AT&T, Verizon, T-Mobile, and Sprint, already offer similar refresh deals by adding a fee to the normal monthly contract. However, those phones are sold and locked by the carrier, so you are bound into a contract. When you buy an unlocked phone from Apple, you are free to move around outside of a carrier contract. If other manufacturers follow suit, the carriers will be driven away from being phone stores toward being monthly service providers. In other words, it relegates them to the same status the old Regional Bell Operating Companies (RBOCs) had with landlines. Coincidentally, Verizon and AT&T both grew out of the old RBOCs, so we could end up right back where we started. I am watching with interest to see how cell providers respond to this challenge from Apple, and whether Apple at some point will make a bid to become its own cell provider, cutting out the carriers completely.

How Much is Too Much?

Now, the real question on my mind is this: does a person need a new phone every year, and does it really make their life better? I am interested in your opinion and hope that you will chime in. In full disclosure, I usually end up with a new phone every year, but that is because mine breaks; apparently you are not supposed to take your phone kayaking or rock climbing. My replacement is usually a cheap $20 Android smartphone, so I never have the latest and greatest, but it does what I need it to do and it fits my frugal nature.

The latest iPhone touts a better screen, a better chipset, faster Wi-Fi, the new 3D Touch, and a better camera. Are the new features that much better than those of the iPhone 6 and 6 Plus introduced a year ago? For some sophisticated consumers the answer is obviously yes. In a recent University of Missouri study, researchers found that iPhone separation in some people resulted in anxiety and poorer cognitive performance on tasks requiring close attention. It appears that for some people smartphones have become such an integral part of their lives that they need them nearby even to perform tasks that don’t require a smartphone. Perhaps the latest and greatest features do help us live better lives.

Thoughts

Psychology author Michael Clarkson provided a counterargument to the constant technology refresh in a CNN iReport earlier this year, “Escaping Society and my Cell Phone,” in which he chronicles his attempt to escape a technology-filled world by spending time in his backyard fort.

Whether we refresh our smartphones every year or every two or five, technology is having a real impact on how we live and how we interact with others. I believe we need to examine our own relationship with technology to determine how much is enough and how much is too much, too often. What is the right balance for you? Is technology a tool, or has it become something more? Let me know your thoughts.


Technology Trends in Law Enforcement

There have been many technology updates in law enforcement in just the last five years. Some, such as body cameras, are controversial due to privacy issues; others, such as Tasers, are controversial due to the potential for misuse, but can save lives when used instead of a gun to subdue a suspect. This week I will highlight a few of the newest technologies used on the beat and in the back room.

Body Cameras

First there were car-mounted cameras; now more officers are being outfitted with body cameras. The theory is that officers will use greater discretion in their interactions with the public if they know their actions are being recorded, and ideally the public will behave better as well. Granted, the cameras only work if they are turned on, and that is still up to the wearer, but there are also back-end technology issues to deal with. The Los Angeles Police Department has approximately 9,000 officers, so if each officer recorded on average one hour a day, that would be 9,000 hours of video each day to store and catalog. Where is it kept, on a local server or in the cloud? Who extracts the exact footage when questions arise? Are the videos tagged so that a query can be run to compare best practices or patterns of abuse? The initial cost of the camera is only the beginning; there are many other considerations.
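The scale of that storage problem is easy to sketch. Assuming compressed video at roughly one gigabyte per hour (a plausible body-camera bitrate, not an LAPD figure):

```python
def daily_footage_tb(officers: int, hours_each: float,
                     gb_per_hour: float = 1.0) -> float:
    """Terabytes of new video generated per day across a whole force."""
    return officers * hours_each * gb_per_hour / 1000  # using 1 TB = 1000 GB

# 9,000 officers recording one hour each: roughly 9 TB of new video
# every day, before any redundancy or backups are added.
print(daily_footage_tb(9000, 1))  # 9.0
```

Over a year that is on the order of three petabytes, which is why the camera itself is the cheapest part of the system.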

Tasers

Electronic control devices used by officers today hearken back to the cattle prod, invented in the late 1800s. Officers actually used cattle prods in the 1960s to break up unruly crowds, so today’s device is a true technological advance. The modern Taser was patented in 1974 by NASA researcher Jack Cover for use by law enforcement. The original design used gunpowder to eject the electrodes; current models use compressed air or nitrogen gas as the propulsion system. Studies show the voltage can cause cardiac arrest in some people, but the device has been used over the last forty years as an alternative to firearms. Concerns have been raised about inappropriate use of Tasers; however, when used appropriately they offer a non-lethal alternative to firearms.

License Plate Readers

Automatic License Plate Readers (ALPR) have been in place for close to 10 years and are installed either on police vehicles or on stationary objects such as bridges or signs. These readers take pictures of license plates at the rate of one per second, on vehicles traveling up to 100 miles per hour. They commonly use infrared for night vision, and the captured image can be compared with a database to track the movement of a vehicle. They are frequently used at tollbooths, particularly during off hours. I received a notice last year that I owed a toll for crossing the George Washington Bridge into New York and realized that it was for a vehicle registered in my name that my son was driving. When the plate image was captured, it was quickly linked to me through vehicle registration. While they are useful for such applications, there are concerns that the technology may be used to track innocent citizens. According to a Wired magazine article earlier this year, the American Civil Liberties Union (ACLU) uncovered documents showing that the FBI temporarily halted purchases of these devices in 2012 due to privacy concerns. The worry is that agencies such as the FBI might use the devices, algorithms, and data analytics to track a person and even predict their future movements. This is big data analytics at work.
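The registration lookup behind my toll notice is conceptually simple. Here is a minimal sketch of that step, with an entirely hypothetical database; real ALPR systems first extract the plate text from the image, which is the hard part and is omitted here.

```python
# Toy ALPR lookup: match a captured plate string against a
# (hypothetical) registration database to find the registered owner.
registrations = {  # hypothetical sample records
    "ABC1234": {"owner": "K. Brown", "state": "OR"},
    "XYZ9876": {"owner": "J. Smith", "state": "NY"},
}

def lookup_plate(plate: str) -> str:
    """Return the registered owner for a captured plate, if known."""
    # Normalize: readers may report lowercase letters or stray spaces.
    record = registrations.get(plate.upper().replace(" ", ""))
    return record["owner"] if record else "unknown"

print(lookup_plate("abc 1234"))
```

The privacy concern is exactly this ease of linkage: once plate reads are timestamped and stored, chaining lookups over time reconstructs a vehicle's movements.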

Social Media

Law enforcement agencies are using social media to promote a public image and to engage the public to help solve crimes and find missing persons. Agencies also use it to track felons who are thought to be in possession of firearms or other items that put them in violation of their parole or probation. Facebook announced in January that it would include Amber Alerts in its news feed to widen the search for missing children.

Thoughts

New technologies enable law enforcement to do their job more efficiently and more effectively. Agencies are still sorting out the privacy issues, but the same is true for consumer technologies such as GoPro cameras and drones. We need to be deliberate in drawing the line between protecting personal privacy and allowing the use of potentially invasive tools to protect the public and officers.

What are your thoughts? Are there other cool tools that I missed? Are we doing a good job of balancing the use of technology for the greater good and the right to personal privacy? Let me know.

About Kelly Brown

Kelly Brown is an IT professional and assistant professor of practice for the UO Applied Information Management Master’s Degree Program. He writes about IT and business topics that keep him up at night.

High Tech Fire Watch

Photograph of smoke from wildfire in the mountains.We are in the middle of fire season here in the Northwest. This has been a hot, dry summer, so the threat of wildfire is great. Several of my friends have worked on fire crews at some point, so I wondered about the role technology plays in fighting wildfires. I was delighted to find that someone had blazed that trail before me, and that technology plays a role not only in firefighting but also in fire protection. In this blog post I will focus on technology in fire protection. I will dedicate an upcoming post to technology in firefighting.

Eye In The Sky

I was amazed to find that many of the rustic fire towers perched on mountaintops in California, Oregon, and Washington are decommissioned. In a recent article in Outside magazine, the authors report that fewer than 35% of the towers are still staffed. Due to budget cuts, fire watchers have largely been replaced by a network of cameras. According to the article, a camera can spot a fire up to 100 miles away and can detect fires at night through near-infrared vision.

ForestWatch

Oregon has a network of cameras called ForestWatch by Envirovision Solutions. These cameras are networked to provide coverage over the most fire-prone areas of the state. They are all monitored remotely and can detect a change in the terrain by comparing it against a digital model. Through mathematical algorithms, the cameras send an alarm when they detect anomalies or pattern differences such as fire or smoke. The remote monitoring station can then focus the camera or cameras on the suspicious area and collect GPS coordinates in case a ground or air crew needs to be sent in. Fires are spotted more quickly and their specific locations are known much sooner, which may reduce the spread and damage of a fire.
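The general idea of comparing a live frame against a stored model can be illustrated with a toy example. ForestWatch's actual algorithms are proprietary; this sketch simply flags an alarm when the fraction of pixels that differ from a baseline exceeds a threshold, with all thresholds being hypothetical.

```python
# Toy change detection: compare the current frame against a stored
# baseline of the terrain and alarm when too much of the scene changes.
# Frames are represented as flat lists of grayscale pixel values (0-255).

def changed_fraction(baseline, frame, pixel_threshold=30):
    """Fraction of pixels differing from the baseline by more than pixel_threshold."""
    changed = sum(1 for b, f in zip(baseline, frame) if abs(b - f) > pixel_threshold)
    return changed / len(baseline)

def smoke_alarm(baseline, frame, area_threshold=0.05):
    """Alarm when more than area_threshold of the scene has changed."""
    return changed_fraction(baseline, frame) > area_threshold

baseline = [100] * 100            # flat gray "terrain"
frame = [100] * 90 + [200] * 10   # 10% of pixels brightened, e.g. by smoke
print(smoke_alarm(baseline, frame))
```

A production system would also have to suppress routine changes such as clouds, lighting, and seasonal vegetation, which is where the sophisticated modeling comes in.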

Education

This is a great use of technology but what kind of education does it take to install, program, and monitor these cameras? My research shows knowledge in the following areas is required:

GIS—A strong background in geographical information systems (GIS). This includes mapping and data analysis.

Data modeling—A strong background in data modeling and database management. There are many data points involved here, from GPS coordinates to topographical data to wind speed to moisture index, and they all need to be combined and modeled to show the monitor what fire crews will encounter.

Wireless networking—These cameras are networked to the central monitoring station and often to each other. In the case of a suspected fire, multiple cameras at various angles can verify the validity of the alarm. A person would need a strong background in wireless networking to establish and maintain these cameras.
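To make the data modeling point concrete, the data points mentioned above might be combined into a single record for the remote monitor. The field names and the risk rule below are my own invention for illustration, not part of any real ForestWatch schema.

```python
# Hypothetical record combining the data points a monitor would see:
# GPS coordinates, wind speed, and a fuel-moisture index.
from dataclasses import dataclass

@dataclass
class FireSighting:
    latitude: float
    longitude: float
    wind_speed_mph: float
    moisture_index: float  # 0.0 (bone dry) to 1.0 (saturated)

    def risk_note(self) -> str:
        """Crude flag for dispatchers: dry fuels plus wind spread fastest."""
        if self.moisture_index < 0.2 and self.wind_speed_mph > 15:
            return "high spread risk"
        return "monitor"

sighting = FireSighting(44.05, -121.31, wind_speed_mph=20, moisture_index=0.1)
print(sighting.risk_note())
```

Even this trivial model shows why the monitoring role needs database and modeling skills: the value is not in any one reading, but in combining them into something a dispatcher can act on.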

Thoughts

Fire watch cameras are a good use of technology and a reminder that new jobs often require a strong education in math and science as well as specific technical skills. As the technology moves from human fire watchers to sophisticated data collecting cameras, we must continue updating our education to be prepared for these jobs of the 21st century.
