
Baseball Technology 2017

With the 2017 baseball season approaching the midway point, I have been reading about the decline of fan interest, ticket sales, and athlete recognition. An article from my local newspaper reported that not one baseball player is among the 100 most famous athletes in the world, based on endorsements, social media following, and internet search popularity; those spots are taken by soccer, tennis, football, basketball, and even a few golf stars. I wonder what technology could do, if anything, to pull baseball out of this popularity slump.

Sensors

Technology is showing up in some unusual places, including wooden baseball bats. Sensor manufacturer Zepp has teamed up with bat maker Old Hickory to create a smart bat. A device built into the knob of the bat records data points such as swing speed, angle, and motion and shares that information via Bluetooth with a connected device. A visualization shows the swing and draws on previous data to compare that swing with others, which allows the player to correct any issues in order to reach maximum performance. Similar sensors are available for tennis rackets, golf clubs, and softball bats. While smart bats are meant to improve player performance, I wonder if the visualization could be shared with fans as well, perhaps on the Jumbotron, to give tech-savvy fans something to do between pitches.
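As a rough sketch of the kind of swing comparison such an app might run, here is a toy Python example; the field names, numbers, and metrics are my own assumptions for illustration, not Zepp's actual data format.

```python
# Hypothetical smart-bat comparison: how does the latest swing stack up
# against a player's recent history? All fields and values are invented.
from statistics import mean

history = [
    {"bat_speed_mph": 68.2, "attack_angle_deg": 9.5},
    {"bat_speed_mph": 70.1, "attack_angle_deg": 11.0},
    {"bat_speed_mph": 69.4, "attack_angle_deg": 10.2},
]
latest = {"bat_speed_mph": 66.8, "attack_angle_deg": 14.1}

for metric in ("bat_speed_mph", "attack_angle_deg"):
    baseline = mean(swing[metric] for swing in history)
    delta = latest[metric] - baseline
    print(f"{metric}: {latest[metric]:.1f} (baseline {baseline:.1f}, change {delta:+.1f})")
```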

Of course, technology is also used for tracking statistics on pitches. In stadiums equipped with TrackMan, fans in the ballpark and at home can follow live information on pitch velocity, spin, and exit speed, among other radar-tracked data points. This is sophisticated technology, but I wonder whether data-driven fans even need to go to the ballpark anymore, or whether they can simply sit at their computers and analyze every pitch and swing as it happens. Is it more important to see the action or to analyze it? I foresee the day when machine learning enters baseball and a computer directs players on their next move based on historical and real-time statistics. Hackers could have a field day with that interaction.
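For a sense of what that armchair analysis might look like, here is a minimal sketch that summarizes a few pitch records by pitch type; the field names and values are placeholders, not TrackMan's actual schema.

```python
# Toy pitch-by-pitch summary grouped by pitch type; data is illustrative only.
pitches = [
    {"type": "fastball", "velo_mph": 95.3, "spin_rpm": 2350},
    {"type": "slider",   "velo_mph": 86.1, "spin_rpm": 2600},
    {"type": "fastball", "velo_mph": 96.0, "spin_rpm": 2410},
]

by_type = {}
for pitch in pitches:
    by_type.setdefault(pitch["type"], []).append(pitch)

for ptype, group in by_type.items():
    avg_velo = sum(p["velo_mph"] for p in group) / len(group)
    avg_spin = sum(p["spin_rpm"] for p in group) / len(group)
    print(f"{ptype}: {len(group)} pitches, {avg_velo:.1f} mph, {avg_spin:.0f} rpm")
```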

Stadium Technology

We can now track every player and every swing, but that still does not get people in seats, which is a real problem in baseball today. To try to overcome that problem, stadiums are being built and retrofitted with wireless access points for between-inning entertainment and with high-definition cameras and displays so you won’t miss any action, even if you don’t have the best seat in the house. Various baseball franchises have developed fan apps that allow you to watch instant replays and view statistics on your smartphone or tablet while in the stadium. Apps also allow you to order snacks and have them delivered to your seat, for a premium. The stadium experience today is a combination of live action and device interaction. There are also virtual reality applications in development that will let you get a bird’s-eye view of the action or zoom in on one particular area of the field using cameras positioned around the stadium. Reality meets virtual reality.

Thoughts

There are a number of new technologies introduced or in development designed to bring fans back to baseball, either in the stadium or watching at home or on a mobile device. Time will tell if they are successful but with the price tag of new stadiums, there is a lot at stake. Have you been to a live baseball game recently? How was your experience? Let me know your thoughts.

About Kelly Brown

Kelly Brown is an IT professional and assistant professor of practice for the UO Applied Information Management Master’s Degree Program. He writes about IT and business topics that keep him up at night.

Artificial Intelligence Applications

Artificial intelligence (AI) will continue to contribute to innovations this year. I think some industries will embrace the change and some will resist for various reasons, including job displacement and trust. Our world is changing already in terms of the tasks that computers take on. Let’s examine some of the ways that AI will change how we work in 2017 and beyond.

Definitions

AI, simply put, is a computer handling tasks that would otherwise require human cognition. Some AI functions incorporate vision and robotics but do not necessarily resemble Arnold Schwarzenegger’s dangerous “Terminator” character. Think of the hundreds of decisions that you make every day and which of those decisions could best be handled by a computer, freeing you up for more creative and innovative tasks. Another term associated with AI is machine learning: the ability of a computer to learn from past decisions and make corrective choices, similar to how we learn from our mistakes and change our thinking in order to produce a better outcome.
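To make that learn-from-mistakes idea concrete, here is a toy perceptron-style example in Python; it is a generic illustration of adjusting a decision rule after errors, not any particular product’s algorithm.

```python
# Toy "learning from past decisions": a perceptron-style rule nudges its
# weights whenever its yes/no decision turns out to be wrong.
def train(examples, epochs=20, lr=0.1):
    weights, bias = [0.0, 0.0], 0.0
    for _ in range(epochs):
        for features, label in examples:          # label: 1 = yes, 0 = no
            total = sum(w * x for w, x in zip(weights, features)) + bias
            guess = 1 if total > 0 else 0
            error = label - guess                 # 0 when correct, +/-1 when wrong
            weights = [w + lr * error * x for w, x in zip(weights, features)]
            bias += lr * error
    return weights, bias

# Two made-up inputs per decision (say, urgency and cost) with known outcomes.
data = [([0.9, 0.1], 1), ([0.2, 0.8], 0), ([0.8, 0.3], 1), ([0.1, 0.9], 0)]
print(train(data))
```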

Security

In a recent InformationWeek article, the author is hopeful that AI advances will help solve a skills shortage in the cybersecurity field. Right now, computers are used to gather data on actual and potential threats, weed out erroneous information, and help security professionals formulate a mitigation strategy. In the future, the computer will not only gather the initial data but also formulate and carry out the threat response. Far from displacing security personnel, this will free them up to work on higher-level tasks such as business continuity and refining the data collected and filtered. In this case, AI provides another pair of hands, and security professionals will continue to be in as high demand as they are now.
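Here is a rough sketch of the filtering step described above: scoring incoming alerts so that only the ones worth an analyst’s time are escalated. The fields, scores, and threshold are assumptions for illustration, not any specific security product’s logic.

```python
# Hypothetical alert triage: escalate only alerts whose score clears a threshold.
alerts = [
    {"source": "intrusion-detection", "severity": 7, "seen_before": False},
    {"source": "antivirus",           "severity": 2, "seen_before": True},
    {"source": "firewall",            "severity": 9, "seen_before": False},
]

def score(alert):
    value = alert["severity"]
    if alert["seen_before"]:        # duplicates of known noise get down-weighted
        value -= 3
    return value

THRESHOLD = 5
for alert in alerts:
    action = "escalate to analyst" if score(alert) >= THRESHOLD else "auto-filter"
    print(f"{alert['source']}: score {score(alert)} -> {action}")
```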

Automotive Applications

One of the AI applications I am most excited about is in the automotive field. I have written about this in the past, and there have been some real breakthroughs recently. One practical application is Ford’s new Pro Trailer Backup Assist. I cannot back up a trailer to save my life; I was denied that gene when I was born. Somehow the trailer appears at my side whenever I try to back into a spot. With backup assist, the driver takes their hands off the steering wheel completely and backs up by using a small knob on the dash. Turn the knob to the right and the trailer moves to the right. This is just the opposite of trying to use the steering wheel, and it is certainly much more intuitive. It is an example of machine learning using vision and computing algorithms. Another, even more radical, example is the upcoming autonomous vehicle. These vehicles make constant decisions based on sensor input from around the vehicle to safely transport a passenger.
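The counterintuitive part is that, in reverse, a trailer swings opposite to the steering input; the knob hides that inversion from the driver. Here is a simplified sketch of the idea; the sign convention and scaling are my own assumptions and do not reflect Ford’s actual implementation.

```python
# Simplified illustration: translate a "move the trailer this way" knob command
# into the steering direction the controller must apply while backing up.
def steering_for_trailer(knob_direction: str, amount: float) -> float:
    """amount is 0..1; positive result = steer right, negative = steer left."""
    desired = 1.0 if knob_direction == "right" else -1.0
    # In reverse, the trailer swings opposite the tow vehicle's steering,
    # so the controller inverts the driver's request.
    return -desired * amount

print(steering_for_trailer("right", 0.5))  # -0.5: steer left to push the trailer right
print(steering_for_trailer("left", 0.3))   #  0.3: steer right to push the trailer left
```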

Danger Zones

Robots using machine learning differ from simple drones in that they make independent decisions based on past experience. A drone is controlled by a human operator and cannot function independently. An example of independent robot development is CHIMP from Carnegie Mellon University. CHIMP will be used in industrial applications and for search and rescue when the situation is too dangerous for humans. It makes decisions based on instructions, experience, and multiple sensor inputs.

Thoughts

These are just a few AI applications, with a lot more to come. Are there tasks or decisions that you would just as soon leave to a computer? Do you trust the systems to make those decisions? This is a brave new world and it will take a leap of faith before some of these developments become completely commercialized. Let me know your thoughts.

About Kelly Brown

Kelly Brown is an IT professional and assistant professor of practice for the UO Applied Information Management Master’s Degree Program. He writes about IT and business topics that keep him up at night.

Automotive Education of Tomorrow: Car or Computer?

Automobiles are becoming more reliable but are much more complicated to diagnose and repair when they do fail. With the introduction of hybrid, electric, semiautonomous, and autonomous vehicles, computer science and networking skills will be just as important to a technician as traditional mechanical training. Let’s explore the training required to care for these high-tech vehicles.

Car or Computer?

My son is an automotive technician specializing in a high-end brand. My background in computer and information systems and his in automotive repair are starting to converge, and we find ourselves talking about shared interests like networks, fiber optics, downloading patches, and diagnosing computer failures. In a Los Angeles Times article, Tesla CEO Elon Musk remarked, “We really designed the Model S to be a very sophisticated computer on wheels. Tesla is a software company as much as it is a hardware company.” Teslas are designed to be upgraded and gain new features through wireless patch updates. In other words, they can evolve. Are new vehicles more car or computer?
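As a toy illustration of what “gaining features through wireless patches” implies at its simplest, here is a conceptual over-the-air update check; the version strings and flow are invented and do not reflect Tesla’s actual mechanism.

```python
# Conceptual over-the-air update check; nothing here mirrors a real vendor API.
def parse(version: str):
    return tuple(int(part) for part in version.split("."))

def update_available(installed: str, advertised: str) -> bool:
    return parse(advertised) > parse(installed)

installed_version = "2017.14.2"
advertised_version = "2017.16.1"   # would normally come from the manufacturer's servers

if update_available(installed_version, advertised_version):
    print(f"Downloading {advertised_version}; install when parked and charging.")
else:
    print("Vehicle software is current.")
```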

Chips for the Road

Chip makers such as Intel, Xilinx, and ON Semiconductor have ventured into automotive applications to supply the industry with controllers for lighting, infotainment systems, on-board computers, and sensors. These partners are using their expertise to help drive the industry’s advances.

New Sensor Technology

Technology company Nvidia announced earlier this month that it has developed the “Deep Learning Car Computer,” which will provide the sensors and processors to power a semiautonomous vehicle. The computer, which the company claims has 8 teraflops of processing power, or the equivalent of 150 MacBook Pros, sits in a package the size of a tablet. The system is designed to provide a 360-degree view of the terrain and landscape around a vehicle and respond faster than a human when it detects a hazard such as a large animal, a pedestrian, or a ball rolling into the road followed by a child. Deep learning means that the computer is continuously adding to its knowledge and detection capabilities. Nvidia is partnering with Volvo to put 100 semiautonomous vehicles on the road in Sweden in 2017. Again, who will be repairing such vehicles? Yesterday’s mechanic or tomorrow’s technician/computer science major? What does that education look like?
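As a quick back-of-the-envelope check on that comparison, treating both numbers as the round marketing figures quoted above:

```python
# Rough arithmetic behind the "8 teraflops = 150 MacBook Pros" comparison.
car_computer_tflops = 8.0
laptops_claimed = 150
gflops_per_laptop = car_computer_tflops * 1000 / laptops_claimed
print(f"Implied per-laptop throughput: about {gflops_per_laptop:.0f} GFLOPS")  # ~53 GFLOPS
```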

Education

I am starting to see more bachelor’s degree programs in automotive technology. These often combine courses in physics, electronics, computer systems, and drivetrain and engine repair. I still think there is an unfilled niche for a type of automotive engineering training that would be a hybrid path for systems designers and repair technicians. Such an approach would enable specialists to cross back and forth as their career ambitions change. It would also provide a more holistic view of design and repair and, hopefully, promote design for repairability.

Thoughts

In 10 years, whether we are driving cars or they are driving us, they will still need to be repaired. A technician will need to be well-versed in hardware, software, and networking. Troubleshooting will be much more complex as we deal with multiple interconnected computer systems. Just as I advise my son to keep up on the latest technologies, I would encourage anyone to look to the future as they make their educational plans.

Are the days of the shade-tree mechanic gone? What kind of education do you think it will take to repair the vehicles being introduced now? Let me know your thoughts.

About Kelly Brown

Kelly Brown is an IT professional and assistant professor of practice for the UO Applied Information Management Master’s Degree Program. He writes about IT and business topics that keep him up at night.