It started as a Hackathon idea.

17 October 2018


By now you have probably heard about Microsoft’s efforts to bring AI into different applications and markets. One of these efforts is the launch of the ‘Seeing AI’ app, promoted as “Turning the visual world into an audible experience” (Microsoft, 2018).

This app enables visually impaired people to understand what is happening around them by having their phone describe what it sees. Powered by visual recognition and artificial intelligence, the phone can recognize most of the objects and people around the user. When pointed at people, it will even say whether a person is in a good or a bad mood (Wilson, 2018).

The main functionalities of the app are describing places, situations, people and their emotions; reading text from signs or documents to the user, including handwriting; recognizing currencies; recognizing household products by scanning barcodes; identifying colours and the brightness of the surroundings; and guiding the user to move the phone in the right direction so the object of interest is fully in frame (Hollander, 2017). These features can enrich the life of a visually impaired person immensely. As an example, one of the developers of the app, Saqib Shaikh, who is visually impaired himself, explains nicely how he uses the app at work, for instance to know during meetings whether people are paying attention (Microsoft, 2017).
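To make the idea a bit more concrete, here is a minimal sketch of what a Seeing-AI-style pipeline could look like in Python: a camera frame goes into a vision model, and the resulting description is spoken aloud. The `describe_image` and `speak` functions are stand-ins I made up for illustration; Microsoft’s actual models and APIs are not public.

```python
# Illustrative sketch only: the model calls are stubs, not Microsoft's implementation.
from dataclasses import dataclass

@dataclass
class SceneDescription:
    caption: str          # e.g. "two people sitting at a table"
    emotions: list[str]   # one entry per detected face, e.g. ["happy", "neutral"]

def describe_image(image_bytes: bytes) -> SceneDescription:
    """Stand-in for a vision model (scene captioning + face/emotion analysis)."""
    # A real app would call a cloud or on-device model here.
    return SceneDescription(caption="a person sitting at a desk", emotions=["happy"])

def speak(text: str) -> None:
    """Stand-in for text-to-speech output."""
    print(f"[TTS] {text}")

def narrate(image_bytes: bytes) -> None:
    """Turn one camera frame into one spoken sentence."""
    scene = describe_image(image_bytes)
    sentence = scene.caption
    if scene.emotions:
        sentence += ", looking " + " and ".join(scene.emotions)
    speak(sentence)

narrate(b"...camera frame...")  # -> [TTS] a person sitting at a desk, looking happy
```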

The Seeing AI app, however, is still in beta. When trying it myself with friends, we all gained a few years when the app estimated our ages. Well, at least the rough description of our appearance was correct. It certainly is fun to try, and I believe the app can be of great help to people who are visually impaired. What is your opinion about it? Do you see a future for these kinds of digital assistants?

If you would like to learn more about the app and its functionalities, check out this video:

Sources:

Hollander, R. (2017). Microsoft’s newest app uses AI to narrate the world. Retrieved 16 October 2018, from https://www.businessinsider.com/microsoft-seeing-ai-narrate-the-world-2017-7?international=true&r=US&IR=T

Microsoft (2018). Seeing AI. Retrieved 16 October 2018, from https://www.microsoft.com/en-us/seeing-ai

Microsoft (2017). Microsoft’s Seeing AI app for visually impaired people released in the UK. Retrieved 17 October 2018, from https://news.microsoft.com/en-gb/2017/11/15/microsofts-seeing-ai-app-for-visually-impaired-people-released-in-the-uk/

Wilson, M. (2018). A phone app for the visually impaired. Retrieved 16 October 2018, from https://www.fastcompany.com/90227596/a-phone-app-for-the-visually-impaired


A taxi centre in the middle of the production hall

6 September 2018


When talking about a supermarket, you would assume we mean ‘Albert Heijn’ or another store selling groceries. Not so with the supermarket installed in production hall A11 at the Audi production plant in Neckarsulm. Here, all parts for the assembly of the A8 are stored and, in accordance with the production plan, prepared for pick-up. Logistics employees collect the necessary parts from the supermarket and place them in boxes on special wagons, ready to be delivered to the production line.

This is where it gets interesting: Automated Guided Vehicles (AGVs) transport these wagons to their destinations within the production line.

Audi argues that, due to the higher complexity of the new A8, these AGVs relieve the workload of its logistics employees instead of taking over their jobs. The 30 AGVs in the A8 production hall travel an average of 170 km per shift, at a maximum speed of 3.6 km/h for safety reasons.

The AGVs are connected via Wi-Fi to the so-called fleet manager, together forming the Driverless Transport System (DTS). The vehicles, equipped with GPS, an integrated map of the production plant and two laser scanners for orientation within the facilities, use intelligent navigation software to deliver the parts to the recipient precisely on time. Similar to a taxi service, the AGV closest to the next job is dispatched for the delivery. The DTS is also connected to an elevator, so the vehicles can move independently between the different levels of the production plant.
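As a rough illustration of that taxi-style dispatch rule, here is a short Python sketch that picks the closest idle AGV for a new job. The straight-line distance and all names are my own simplifying assumptions; the real fleet manager plans along the plant’s actual paths.

```python
# Sketch of nearest-idle-vehicle dispatch, assuming simple (x, y) coordinates.
import math

def dispatch(agvs: dict[str, tuple[float, float]],
             busy: set[str],
             job_location: tuple[float, float]) -> str | None:
    """Return the id of the idle AGV closest to the job, or None if all are busy."""
    idle = {name: pos for name, pos in agvs.items() if name not in busy}
    if not idle:
        return None
    return min(idle, key=lambda name: math.dist(idle[name], job_location))

fleet = {"agv-01": (0.0, 5.0), "agv-02": (12.0, 3.0), "agv-03": (7.0, 9.0)}
print(dispatch(fleet, busy={"agv-02"}, job_location=(10.0, 4.0)))  # -> agv-03
```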

Thanks to its machine learning capabilities, the DTS can work out optimal routes for each job and knows when to send an AGV back to its charging mat to recharge. The AGVs recognize the size and weight of the loaded wagon and adapt their safety field accordingly (if an obstacle is detected within this field, the AGV stops immediately). This allows humans and AGVs to use the same paths without risking collisions.
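The sources do not spell out how the safety field is computed, but the idea can be sketched: the zone the laser scanners monitor grows with speed and load, and any obstacle inside it triggers an immediate stop. The formula and constants below are illustrative assumptions, not Audi’s values.

```python
# Hypothetical load-dependent safety field: heavier loads brake more slowly,
# so the monitored zone in front of the AGV is made longer.

def safety_field_length(speed_kmh: float, load_kg: float,
                        base_m: float = 0.5, per_100kg_m: float = 0.2) -> float:
    """Length (metres) of the zone that triggers an immediate stop. Assumed formula."""
    speed_ms = speed_kmh / 3.6
    braking_margin = speed_ms * 1.0             # assume ~1 s worst-case stopping time
    load_margin = per_100kg_m * (load_kg / 100)  # longer field for heavier wagons
    return base_m + braking_margin + load_margin

def must_stop(obstacle_distance_m: float, speed_kmh: float, load_kg: float) -> bool:
    return obstacle_distance_m <= safety_field_length(speed_kmh, load_kg)

# At the 3.6 km/h maximum speed with a 400 kg wagon:
print(safety_field_length(3.6, 400))  # -> ≈ 2.3 m
print(must_stop(1.5, 3.6, 400))       # -> True: obstacle inside the field, stop now
```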

Seeing this technology working successfully alongside humans raises questions about future applications in our day-to-day lives: Could we imagine robots delivering food and medication in hospitals instead of nurses? Would we mind accepting a parcel from a DTS rather than from a postal employee? Or are we unwilling to sacrifice these moments of social contact for potentially higher efficiency?

Sources:

Ludwig, C. (2018). Audi’s logistics part 1: Prepared for a new reality. Retrieved 6 September 2018, from https://automotivelogistics.media/intelligence/130890

Först, L. (2018). That’s how a supervisor for driverless transport systems works. Retrieved 6 September 2018, from https://blog.audi.de/thats-how-a-supervisor-for-driverless-transport-systems-works/?lang=en

Först, L. (2018). Driverless transport vehicles in use for the Audi A8. Retrieved 6 September 2018, from https://blog.audi.de/driverless-transport-vehicles-in-use-for-the-audi-a8/?lang=en
