
TESLA – Artificial Intelligence & Autopilot – Tesla Bot

We develop and deploy autonomy at scale in vehicles, robots and more. We believe that an approach based on advanced AI for vision and planning, supported by efficient use of inference hardware, is the only way to achieve a general solution for full self-driving and beyond.

Hardware

Build silicon chips that power our full self-driving software from the ground up, taking every small architectural and micro-architectural improvement into account while pushing hard to squeeze maximum silicon performance-per-watt. Perform floor-planning, timing and power analyses on the design. Write robust, randomized tests and scoreboards to verify functionality and performance. Implement compilers and drivers to program and communicate with the chip, with a strong focus on performance optimization and power savings. Finally, validate the silicon chip and bring it to mass production.
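As a flavor of the randomized tests and scoreboards mentioned above, here is a minimal sketch in Python rather than a hardware verification language: random stimulus drives a toy device under test while a scoreboard checks every result against a golden reference model. The saturating adder and all names are illustrative assumptions, not Tesla's actual verification environment.

```python
# Toy randomized-test-plus-scoreboard sketch (illustrative, not Tesla's flow).
import random

def golden_sat_add(a, b, width=8):
    """Reference model: unsigned saturating add."""
    return min(a + b, (1 << width) - 1)

def dut_sat_add(a, b, width=8):
    """Device under test; a real flow would query an RTL simulation instead."""
    return min(a + b, (1 << width) - 1)

def scoreboard(trials=10_000, width=8, seed=42):
    """Drive random operand pairs and compare DUT output to the golden model."""
    rng = random.Random(seed)
    for _ in range(trials):
        a, b = rng.randrange(1 << width), rng.randrange(1 << width)
        expected, actual = golden_sat_add(a, b, width), dut_sat_add(a, b, width)
        assert actual == expected, f"mismatch: {a}+{b} -> {actual} != {expected}"
    print(f"{trials} randomized vectors passed")

scoreboard()
```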

Neural Networks

Apply cutting-edge research to train deep neural networks on problems ranging from perception to control. Our per-camera networks analyze raw images to perform semantic segmentation, object detection and monocular depth estimation. Our bird's-eye-view networks take video from all cameras to output the road layout, static infrastructure and 3D objects directly in the top-down view. Our networks learn from the most complicated and diverse scenarios in the world, iteratively sourced from our fleet of nearly 1M vehicles in real time. A full build of Autopilot neural networks involves 48 networks that take 70,000 GPU hours to train. Together, they output 1,000 distinct tensors (predictions) at each timestep.
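The following PyTorch sketch illustrates the per-camera plus bird's-eye-view pattern described above: a shared backbone runs over each camera image, the features are fused, and several prediction tensors are emitted per timestep via multi-task heads. All shapes, layer sizes and head names are assumptions, and the simple channel concatenation stands in for the geometric projection into the top-down view that a real system would use; this is not Tesla's actual architecture.

```python
import torch
import torch.nn as nn

class PerCameraBackbone(nn.Module):
    """Extracts features from one camera's raw image."""
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, 3, stride=2, padding=1), nn.ReLU(),
        )
    def forward(self, img):
        return self.features(img)

class BEVNet(nn.Module):
    """Fuses per-camera features into a top-down grid and predicts
    several tensors per timestep (multi-task heads)."""
    def __init__(self, num_cameras=8):
        super().__init__()
        self.backbone = PerCameraBackbone()
        self.fuse = nn.Conv2d(64 * num_cameras, 128, 1)
        # One head per prediction task; a full build emits many such tensors.
        self.road_layout = nn.Conv2d(128, 1, 1)   # drivable-space mask
        self.objects = nn.Conv2d(128, 16, 1)      # per-cell object attributes
    def forward(self, images):  # images: (B, num_cameras, 3, H, W)
        feats = [self.backbone(images[:, i]) for i in range(images.shape[1])]
        fused = self.fuse(torch.cat(feats, dim=1))
        return {"road_layout": self.road_layout(fused),
                "objects": self.objects(fused)}

net = BEVNet()
out = net(torch.randn(1, 8, 3, 128, 128))
print({k: tuple(v.shape) for k, v in out.items()})
```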

Autonomy Algorithms

Develop the core algorithms that drive the car by creating a high-fidelity representation of the world and planning trajectories in that space. In order to train the neural networks to predict such representations, algorithmically create accurate and large-scale ground truth data by combining information from the car’s sensors across space and time. Use state-of-the-art techniques to build a robust planning and decision-making system that operates in complicated real-world situations under uncertainty. Evaluate your algorithms at the scale of the entire Tesla fleet.
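A minimal sketch of the "combine information across space and time" idea for auto-generated ground truth: many noisy single-frame estimates of a static object are fused into one refined label that is more accurate than any individual observation. The data, confidence weighting and NumPy implementation are illustrative assumptions.

```python
import numpy as np

def fuse_static_object(detections, confidences):
    """detections: (T, 3) noisy xyz estimates over T frames;
    confidences: (T,) per-frame detection confidence in [0, 1].
    Returns a confidence-weighted mean position."""
    w = np.asarray(confidences)
    return (np.asarray(detections) * w[:, None]).sum(0) / w.sum()

rng = np.random.default_rng(0)
true_pos = np.array([12.0, -3.0, 0.5])              # ground-truth location
obs = true_pos + rng.normal(0, 0.5, size=(50, 3))   # 50 noisy observations
conf = rng.uniform(0.5, 1.0, size=50)
print(fuse_static_object(obs, conf))                # close to true_pos
```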

Code Foundations

Throughput, latency, correctness and determinism are the main metrics we optimize our code for. Build the Autopilot software foundations up from the lowest levels of the stack, tightly integrating with our custom hardware. Implement super-reliable bootloaders with support for over-the-air updates and bring up customized Linux kernels. Write fast, memory-efficient low-level code to capture high-frequency, high-volume data from our sensors, and to share it with multiple consumer processes without impacting central memory access latency or starving critical functional code of CPU cycles. Squeeze and pipeline compute across a variety of hardware processing units, distributed across multiple systems-on-chip.
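A toy sketch of the "capture once, share with many consumers" pattern: a single-producer ring buffer in shared memory that multiple processes can map and read without re-copying the stream. It is written in Python for brevity (a real implementation would be low-level C/C++ and would also guard against torn reads from a racing writer); slot count, frame size and all names are assumptions.

```python
import numpy as np
from multiprocessing import shared_memory

SLOTS, FRAME_BYTES = 8, 64 * 1024  # ring of 8 fixed-size frame slots

class FrameRing:
    def __init__(self, name=None, create=False):
        size = SLOTS * FRAME_BYTES + 8  # payload + 8-byte write counter
        self.shm = shared_memory.SharedMemory(name=name, create=create, size=size)
        self.counter = np.ndarray((1,), np.uint64, self.shm.buf, 0)
        self.slots = np.ndarray((SLOTS, FRAME_BYTES), np.uint8, self.shm.buf, 8)

    def publish(self, frame: bytes):        # single producer only
        i = int(self.counter[0]) % SLOTS
        self.slots[i, :len(frame)] = np.frombuffer(frame, np.uint8)
        self.counter[0] += 1                # advance counter after the write

    def latest(self):                       # any consumer reads the newest slot
        n = int(self.counter[0])
        return None if n == 0 else bytes(self.slots[(n - 1) % SLOTS])

ring = FrameRing(create=True)
ring.publish(b"sensor frame 0")
print(ring.latest()[:14])
del ring.counter, ring.slots                # release views before cleanup
ring.shm.close(); ring.shm.unlink()
```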

Evaluation Infrastructure

Build open- and closed-loop, hardware-in-the-loop evaluation tools and infrastructure at scale, to accelerate the pace of innovation, track performance improvements and prevent regressions. Leverage anonymized characteristic clips from our fleet and integrate them into large suites of test cases. Write code simulating our real-world environment, producing highly realistic graphics and other sensor data that feed our Autopilot software for live debugging or automated testing.
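A minimal closed-loop regression test in the spirit described above: a toy simulator feeds the software under test, the commands feed back into the simulator, and the run fails if a tracked metric regresses past a stored baseline. The vehicle model, controller and threshold are illustrative stand-ins.

```python
def simulate_lane_keeping(controller, steps=200, dt=0.05):
    """Closed loop over a 1-D lateral-offset model; returns worst deviation (m)."""
    offset, velocity, worst = 0.5, 0.0, 0.0   # start half a metre off-centre
    for _ in range(steps):
        steer = controller(offset)            # software under test
        velocity += steer * dt
        offset += velocity * dt
        worst = max(worst, abs(offset))
    return worst

def pd_controller(offset, last={"prev": 0.0}):
    """Toy proportional-derivative stand-in for the real planner."""
    d = offset - last["prev"]; last["prev"] = offset
    return -4.0 * offset - 8.0 * d

BASELINE_WORST_OFFSET = 0.6  # metres, from a previously approved build
worst = simulate_lane_keeping(pd_controller)
assert worst <= BASELINE_WORST_OFFSET, f"regression: {worst:.2f} m"
print(f"worst lateral offset {worst:.2f} m (baseline {BASELINE_WORST_OFFSET} m)")
```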

Tesla Bot

Develop the next generation of automation, including a general-purpose, bipedal, humanoid robot capable of performing tasks that are unsafe, repetitive or boring. We’re seeking mechanical, electrical, controls and software engineers to help us leverage our AI expertise beyond our vehicle fleet.

ARTIFICIAL INTELLIGENCE – HENNESSY X REFIK ANADOL

The Hennessy V.S.O.P blend is the expression of eight generations of Master Blenders’ know-how. To perpetuate the legacy of the original Hennessy V.S.O.P Privilège, Hennessy Master Blenders have constantly sought to create a completely harmonious blend: it is the definitive expression of a perfectly balanced cognac. Based on a selection of firmly structured eaux-de-vie, aged largely in partially used barrels in order to take on subtle levels of oak tannins, this highly characterful cognac reveals balanced aromas of fresh vanilla, cinnamon and toasty notes, all coming together with a seamless perfection.


FEATURE STORY

HENNESSY x REFIK ANADOL

Hennessy teams with the internationally acclaimed artist and director Refik Anadol to reveal the emotion behind Hennessy V.S.O.P Privilège.

ARTIST

A media artist

Refik Anadol is a media artist, director and pioneer in the aesthetics of data and machine intelligence. His body of work locates creativity at the intersection of humans and machines. In taking the data that flows around us as the primary material and the neural network of a computerized mind as a collaborator, Anadol paints with a thinking brush, offering us radical visualizations of our digitized memories. Anadol’s site-specific AI data sculptures, live audio/visual performances, and immersive installations take many forms, while encouraging us to rethink our engagement with the physical world, its temporal and spatial dimensions, and the creative potential of machines.

“For me, data is a memory, and memory is heritage. And, I’m trying to find these collective memories for humanity which would represent heritage for humanity. So, I think there’s a common respect we have for heritage when thinking about and producing experiences. The other thing is caring about the uniqueness, and craftsmanship – that’s something I respect a lot”, said Refik Anadol.

His inspiration

“My initial inspiration to collaborate with Hennessy came from the people Hennessy had previously collaborated with, including Frank Gehry and Ridley Scott. Hennessy cares about the values of creation, the values of imagination, but also how to preserve uniqueness and freshness. This heritage (and that keyword is my true inspiration), and the people that Hennessy collaborates with who are also my heroes, as well as the opportunity to imagine something that resides in the same space were all things that I factored in”, said Refik Anadol.


COLLABORATION

REFIK ANADOL IN COGNAC

“When I went to the Château de Bagnolet, I was convinced I could create something fresh because the place is about memories and dreams. Then, when I saw the cellars, and experienced the smell… I don’t know exactly how to describe that feeling. In the world of technology you never feel Time. But when you go back to the history, it hits you: it’s just human inspiration.” The visit marked the first time in the Maison’s history that an artist was allowed to capture the Tasting Committee’s time-honored ritual in real time via neuroscientific research methods, and to use the collected data in collaboration with machine intelligence to create an unprecedented work of art.

THE ARTWORK

Using 3D data mapping, Refik Anadol interpreted and transcribed the Tasting Committee’s emotions into the color, shapes, reliefs and textures that appear on the 2021 Hennessy V.S.O.P Privilège Limited Edition. What was once an invisible sensory experience has suddenly become tangible: the power of balance appears in a harmonious and poetic surface design. Data becomes art in a visual metaphor for a blend; like the cognac itself, Sense of Heritage, the artwork, is designed to be appreciated on an individual, sensorial level.


WALL OF STORIES


ARTIST REFIK ANADOL TURNS HENNESSY V.S.O.P PRIVILÈGE COGNAC DECANTER INTO AR

HENNESSY’S COLLABORATION WITH REFIK ANADOL: BLENDING ART AND SCIENCE

PRESERVING HERITAGE IN THE AGE OF ARTIFICIAL INTELLIGENCE


EQS with unique MBUX Hyperscreen: the big in-car cinema – an assistant for the driver and front passenger that is constantly learning, thanks to artificial intelligence


Visually impressive, radically easy to operate and extremely eager to learn: the MBUX Hyperscreen is one of the highlights in the EQS. It represents the emotional intelligence of the all-electric upper-class model: the large, curved screen unit stretches almost the entire width from the left to the right A-pillar. In addition to its sheer size, the high-quality, detail-rich design also provides a “wow” effect. This aesthetic high-tech look is the emotional dimension of the MBUX Hyperscreen. Added to this is artificial intelligence (AI): with software capable of learning, the display and operating concept adapts completely to its user and makes personalised suggestions for numerous infotainment, comfort and vehicle functions. Thanks to the so-called zero layer, the user does not have to scroll through submenus or give voice commands; the most important applications are always offered in a situational and contextual way at the top level, in view. Numerous operating steps are thus taken away from the EQS driver, and not only from the driver: the MBUX Hyperscreen is also an attentive assistant for the front passenger, who gets a dedicated display and operating area.

MBUX (Mercedes-Benz User Experience) has radically simplified the operation of a Mercedes-Benz. Since its unveiling in 2018 in the current A-Class, more than 1.8 million Mercedes-Benz passenger cars equipped with it have come onto the roads worldwide. The Van division is also relying on MBUX. A few months ago, the second generation of this learning-capable system debuted in the new S-Class. The next big step now follows in the form of the new EQS and the optionally available MBUX Hyperscreen.

“With our MBUX Hyperscreen, a design vision becomes reality,” says Gorden Wagener, Chief Design Officer Daimler Group. “We merge technology with design in a fascinating way that offers the customer unprecedented ease of use. We love simplicity, and we have reached a new level of MBUX.”

“The MBUX Hyperscreen is both the brain and nervous system of the car”, says Sajjad Khan, Member of the Board of Management of Mercedes-Benz AG and CTO. “The MBUX Hyperscreen continually gets to know the customer better and delivers a tailored, personalised infotainment and operating offering before the occupant even has to click or scroll anywhere.”

Electrifying appearance with emotional visualization

The MBUX Hyperscreen is an example of digital/analogue design fusion: several displays appear to blend seamlessly, resulting in an impressive, curved screen band. Analogue air vents are integrated into this large digital surface to connect the digital and physical world.

The MBUX Hyperscreen is surrounded by a continuous plastic front frame. Its visible part is painted in an elaborate three-layer process in “Silver Shadow”. This coating system achieves a particularly high-quality surface impression due to extremely thin intermediate layers. The integrated ambient lighting installed in the lower part of the MBUX Hyperscreen makes the display unit appear to float on the instrument panel.

The front passenger also has their own display and operating area, which makes travel more pleasant and entertaining. With up to seven profiles, the content can be customised. However, the entertainment functions of the passenger display are only available during the journey within the framework of the country-specific legal regulations. If the passenger seat is not occupied, the screen becomes a digital decorative surface. In this case, animated stars, i.e. the Mercedes-Benz pattern, are displayed.

For a particularly brilliant image, OLED technology is used in the central and passenger displays. Here, the individual pixels are self-luminous; pixels that are not driven remain switched off and therefore appear deep black. The active OLED pixels, on the other hand, radiate with high colour brilliance, resulting in high contrast values regardless of the viewing angle and lighting conditions.

This electrifying display appearance goes hand in hand with emotionally appealing visualisation. All the graphics are styled in a new blue/orange colour scheme throughout. The classic cockpit display with two circular instruments has been reinterpreted with a digital laser sword in a glass lens.

Thanks to its clear screen design with anchor points, the MBUX Hyperscreen is intuitive and easy to operate. An example of this is the EV mode display style. Important functions of the electric drive, such as boost or recuperation, are visualised in a new way with spatially moving clasps and thus made tangible. A lens-shaped object moves between these clasps; it follows gravity and thus depicts the G-forces impressively and emotionally.

Personalised suggestions with the aid of artificial intelligence

Infotainment systems offer numerous and comprehensive functions. Several operating steps are often required to control them. In order to further reduce these interaction steps, Mercedes-Benz has developed a user interface with context-sensitive awareness with the help of artificial intelligence.

The MBUX system proactively displays the right functions at the right time for the user, supported by artificial intelligence (see below for examples). The context-sensitive awareness is constantly optimised by changes in the surroundings and user behaviour. The so-called zero layer provides the user, at the top level of the MBUX information architecture, with dynamic, aggregated content from the entire MBUX system and related services.

Mercedes-Benz has investigated the usage behaviour of the first MBUX generation. Most of the use cases fall into the Navigation, Radio/Media and Telephony categories; the navigation application is therefore always at the center of the screen unit with full functionality.

Over 20 further functions – from the active massage programme and the birthday reminder to suggestions for the to-do list – are automatically offered with the aid of artificial intelligence when they are relevant to the customer. “Magic Modules” is the in-house name the developers have given to these suggestion modules, which are shown on the zero layer.

Here are four use cases; a minimal code sketch of the underlying pattern follows the list. The user can accept or reject the respective suggestion with just one click:

  • If you always call a certain friend on the way home on Tuesday evenings, you will be asked to make a corresponding call on that day of the week and at that time of day. A business card with the friend’s contact information appears and, if stored, their photo. All MBUX suggestions are linked to the user’s profile. If someone else drives the EQS on a Tuesday evening, this recommendation is not made, or a different one is, depending on the preferences of the other user.
  • If the EQS driver regularly uses the massage function according to the hot stone principle in winter, the system learns and automatically suggests the comfort function in wintry temperatures.
  • If the user regularly switches on the heating of the steering wheel and other surfaces together with the seat heating, for example, this is suggested as soon as they press the seat heating.
  • The chassis of the EQS can be lifted to provide more ground clearance – a useful function for steep garage entrances or speed bumps. MBUX remembers the GPS position at which the user made use of the “Vehicle Lift-Up” function. If the vehicle approaches that GPS position again, MBUX independently suggests raising the EQS.
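The sketch below illustrates the recurring-habit pattern behind these suggestions: count how often an action recurs in the same context (profile, weekday, hour) and surface it as a one-click suggestion once a threshold is crossed. This is an assumption-laden toy, not Mercedes-Benz’s implementation; the threshold and context keys are illustrative.

```python
from collections import Counter

class SuggestionEngine:
    HABIT_THRESHOLD = 3  # times the same context/action pair must recur

    def __init__(self):
        self.history = Counter()

    def record(self, profile, weekday, hour, action):
        self.history[(profile, weekday, hour, action)] += 1

    def suggest(self, profile, weekday, hour):
        """Return recurring actions for this user and time slot, if any."""
        return [action for (p, d, h, action), n in self.history.items()
                if (p, d, h) == (profile, weekday, hour)
                and n >= self.HABIT_THRESHOLD]

engine = SuggestionEngine()
for _ in range(3):  # three Tuesday-evening calls to the same friend
    engine.record("driver_a", "Tue", 19, "call:friend")
print(engine.suggest("driver_a", "Tue", 19))  # ['call:friend']
print(engine.suggest("driver_b", "Tue", 19))  # [] -- suggestions are per-profile
```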

Interesting facts & figures

With the MBUX Hyperscreen, several displays appear to merge seamlessly, resulting in an impressive 141-centimetre-wide, curved screen band. The area that passengers can experience is 2,432.11 cm².

The large glass cover display is curved three-dimensionally in the moulding process at temperatures of approx. 650°C. This process allows a distortion-free view of the display unit across the entire width of the vehicle, irrespective of the display cover radius.

To get to the most important applications, the user must scroll through 0 menu levels. That’s why Mercedes-Benz calls this the zero layer.

There are a total of 12 actuators beneath the touchscreen for haptic feedback during operation. If a finger touches certain points on the screen, they trigger a tangible vibration in the cover plate.

Two coatings of the cover plate reduce reflections and make cleaning easier. The curved glass itself consists of particularly scratch-resistant aluminium silicate.

The safety measures include predetermined breaking points alongside the side outlet openings as well as five holders which can yield in a targeted manner in a crash thanks to their honeycomb structure.

8 CPU cores, 24 gigabytes of RAM and 46.4 GB per second of RAM memory bandwidth are some of the MBUX technical specifications.

Using the measurement data of one multifunction camera and one light sensor, the brightness of the screen is adapted to the ambient conditions.

With up to seven profiles, the display section can be individualised for the front passenger.

Companies collaborate to make video analytics solutions more accessible in order to drive better business outcomes

Sony Semiconductor Solutions (Sony) and Microsoft Corp. (Microsoft) today announced they are partnering to create solutions that make AI-powered smart cameras and video analytics easier to access and deploy for their mutual customers.

As a result of the partnership, the companies will embed Microsoft Azure AI capabilities on Sony’s intelligent vision sensor IMX500, which extracts useful information out of images in smart cameras and other devices. Sony will also create a smart camera managed app powered by Azure IoT and Cognitive Services that complements the IMX500 sensor and expands the range and capability of video analytics opportunities for enterprise customers. The combination of these two solutions will bring together Sony’s cutting-edge imaging and sensing technologies, including the unique functionality of high-speed edge AI processing, with Microsoft’s cloud expertise and AI platform to uncover new video analytics opportunities for customers and partners across a variety of industries.

“By linking Sony’s innovative imaging and sensing technology with Microsoft’s excellent cloud AI services, we will deliver a powerful and convenient platform to the smart camera market. Through this platform, we hope to support the creativity of our partners and contribute to overcoming challenges in various industries,” said Terushi Shimizu, Representative Director and President, Sony Semiconductor Solutions Corporation.

“Video analytics and smart cameras can drive better business insights and outcomes across a wide range of scenarios for businesses,” said Takeshi Numoto, corporate vice president and commercial chief marketing officer at Microsoft. “Through this partnership, we’re combining Microsoft’s expertise in providing trusted, enterprise-grade AI and analytics solutions with Sony’s established leadership in the imaging sensors market to help uncover new opportunities for our mutual customers and partners.”

Video analytics has emerged as a way for enterprise customers across industries to uncover new revenue opportunities, streamline operations and solve challenges. For example, retailers can use smart cameras to detect when to refill products on a shelf or to better understand the optimal number of available open checkout counters according to the queue length. Additionally, a manufacturer might use a smart camera to identify hazards on its manufacturing floor in real time before injuries occur. Traditionally, however, such applications — which rely on gathering data distributed among many smart cameras across different sites like stores, warehouses and distribution centers — struggle to optimize the allocation of compute resources, resulting in cost or power consumption increases.

To address these challenges, Sony and Microsoft will partner to simplify access to computer vision solutions by embedding Azure AI technology from Microsoft into Sony’s intelligent vision sensor IMX500 as well as enabling partners to embed their own AI models. This integration will result in smarter, more advanced cameras for use in enterprise scenarios as well as a more efficient allocation of resources between the edge and the cloud to drive cost and power consumption efficiencies.
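A small sketch of the edge/cloud split being described: inference runs on the device, and only compact detection metadata, not video, is shipped to the cloud, which is where the bandwidth, cost and power savings come from. The detection stub and message shape are assumptions for illustration; the real IMX500 and Azure APIs differ.

```python
import json, time

def edge_infer(frame) -> list[dict]:
    """Stand-in for the on-sensor model; returns detections, not pixels."""
    return [{"label": "person", "score": 0.91, "box": [120, 80, 64, 128]}]

def to_cloud_message(camera_id: str, detections: list[dict]) -> str:
    """Compact metadata payload: a few hundred bytes instead of a frame."""
    return json.dumps({"camera": camera_id, "ts": time.time(),
                       "detections": detections})

frame = object()  # placeholder for a captured image
msg = to_cloud_message("store-12/shelf-3", edge_infer(frame))
print(len(msg), "bytes sent instead of a multi-megabyte frame")
print(msg)
```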

Sony’s smart camera managed app powered by Azure is targeted toward independent software vendors (ISVs) specializing in computer vision and video analytics solutions, as well as smart camera original equipment manufacturers (OEMs) aspiring to add value to their hardware offerings. The app will complement the IMX500 sensor and will serve as the foundation on which ISVs and OEMs can train AI models to create their own customer- and industry-specific video analytics and computer vision solutions that address enterprise customer demands. The app will simplify key workflows and take reasonable security measures designed to protect data privacy and security, allowing ISVs to spend less time on routine, low-value integration and provisioning work and more time on creating unique solutions to meet customers’ demands.

It will also enable enterprise customers to more easily find, train and deploy AI models for video analytics scenarios.

As part of the partnership, Microsoft and Sony will also work together to facilitate hands-on co-innovation with partners and enterprise customers in the areas of computer vision and video analytics as part of Microsoft’s AI & IoT Insider Labs program. Microsoft’s AI & IoT Insider Labs offer access and facilities to build, develop, prototype and test customer solutions, working in partnership with Microsoft experts and other solution providers like Sony. The companies will begin working with select customers within these co-innovation centers later this year.

About Microsoft

Microsoft (Nasdaq “MSFT” @microsoft) enables digital transformation for the era of an intelligent cloud and an intelligent edge. Its mission is to empower every person and every organization on the planet to achieve more.

About Sony Semiconductor Solutions

Sony Semiconductor Solutions Corporation is the global leader in image sensors. We strive to provide advanced imaging technologies that bring greater convenience and joy to people’s lives. In addition, we also work to develop and bring to market new kinds of sensing technologies with the aim of offering various solutions that will take the visual and recognition capabilities of both humans and machines to greater heights. For more information, please visit: https://www.sony-semicon.co.jp/e/.