Deploying machine learning models in the real world tends to uncover domain coverage issues. One way to make models more robust is to generate unseen data on which the model is still expected to work. So let’s see how property-based testing can address this issue! Read More
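To make the property-based idea concrete, here is a minimal hand-rolled sketch using only the standard library. The `classify` model and the case-invariance property are illustrative stand-ins, not from the original post; libraries such as Hypothesis automate this generate-and-check pattern.

```python
import random
import string

def classify(text: str) -> str:
    """Toy stand-in for an ML model: flags messages containing 'urgent'."""
    return "alert" if "urgent" in text.lower() else "ok"

def test_case_invariance(trials: int = 200, seed: int = 0) -> None:
    """Property-based test: generate unseen inputs, check an invariant.

    Property: the prediction must not change if we randomly flip letter
    case, since casing should carry no meaning for this model.
    """
    rng = random.Random(seed)
    for _ in range(trials):
        # Generate a random message the model has never seen.
        words = ["".join(rng.choices(string.ascii_lowercase, k=rng.randint(1, 8)))
                 for _ in range(rng.randint(1, 6))]
        if rng.random() < 0.5:
            words.insert(rng.randrange(len(words) + 1), "urgent")
        text = " ".join(words)
        # Mutate the input without changing its meaning.
        mutated = "".join(c.upper() if rng.random() < 0.5 else c for c in text)
        assert classify(text) == classify(mutated), (text, mutated)

test_case_invariance()
```

The point is that the test asserts a property over thousands of generated inputs rather than a fixed expected output on a handful of hand-picked examples, which is exactly the kind of coverage gap deployment tends to expose.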
There is an old saying that you ought not to look in the kitchen when you go to a restaurant or eatery for a bite to eat. When you see how the food is being prepared, it might just make you sick to your stomach. You can apply this rule to just about any entity that makes any kind of product. Imagine if a toaster was being manufactured in a faulty manner and was likely to catch fire when put into use. We would undoubtedly welcome having an insider who worked at the company making the toaster come forward beforehand. Read More
Argo AI, Ford, and Lyft have formed the first large-scale collaboration of its kind in the US, bringing together a self-driving developer, a carmaker, and a ride-hailing company. They will provide robotaxi service to Lyft customers in Miami and Austin, expected to launch later this year in Miami and next year in Austin. Read More
New information and solutions bring new experiences to users, and they benefit both the user and the provider. The prospect of exciting news is a major factor keeping users on the path of following new services. For example, suppose you are planning a trip and choosing your destination. You may be interested in every possible way to answer your question, and based on your past searches, different modes of travel, such as train or flight, can be suggested. This information has a significant impact on your decisions.

Amazon and Netflix work hard to personalize marketing and provide the best experience to their customers. Machine learning is a way to automate this personalization, using all available data from one customer, and from all customers, to serve others. Personalization aims to tailor the process to each individual, and a machine learning model can accelerate and optimize this by refining its model for each characteristic.

Speech recognition

Artificial intelligence algorithms have many new applications in the automotive industry, assisting passengers and drivers with multimedia and navigation as well as areas such as perception and behavior planning. Passengers and drivers like to take advantage of personalization, enjoy having their habits automated, and want to know the experiences of other passengers or drivers in the same situation.

One of the most basic personalization applications in the car is recognizing the speech of the driver, as the owner or regular user, or of passengers who use the car regularly. Providing the special features or preferences of a specific driver or user makes travel more enjoyable and saves time and money. If all user information is available in every car, this capability can be extended to all users.
This means that everyone has a secure key enabling speech recognition that is accessible everywhere.

Personalization in services

Artificial intelligence can recommend products or services to customers based on customer profiles. Complete customer profiles, updated in real time, can help provide services or products that are genuinely useful. For example, if you have added new goods to your shopping list, then while you are driving through the city, AI can keep you informed of the nearest stores offering the best prices on the items on your list. If AI is aware of your needs and interests, there are many personalization applications for it. This kind of support is usually what a good friend would do for us; otherwise you need time to search for and find information about your needs.

Data privacy is an issue that can hinder the use of artificial intelligence as your assistant in any technology. AI knowing more about us can be a problem, because we do not know who can access our data. However, as more and more applications change our lives, we will likely accept AI access to some of our private data in some form, provided the benefits to us are significant.

Risk analysis and assessment

Autonomous driving requires analyzing large amounts of data and predicting and deciding on proper behavior, even better than a human driver. For this reason, the safety of autonomous vehicles remains a critical factor in this technology and will determine whether it is mature enough to launch. Safety experts are aware of this sensitivity and are looking for new solutions. Hazard analysis cannot be performed at design time, as we did before, because an autonomous vehicle faces a large amount of uncertainty. Risks must be analyzed and mitigated at runtime. Personalization could simplify this for Level 3 autonomy, where the driver and vehicle share the responsibility for driving.
Artificial intelligence algorithms collect information about drivers’ behavior and reactions while driving, classify drivers, and learn which driving tasks should be most closely supervised. Algorithms can check whether the driver needs to pay more attention to driving in certain situations.

E2E predictive maintenance

Car troubleshooting allows the vehicle to be tracked in real time, breakdowns to be identified, and the driver to be notified to take the necessary action. For this purpose, a large amount of vehicle data must be stored and analyzed by the AI algorithm. The vehicle informs the driver about possible misbehavior of hardware or software, and about the actions needed to resolve the situation.

To provide end-to-end (E2E) predictive maintenance, AI prediction algorithms require large amounts of data from the vehicle, the owner, and how the owner uses the car. This is where AI comes in to personalize the experience and analyze each vehicle individually. Exchanging real-time experience between customers facing a common problem could be another application built on this per-vehicle analysis. Some such applications are not supported, or even wanted, by vehicle manufacturers, but they can be very reasonable and attractive from the customer’s point of view.

Summing up

Artificial intelligence plays an important role in the automotive industry, from taking control of driving tasks to achieve Level 4 and 5 autonomy, to applications such as monitoring the driver’s behavior and eyes to ensure the driver is ready to take over from the AI in Level 3 autonomy. Personalization will be an essential part of this transformation, and it is our job to decide for which responsibilities we will use the benefits of AI and for which we will not. Unfortunately, this question is not easy to answer, because many technological, ethical, safety, and data security aspects still need to be explored to find the best solutions. Read More
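The E2E predictive-maintenance monitoring described above can be sketched as a simple rolling-baseline anomaly detector: flag a sensor reading when it deviates strongly from the recent history. This is an illustrative stand-in, not the article's method; the window size, threshold, and sensor values are assumptions.

```python
import statistics

def flag_anomalies(readings, window=10, threshold=3.0):
    """Flag indices of readings that deviate strongly from the rolling
    baseline -- a minimal stand-in for predictive-maintenance monitoring."""
    flagged = []
    for i in range(window, len(readings)):
        recent = readings[i - window:i]
        mean = statistics.fmean(recent)
        # Guard against a zero stdev on a flat baseline.
        stdev = statistics.pstdev(recent) or 1e-9
        if abs(readings[i] - mean) / stdev > threshold:
            flagged.append(i)
    return flagged
```

For example, a coolant-temperature trace that sits near 50 and suddenly jumps to 90 would have the spike's index flagged, which is the moment the vehicle would notify the driver.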
This post was originally published by Allison Proffitt at AI Trends. The use of confidential computing for the AI self-driving car fleet cloud could make it more difficult for hackers to launch a cyberattack. (Credit:… Read More
Algolux, a computer vision startup that builds software for advanced driver assistance systems (ADAS) and for autonomous vehicles, has secured $18.4 million in new Series B funding from a group of investors that includes General Motors’ investment division, GM Ventures.
The new funding, which raises the Montreal, Canada-based company’s total funding to $36.8 million so far, was co-led by investors Forte Ventures and Drive Capital. Other investors include Investissement Quebec, Castor Ventures, Nikon-SBI Innovation Fund, GM Ventures, Generation Ventures and Intact Ventures.
The fresh influx of cash will be used by Algolux to promote its computer vision and image-optimization technologies to vehicle makers for use in their future vehicles, according to the company. Algolux will also use the money to expand its engineering and marketing teams while exploring additional vertical markets for its technologies. The company announced the latest funding round on Monday, July 12.
The company’s computer vision software is used with in-vehicle cameras as part of ADAS and autonomous vehicles in a market that is continuing to grow in use and popularity.
“Unfortunately, vision – the most widely deployed component of the overall perception stack – is still hampered by performance issues in low light and poor weather conditions making SAE Levels 2 and above more challenging to support,” the company said in its press release.
To battle this problem, Algolux uses computational imaging to design algorithms that treat the camera as part of the overall perception stack, which is a departure from the traditional siloed approach, according to the company. This approach resolves problems such as low light, low contrast and obstructions for object detection, imaging and geometric estimation, which provides clearer images and resolution. The use of the physical camera models also reduces training data needs by an order of magnitude, resulting in Algolux technologies outperforming commercial solutions by as much as 60 points in mean average precision (mAP), according to the company.
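The mAP figure cited above can be made concrete. Mean average precision averages, over object classes, the precision measured at each correctly detected object in a confidence-ranked list; a "60 point" gap means a difference of 0.60 on this 0-to-1 scale. The sketch below is a toy illustration with made-up numbers, not Algolux's evaluation.

```python
def average_precision(ranked_hits, num_positives):
    """AP for one class: mean of the precision measured at each rank
    where a true positive occurs, over all ground-truth positives."""
    hits = 0
    precisions = []
    for rank, is_hit in enumerate(ranked_hits, start=1):
        if is_hit:
            hits += 1
            precisions.append(hits / rank)
    return sum(precisions) / num_positives if num_positives else 0.0

def mean_average_precision(per_class):
    """mAP: the mean of per-class APs; per_class holds (ranked_hits, num_positives)."""
    aps = [average_precision(hits, n) for hits, n in per_class]
    return sum(aps) / len(aps)
```

With detections ranked [hit, miss, hit, hit] and three ground-truth objects, AP is (1/1 + 2/3 + 3/4) / 3 ≈ 0.81; averaging such values across classes gives mAP.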
“We are thrilled to be taking this next step in the company’s trajectory and to do so with the trust and support of outstanding investors,” Allan Benchetrit, the CEO of Algolux, said in a statement. “Algolux is actively engaged with leading OEMs, Tier 1s, and Tier 2s globally. The consistent theme is a desire from customers to significantly improve the performance of their driving and parking vision systems in even the most challenging real-world situations.”
Shelly Kramer, a founding partner and lead analyst with Futurum Research, told EnterpriseAI that Algolux’s latest funding news is an indicator of just how important computer vision is and how it will continue to move forward in the automotive sector.
“The fact that camera-based advanced driver assistance systems are table stakes when it comes to driving experiences today – both driver-led and autonomous – combined with the fact that camera tech still has a long way to go in terms of functionality and accuracy, means this is good news for the industry,” said Kramer. “Algolux’s computational imaging as part of the algorithm design process bodes well for all those days when my car’s camera tells me it can’t see because of weather conditions — and for the computer vision industry and the automotive industries. This is especially good news for the trucking industry and autonomous vehicles. This is an industry, and a company, to watch.”
James Kobielus, senior research director for data communications and management at TDWI, a data analytics consultancy, said the computer vision market today is “extraordinarily overcrowded” and that its use for automotive safety still has a long way to go before it is ready for primetime deployment.
“I am impressed with Algolux’s focus on AI-powered cameras for robust perception in all conditions,” said Kobielus. “It approaches visual imaging as an integral, but not self-sufficient, component of the automotive perception stack. Without supplementary sensing inputs–such as radar, LiDAR, infrared, and ultrasound—and the composite AI to tie it all together in real time, automotive computer vision systems are extremely prone to mistakes from ever-present visual phenomena, such as low lighting, low contrast, and obstructed sightlines.”
The larger trend in the marketplace is the deployment of AI-driven perception stacks in which computer vision is essentially the sum of all sensor inputs that can be rendered as visual patterns, said Kobielus.
“Through sophisticated AI, it is increasingly possible to infer a highly accurate visual portrait from the radio frequency signals that people and objects reflect, the pressure and vibrations they generate, and the heat patterns that they radiate,” he said. “Algolux will need the funding to invest in the R&D necessary to improve its composite AI and to work with industry partners to build it into the ASICs necessary for ADAS safety applications.”
A little bird told me that Tesla soon plans to hold an Artificial Intelligence Day. Or was it a drone? Actually, neither…it was Elon Musk himself. And of course, he didn’t call me personally, instead sharing the welcome news on Twitter.
In a post earlier this week, the electric vehicle & clean energy company’s CEO said the event will likely take place “in about a month or so” and will “go over progress with Tesla AI software & hardware, both training & inference”. He closed with a few words bound to get a few engines revving: “purpose is recruiting”. Ooh, interesting! Read More
What is the technology stack you need to create fully autonomous vehicles? Companies and researchers are divided on the answer to that question. Approaches to autonomous driving range from just cameras and computer vision to a combination of computer vision and advanced sensors. Tesla has been a vocal champion for the pure vision-based approach to autonomous driving, and in this year’s Conference on Computer Vision and Pattern Recognition (CVPR), its chief AI scientist Andrej Karpathy explained why. Read More
A lot of autonomous tech will also end up in human-driven cars. This might seem surprising. Many assume that the tech devised to aid AI-based autonomous driving would be used solely by autonomous vehicles. But it turns out that there are numerous handy ways in which the tech derived for self-driving cars can be leveraged for human-driven vehicles. Read More
Blurring techniques can potentially offer a level of privacy for those who might be captured in an image or a video. Read More
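The blurring idea can be illustrated with a minimal box blur over a grayscale image, written here in pure Python as a sketch; real privacy pipelines combine face or plate detection with much stronger obfuscation than this.

```python
def box_blur(img, radius=1):
    """Blur a grayscale image (list of lists of ints) by averaging each
    pixel with its neighbors in a (2*radius+1)^2 window, clamped at edges."""
    h, w = len(img), len(img[0])
    out = [[0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            total = count = 0
            for dy in range(-radius, radius + 1):
                for dx in range(-radius, radius + 1):
                    ny, nx = y + dy, x + dx
                    if 0 <= ny < h and 0 <= nx < w:
                        total += img[ny][nx]
                        count += 1
            out[y][x] = total // count  # integer average over the window
    return out
```

Applied to a detected face region, repeated passes of such an averaging filter smear out identifying detail while leaving the rest of the frame untouched.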
One approach to achieving a self-driving or driverless car would be to make a robot that could drive a conventional human-driven car in place of the human. Thus, rather than building the driving capabilities directly into a car, the robot would be able to get into a normal car and drive it. This kind of exploratory activity is taking place in some research labs but is generally a long way from being practical. Read More
Minecraft seems to be everywhere. Now over a decade old, the video game has reportedly attained more than 126 million monthly active players and has sold well beyond 200 million copies of the gaming software. Besides the game itself, there is plenty of merchandise to be had and lots of spin-offs […] Read More
Tesla’s goal to release its Level 5 Full Self-Driving (FSD) autopilot capability in 2021 was deemed unrealistic by the CEO of competitor Waymo in a recent interview. Tesla is the only autonomous vehicle manufacturer relying on real-time cameras, rather than pre-mapped Lidar (Light Detection and Ranging), to guide vehicle movement. Tesla also… Read More
Perhaps one of the most well-known facets of robots is the legendary set of three rules proffered by writer Isaac Asimov. The Three Laws first appeared in his 1942 science fiction story “Runaround” and have seemingly been unstoppable in terms of ongoing interest and embrace. Read More
We misclassify a lot of things, all the time. You are waiting in a restaurant for a friend to come and have lunch with you. Your eyes scan the people entering the busy eatery. Assume it is a cold day, raining or snowing, which means that most of those coming into the restaurant are wearing heavy clothes and are generally covered up. It would be quite easy to spot someone who appeared to be your friend, based perhaps on their height and overall shape, yet once they removed their coat and hat, and you could presumably now see the person’s face clearly, you would realize it is not the person you were waiting for. Read More
Are you familiar with the famous twin paradox proffered by Einstein? It’s quite a hoot. The topic focuses on the foundational nature of time and clocks. Einstein brought up the topic while conceiving the theories of relativity, though historians point out that the thought experiment can be traced to a 1911 paper by the scientist Paul Langevin. In any case, let’s jump into the details and put aside the historical origins. Read More
After weeks of confusion and contradictory news, it seems all but confirmed that Hyundai-Kia will start manufacturing Apple’s mysterious self-driving electric vehicle at one of its factories in 2024. Or later. Or not at all.
As with nearly all news coming from Apple, the report is shrouded in secrecy, unconfirmed facts, and lots of quotes from anonymous sources. But this latest news does have some key points that, if true, paint a clearer picture of Apple’s self-driving plans — and leave many more questions unanswered about the future of the company.
Right now, we assume that time can only flow in one direction, namely forward. Mankind has dreamt forever that it would be nifty if time could be reversed. There are a plethora of science fiction tales including books, short stories, movies, TV shows, poetry, and you name it that have sought to explore what could happen and what might be done if time could flow in reverse. Read More
The number of ways to transport illegal drugs seems to be nearly endless. We all have heard about the use of airplanes to smuggle in illicit drugs. There are also tales aplenty about motorboats and sailboats loaded with banned narcotics that try to reach land. Read More
We all seem to know what a red stop button or kill switch does. Whenever you believe that a contraption is going haywire, you merely reach for the red stop button or kill switch and shut the erratic gadgetry down. This urgent knockout can be implemented via a bright red button that is pushed, or by using an actual pull-here switch, or a shutdown knob, a shutoff lever, etc. Read More
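In software, the same kill-switch idea is often implemented as a shared stop flag that a long-running task checks between steps. A minimal Python sketch, with an illustrative worker and timings of my own choosing, could look like this:

```python
import threading
import time

def worker(stop: threading.Event, results: list) -> None:
    """Long-running task that checks the kill switch between work steps."""
    while not stop.is_set():
        results.append("step")   # one unit of work
        time.sleep(0.01)

# Wire the "red button" to the worker: setting the event stops it promptly.
stop = threading.Event()
results: list = []
t = threading.Thread(target=worker, args=(stop, results))
t.start()
time.sleep(0.05)                 # let the worker run for a few steps
stop.set()                       # press the kill switch
t.join(timeout=1)
assert not t.is_alive()
```

The key design point is that the worker must cooperate by polling the flag at safe points; a switch that only one side knows about is not a kill switch at all, which is precisely the worry with erratic AI gadgetry.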