Ford's new driverless cars replicate human driving
Ford is gradually moving to the forefront of driverless-vehicle development, especially on the hardware side. Unlike Google and other IT companies that are experimenting with self-driving cars, Ford can afford deep, low-level integration of the technology into its vehicles, so these models may well reach the market in roughly the same form in which they are being tested now.
The company has now introduced the second generation of its driverless Fusion with a hybrid powertrain. The suspension is the same as on a standard Fusion Hybrid, but in terms of computing hardware these cars outclass everything else: the equipment fills the entire trunk.
The first driverless Ford Fusion Hybrid
The first driverless version of the Ford Fusion Hybrid saw the light three years ago, in December 2013. It was a rather unusual machine, with the four lidars sticking up like "horns" on the roof.
The company's debut in driverless-car technology came with technical assistance from experts at the University of Michigan and with the participation of the insurance company State Farm Insurance. It is curious that insurers immediately joined an initiative that could eventually bury their third-party liability insurance business (apparently they themselves don't think so). As for the University of Michigan, Ford engineers had already built driverless vehicles with it back in 2004-2007, when the company took part in the driverless-vehicle races organized by DARPA.
The first model was experimental. Besides its long-term tasks, it also served short-term goals: developing handy driver-assistance features. These can be rolled into a commercial model alongside existing technologies such as adaptive cruise control, automatic parking, blind-spot monitoring (Blind Spot Information System in Ford cars), lane-departure warnings, collision warnings and brake assist. All of this is already implemented in production models.
For the experimental model, special autopilot software was written and an on-board computer installed; it scans the surroundings with four lidars (scanning radius 60 m). Supplementing this with information from the other sensors, the University of Michigan software dynamically builds a 3D model of the surrounding area.
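As a rough illustration of what such mapping software does, here is a minimal sketch (hypothetical code, not Ford's or the University of Michigan's) that converts raw lidar range-and-bearing returns into Cartesian points and accumulates them in a voxel grid, which is essentially a coarse 3D model of the surroundings:

```python
import math
from collections import defaultdict

VOXEL_SIZE = 0.5  # metres; resolution of the coarse 3D model (assumed value)

def lidar_return_to_point(rng, azimuth, elevation, sensor_pose):
    """Convert one lidar return (range in metres, angles in radians)
    into a point in the vehicle frame."""
    x0, y0, z0 = sensor_pose
    x = x0 + rng * math.cos(elevation) * math.cos(azimuth)
    y = y0 + rng * math.cos(elevation) * math.sin(azimuth)
    z = z0 + rng * math.sin(elevation)
    return (x, y, z)

def accumulate(voxels, points):
    """Add points to a voxel grid: the more hits a voxel collects,
    the more confident we are that something solid occupies it."""
    for x, y, z in points:
        key = (int(x // VOXEL_SIZE), int(y // VOXEL_SIZE), int(z // VOXEL_SIZE))
        voxels[key] += 1
    return voxels

# Fuse returns from several (hypothetical) lidars into one shared grid.
grid = defaultdict(int)
scans = {(0.0, 0.0, 1.5): [(12.0, 0.10, 0.00), (58.0, 1.20, -0.05)]}
for pose, returns in scans.items():
    pts = [lidar_return_to_point(r, az, el, pose) for r, az, el in returns]
    accumulate(grid, pts)
print(len(grid), "occupied voxels")
```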
A new generation
In the new generation of the driverless Ford Fusion Hybrid, the engineers dropped two of the four lidars, but the remaining ones have much better specifications, including a wider field of view. They also have a cleverer design, so they no longer stick out of the roof like horns. The two puck-shaped 360-degree lidars have a longer range than the previous model: about 200 m in every direction. Two arcs with three cameras are also mounted on the roof, plus another camera below the windshield. Additional short- and long-range sensors keep computer vision working in rain, fog and snowfall.
The developers have also significantly updated the on-board computer and software. The new equipment and sensors generate 1 terabyte of information per hour, which has to be processed quickly. Such a computer system also consumes a lot of energy, which is why autonomous cars will not run on petrol alone: at the very least they need a hybrid powertrain that can supply the computer with a stable flow of electricity.
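For scale, that 1 terabyte per hour corresponds to a sustained rate of roughly 278 MB/s; a quick back-of-the-envelope check:

```python
# 1 terabyte of sensor data per hour, expressed as a sustained rate.
TB = 10**12                         # bytes, decimal convention
bytes_per_second = 1 * TB / 3600
print(f"{bytes_per_second / 10**6:.0f} MB/s")        # ~278 MB/s
print(f"{bytes_per_second * 8 / 10**9:.2f} Gbit/s")  # ~2.22 Gbit/s
```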
Chris Brewer, chief program engineer for Ford's autonomous vehicle development, writes that the electrical controls of the robotic car in their current form are already close to production-ready. According to him, the car combines two technologies: an autonomous vehicle platform and a virtual driver system. The new Ford model is intended for further development and testing, primarily of the virtual driver system, which requires processing a large amount of sensor data and substantial computing resources.
Ford believes its vehicles will meet level 4 of driving automation under the SAE standard (international standard J3016). At this level the system handles the dynamic driving task on its own, reacting to the traffic situation even if the human driver does not respond to a request to take control. The goal is to copy everything a human driver does behind the wheel; that is, the autopilot tries to intelligently replicate human driving, and this is what the "virtual driver" (virtual driver system) means.
The virtual driver includes sensors (lidar, cameras and radar), localization and path-planning algorithms, computer vision and machine learning systems, highly detailed 3D maps and a high-performance computer.
The autopilot uses data from two sources: 1) pre-prepared 3D maps; 2) dynamic information from the sensors, used to check the map and to detect objects that are not on it, such as pedestrians, cyclists and other cars. The computer vision system is even able to recognize the signals of a traffic controller.
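To picture the map check, the idea is that anything the sensors detect which has no counterpart in the prebuilt map is treated as a dynamic object. A minimal sketch, with hypothetical names and a made-up matching threshold:

```python
import math

MATCH_RADIUS = 1.0  # metres; detections this close to a mapped feature count as static (assumed)

def unmapped_objects(detections, map_features):
    """Return detections with no nearby counterpart in the prebuilt 3D map.
    Both arguments are lists of (x, y) positions in the same map frame."""
    return [
        (dx, dy)
        for dx, dy in detections
        if not any(math.hypot(dx - mx, dy - my) < MATCH_RADIUS for mx, my in map_features)
    ]

# A pedestrian at (5, 2) is absent from the map, so it is flagged as dynamic.
print(unmapped_objects([(5.0, 2.0), (10.0, 0.0)], [(10.0, 0.1)]))  # [(5.0, 2.0)]
```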
The in-vehicle computer performs three tasks: processing new information from the sensors, decision making, and controlling the vehicle.
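This division of labour is the classic sense-decide-act loop. A minimal skeleton of such a loop (a hypothetical structure for illustration, not Ford's software):

```python
import time

def sense(sensors):
    """Task 1: collect and process fresh data from lidar, cameras and radar."""
    return {name: read() for name, read in sensors.items()}

def decide(world_state):
    """Task 2: choose the next manoeuvre from the current picture of the world."""
    return "brake" if world_state.get("obstacle_ahead") else "keep_lane"

def act(command):
    """Task 3: turn the decision into steering, throttle and brake commands."""
    print("executing:", command)

# The real loop runs many times per second on the in-vehicle computer.
sensors = {"obstacle_ahead": lambda: False}
for _ in range(3):
    act(decide(sense(sensors)))
    time.sleep(0.1)
```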
In theory, combining information from these two sources will let the autopilot handle driving as well as, or even better than, a human, says chief engineer Chris Brewer.
Along with the release of the new model, Ford is expanding its fleet of driverless vehicles undergoing testing on public roads in Arizona, Michigan and California from 30 to 90 cars.
Ford is not abandoning its promise to release a fully autonomous vehicle for taxi service by 2021. The Ford engineer added that this car will ultimately be produced without a steering wheel or pedals.
So that everyone will like the driverless taxi, specialists are now also working on tricky situations: what the car should do if a passenger leaves a bag on the seat, or fails to close the door, or something gets stuck in it, and so on. "The future is coming. And we can't wait," says Chris Brewer.
Source: geektimes.ru/post/284192/