Autonomy Again

UKtramp:

Captain Caveman 76:
Self-driving shuttle bus in crash on first day - BBC News

Would a human driver have avoided the collision? As said above, people are very good at predicting the unpredictable actions of others. Computers aren’t.

Plus, a human driver would have tried to alert the oncoming vehicle, or moved to one side. Automation and people don’t mix well.

This was not the fault of the driverless vehicle: it did what it was supposed to do, which was stop. The lorry didn’t stop and so hit it. If there had been a human driver sat in the driverless vehicle, the outcome would more than likely have been the same. You are assuming a lot in your answer about what a human driver would probably have done. You say automation and people don’t mix well, but once again that is only your own level of understanding. You are not versed enough in technology to make these statements; it is purely your opinion and obviously not your knowledge, as I work in a different category of automation and it works perfectly well. You cannot simply take an incident that was a human error and blame the driverless vehicle. As for your claim that people are good at predicting the unpredictable actions of others and computers aren’t, once again you are totally wrong: the most advanced methods of predicting human behaviour in catastrophes such as building fires are computerised. You are way behind the times in your knowledge of computers and technology.

But humans are good at predicting such behaviour - including violations of rules, or ambiguous situations. They can, often, tell whether a driver is looking at them, and how his road behaviour implies his future actions, and then compensate for it.

Drivers can also use previous experience of particular junctions to understand how other drivers misinterpret the situation. There is a mini-roundabout near me where drivers regularly pull out in front of the person with right of way - I’ve been the victim of it dozens of times over the years - most drivers apologise and I just grimace at them, because I’d already anticipated it.

I’ve also used it as an example for many years, and the kicker is that I’ve done it myself at least twice (pulling out in front of other drivers) after it had become something that I used to quote to others as an example of a junction with an inherent problem which, even to this day, I cannot quite put my finger on.

UKtramp:

the nodding donkey:
But you keep telling us that computers are better at driving than humans. And unless you now believe yourself to be extraterrestrial as well, that includes you.

You can’t have this automation malarkey both ways you know.

I don’t keep saying computers are better at driving than humans. I am saying that the technology is available; I wouldn’t want to sit in a driverless vehicle myself. I have faith in the technology and I believe it will come, I have no doubt about that. Computers can do lots of things better than humans, but humans do better than computers in respect of feelings, compassion and so on.

What people are better at is programming computers, whereas computers have no capability to program themselves or to set the rules for people.

The rules which computers overtly enforce are merely the decisions of the computer programmers about which rules should be applied.

I constantly get the feeling - that is to say, I infer - that you don’t realise how dependent computers are on people. The role of a computer programmer is to define the computer’s behaviour, just as the mechanical engineer defines the behaviour of an engine. Computers are merely complex engines - that is, they employ engine principles, but in complex ways. Everything a computer does can theoretically be expressed as a series of instructions on paper, whereas what humans do is not the implementation of a series of learned instructions (although humans can indeed use that mode of operation for some tasks). The brain and body fundamentally don’t work the same way as a digital computer.

Rjan:
But humans are good at predicting such behaviour - including violations of rules, or ambiguous situations. They can, often, tell whether a driver is looking at them, and how his road behaviour implies his future actions, and then compensate for it.

Drivers can also use previous experience of particular junctions to understand how other drivers misinterpret the situation. There is a mini-roundabout near me where drivers regularly pull out in front of the person with right of way - I’ve been the victim of it dozens of times over the years - most drivers apologise and I just grimace at them, because I’d already anticipated it.

So you are saying on the one hand that drivers are good at predicting bad driving behaviour, but then they keep pulling out on you at a junction? What you mean is that some drivers, but certainly not all, are capable of this. Some drivers quite clearly cannot predict these things, or certain accidents would not happen in the first place. Human error is greater than computer error in this respect. Here is a scenario I can think of, though: what happens in the Mexican standoff on a roundabout with autonomous vehicles, where all vehicles are giving way to the right - who is the first to move? As humans we slowly pull out, or stop, until one vehicle has cleared the roundabout and normal rules resume.
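The roundabout standoff described above is a classic symmetric deadlock, and distributed systems break similar ties with randomised backoff. The sketch below is a toy illustration only, not how any real vehicle resolves it; the vehicle names, delay range and the idea of a shared draw are all invented for the example:

```python
import random

# Toy sketch: four vehicles meet at a mini-roundabout, each giving way to
# the one on its right -- a symmetric deadlock. One common tie-breaker in
# distributed systems is randomised backoff: each vehicle draws a random
# wait, and the shortest draw creeps out first.

def resolve_standoff(n_vehicles=4, seed=None):
    rng = random.Random(seed)
    # Each vehicle independently picks a backoff delay in milliseconds.
    delays = {f"vehicle_{i}": rng.uniform(0, 500) for i in range(n_vehicles)}
    # The vehicle whose timer expires first claims the roundabout; the
    # rest then fall back to the normal give-way-to-the-right rule.
    order = sorted(delays, key=delays.get)
    return order[0], order

first, order = resolve_standoff(seed=42)
print(f"{first} moves first; remaining order: {order[1:]}")
```

The point of the sketch is only that the deadlock has a well-known algorithmic escape; which tie-breaker real autonomous vehicles would actually use is an open question in the thread.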

UKtramp:

Captain Caveman 76:
Self-driving shuttle bus in crash on first day - BBC News

Would a human driver have avoided the collision? As said above, people are very good at predicting the unpredictable actions of others. Computers aren’t.

Plus, a human driver would have tried to alert the oncoming vehicle, or moved to one side. Automation and people don’t mix well.

This was not the fault of the driverless vehicle: it did what it was supposed to do, which was stop. The lorry didn’t stop and so hit it. If there had been a human driver sat in the driverless vehicle, the outcome would more than likely have been the same. You are assuming a lot in your answer about what a human driver would probably have done. You say automation and people don’t mix well, but once again that is only your own level of understanding. You are not versed enough in technology to make these statements; it is purely your opinion and obviously not your knowledge, as I work in a different category of automation and it works perfectly well. You cannot simply take an incident that was a human error and blame the driverless vehicle. As for your claim that people are good at predicting the unpredictable actions of others and computers aren’t, once again you are totally wrong: the most advanced methods of predicting human behaviour in catastrophes such as building fires are computerised. You are way behind the times in your knowledge of computers and technology.

When I started driving, I was always told: it doesn’t matter how good a driver you are, you’ve always got to watch out for the ■■■■■■■. My OPINION on driverless vehicles has always been: it doesn’t matter how good they are, because there are too many ■■■■■■■ on the road. Within two hours of a driverless vehicle joining the road, it’s hit by a ■■■■■■■. Looks like my opinion wasn’t too far off the mark.

Hmm, could a know-nothing scumbag like me be right when all these experts are wrong? Nah, that’ll never happen. :laughing:

It’s only a matter of time. Although I think more should be invested in public transport, because the number of cars combined with the growing number of HGVs is not sustainable.

If all vehicles had to be fully automated to be on the road in, say, 50 years, therefore taking human error out of the equation, would we still have to pay insurance? Fully automated roads are the only way to allow trucks without drivers.

Captain Caveman 76:
When I started driving, I was always told: it doesn’t matter how good a driver you are, you’ve always got to watch out for the [zb]. My OPINION on driverless vehicles has always been: it doesn’t matter how good they are because there are too many [zb] on the road. Within two hours of a driverless vehicle joining the road, it’s hit by a [zb]. Looks like my opinion wasn’t too far off the mark.

Hmm, could a know-nothing scumbag like me be right when all these experts are wrong? Nah, that’ll never happen. :laughing:

Why do you always turn a debate into a slanging match? It seems that you are always on edge if anyone disagrees with your statements. The statements I make are based on my own experiences, but I do not mind when someone disagrees with me. I don’t call myself a know-nothing scumbag or call people names. I like debating with people who can put up an argument or have a differing opinion, as that is interesting. I like to hear other people’s opinions about topics. Whenever I try to debate a subject with you, you turn nasty and go off on one. Can you not accept that other people have differing opinions and that yours is not always the correct one?

adam277:
It’s only a matter of time. Although I think more should be invested in public transport, because the number of cars combined with the growing number of HGVs is not sustainable.

+1 with that statement.

UKtramp:

Captain Caveman 76:
When I started driving, I was always told: it doesn’t matter how good a driver you are, you’ve always got to watch out for the [zb]. My OPINION on driverless vehicles has always been: it doesn’t matter how good they are because there are too many [zb] on the road. Within two hours of a driverless vehicle joining the road, it’s hit by a [zb]. Looks like my opinion wasn’t too far off the mark.

Hmm, could a know-nothing scumbag like me be right when all these experts are wrong? Nah, that’ll never happen. :laughing:

Why do you always turn a debate into a slanging match? It seems that you are always on edge if anyone disagrees with your statements. The statements I make are based on my own experiences, but I do not mind when someone disagrees with me. I don’t call myself a know-nothing scumbag or call people names. I like debating with people who can put up an argument or have a differing opinion, as that is interesting. I like to hear other people’s opinions about topics. Whenever I try to debate a subject with you, you turn nasty and go off on one. Can you not accept that other people have differing opinions and that yours is not always the correct one?

There’s so much irony in that post it has to be deliberate.

Anyway, just focus on the content of my post, not the tone. Then get back to me.

Yep, things are going great for autonomy.

neilg14:

Yep, things are going great for autonomy.

[emoji23] although it was a hooman been that hit the bus! Can an autonomous bus jump out and punch someone?

Sent from my iPhone using Tapatalk

Carryfast:

Freight Dog:

the maoster:
^^^^^ just wot I sed :wink: , a tad longer and far more concise to be sure, but it still boils down to the same thing. :smiley:

Eye you did. Nice one :smiley:

But Dr Damon will never believe it. :smiling_imp: :laughing:

viewtopic.php?f=2&t=150433&hilit=+auto+land#p2383494

Your points are pretty spot on in that link :smiley:

A quick look through the comments does show that this wild notion is incredibly pervasive. A routine flight is a very complex, manually handled situation across its entire length, with some automatics used as tools for various sub-sections. An autopilot can no more manage the whole flight - it still relies on the pilot for take-off and landing - than an endoscope can prep, perform and close up after stomach surgery.

There’s a huge number of events on aircraft that require autopilots to be disengaged, or will cause the autopilot to disengage for safety reasons. It’s quite the opposite of what people think in that link you provided.

Automatics are an extremely valuable tool. But they are just that, a tool to be used appropriately and they have very definite limits.

UKtramp:

Bluey Circles:
here we go, just happened, autonomous bus let loose onto the streets of Las Vegas - within two hours a reefer reverses into it
youtube.com/watch?v=u7pV4vxD1bs

So the shuttle stopped as it should have done, and the truck backed into it. This happens daily; no news here, and certainly not a failure of the autonomous vehicle, which is what they claimed in the report. There was also a driver on board the autonomous vehicle, so how is it at fault?

If a cyclist had been coming along that road and had stopped and waited in that position while the artic was reversing, we would declare him utterly brainless. That truck had nearly completed its reverse, so it must have been across the road for several minutes; it must have been obvious to any other road user where the unit would be going, but the autonomous bus could not work it out.

UKtramp:

Rjan:
But humans are good at predicting such behaviour - including violations of rules, or ambiguous situations. They can, often, tell whether a driver is looking at them, and how his road behaviour implies his future actions, and then compensate for it.

Drivers can also use previous experience of particular junctions to understand how other drivers misinterpret the situation. There is a mini-roundabout near me where drivers regularly pull out in front of the person with right of way - I’ve been the victim of it dozens of times over the years - most drivers apologise and I just grimace at them, because I’d already anticipated it.

So you are saying on the one hand that drivers are good at predicting bad driving behaviour, but then they keep pulling out on you at a junction?

Yes, they are pulling out (due to deficits in the design of the junction) in circumstances where I anticipate that they will do so.

What you mean is that some drivers, but certainly not all, are capable of this. Some drivers quite clearly cannot predict these things, or certain accidents would not happen in the first place. Human error is greater than computer error in this respect. Here is a scenario I can think of, though: what happens in the Mexican standoff on a roundabout with autonomous vehicles, where all vehicles are giving way to the right - who is the first to move? As humans we slowly pull out, or stop, until one vehicle has cleared the roundabout and normal rules resume.

Human error is really not as bad as you think it is, bearing in mind the extent to which the environment is uncontrolled, and the tight margins of error on which humans generally choose to operate (in order to capture various efficiencies). Computers are only ever seen to outdo humans, because in situations where they don’t even begin to outdo humans (particularly on factors like ensuring human safety), they are rarely employed.

The current crop of experimental autonomous cars are a prime example of computers that don’t outdo humans - they are being run as research projects, and their inferior systems will be redesigned and rewritten many times in an attempt to improve on the low standard they will initially set. What is being researched is whether, eventually, analysts can model the problem sufficiently well that a computer can be programmed to cope with driving on existing infrastructure most of the time. If that threshold is reached, billions of pounds will be spent bulldozing and remaking those parts of the infrastructure that drivers can now handle adequately but for which it is inefficient or intractable to model a computerised response.

A good example is emergency service response - it will almost certainly prove impossible to model acceptable situations in which the car can mount a pavement to allow a vehicle to pass, so a different approach will be taken such as setting traffic lights to empty the desired route ahead of the emergency response vehicle, or taking remote control of autonomous vehicles to force them into surrounding side roads and alternative routes. In the initial stages, it will just be left to the driver to take back control of the car and mount the pavement as they see fit.

The greatest asset that any driver has is anticipation. If you can anticipate what will happen next you are half way to avoiding a collision. Anticipation comes with experience. Computers can’t do it.

onesock:
The greatest asset that any driver has is anticipation. If you can anticipate what will happen next you are half way to avoiding a collision. Anticipation comes with experience. Computers can’t do it.

Anticipation is just a complex algorithm; computers excel at algorithms.
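One narrow slice of "anticipation" really is straightforwardly algorithmic: extrapolating another vehicle's motion and estimating time to collision. The sketch below is a toy under strong assumptions (constant speeds, a single lane, invented numbers and threshold); real autonomous stacks use far richer predictive models:

```python
def time_to_collision(gap_m, own_speed_ms, other_speed_ms):
    """Seconds until the gap closes, assuming both speeds stay constant.

    gap_m: current separation in metres.
    Returns None if the gap is not closing.
    """
    closing_speed = own_speed_ms - other_speed_ms
    if closing_speed <= 0:
        return None  # not closing, so no predicted collision
    return gap_m / closing_speed

def should_brake(gap_m, own_speed_ms, other_speed_ms, threshold_s=3.0):
    # Brake if a collision is predicted sooner than the (invented) threshold.
    ttc = time_to_collision(gap_m, own_speed_ms, other_speed_ms)
    return ttc is not None and ttc < threshold_s

# Following a vehicle 40 m ahead, doing 25 m/s against its 15 m/s:
# the gap closes at 10 m/s, so TTC is 4 s -- above a 3 s threshold.
print(should_brake(40, 25, 15))   # prints False
```

Whether the full breadth of human anticipation reduces to rules like this is exactly what the rest of the thread disputes; the sketch only shows the simplest mechanical end of it.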

UKtramp:

onesock:
The greatest asset that any driver has is anticipation. If you can anticipate what will happen next you are half way to avoiding a collision. Anticipation comes with experience. Computers can’t do it.

Anticipation is just a complex algorithm; computers excel at algorithms.

Oh the naivety!

I could make all sorts of observations, but I will make this one: computers excel at executing algorithms for which we have already devised an expression on paper. But even if “driver anticipation” is expressible algorithmically, actually expressing that algorithm is well beyond the current state of the art. Humans find it much easier to acquire and apply this “algorithm” through life experience than to state it as a pencil-and-paper process, and for as long as the algorithm cannot be expressed in that way, it cannot be computerised.

Rjan:
Oh the naivety!

I could make all sorts of observations, but I will make this one: computers excel at executing algorithms for which we have already devised an expression on paper. But even if “driver anticipation” is expressible algorithmically, actually expressing that algorithm is well beyond the current state of the art. Humans find it much easier to acquire and apply this “algorithm” through life experience than to state it as a pencil-and-paper process, and for as long as the algorithm cannot be expressed in that way, it cannot be computerised.

Rjan, you are missing the point that I am making, as it is difficult to get across in writing alone. I am not suggesting that a computer can think for itself; I have a degree in computer science and I understand this concept well. I am suggesting that once a computer has the algorithm, it will execute the algorithm flawlessly time and time again. The problem with a human brain, in comparison, is that the human becomes complacent with situations where the computer will not. On the one hand an intelligent brain is superior to the computer in this sense, but someone with a lower intelligence level will become dangerously complacent and make errors of judgement.

A really simplified example to get my point across is this. A stop sign and double white lines mean you have to stop, regardless of whether or not it is deemed safe to proceed. A computer will do this every single time; it will stop, because that is the algorithm it has. A human will slow down and stop at first, but after travelling this route for some time you become complacent: you slow down without stopping, and eventually you perhaps don’t even slow down to exit the junction, because your brain has learnt from previous experience that it is safe to carry on. Once this has happened on several occasions, the chance of colliding with another vehicle increases, because you have not stopped and have not had time to look or react.

A simple example, I know, but hopefully it gets my point across that a computer will execute the algorithm over and over without fail, where a human will not. Software gets altered as bugs are found and is reprogrammed to perfect it, in a similar fashion to a driver learning. Yes, a programmer has to program the computer to begin with; the point is that the computer will execute that program regardless. I am not suggesting the computer is a brain on its own - I fully understand it doesn’t program itself.
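The stop-sign complacency argument above can be put in a few lines of code. This is purely a toy model: the coded rule fires identically on every pass, while the human's stop probability is given an invented decay with familiarity (the rate and floor are made up for illustration, not drawn from any study):

```python
def computer_stops(pass_number):
    # The coded rule fires identically on every pass, forever.
    return True

def human_stop_probability(pass_number, decay=0.02):
    # Hypothetical model of complacency: probability of a full stop
    # starts at 1.0 and erodes with familiarity, floored at 50%.
    return max(0.5, 1.0 - decay * pass_number)

for n in (0, 10, 25):
    print(n, computer_stops(n), round(human_stop_probability(n), 2))
```

Running it shows the asymmetry being argued: the rule-follower's behaviour is flat at `True`, while the modelled human drifts from 1.0 towards the floor as the junction becomes routine.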

What many seem to be missing here is that we are on the threshold of A.I.

The new generations of machines will not need increasingly complex algorithms.

Look at autopilots.
There will not be autopilots with very clever and complex (but still limited) programmes.
The machines (both on board and in the cloud) will intercommunicate and learn from each other. A future autopilot will learn as it flies and observes pilots flying in thousands of situations. It will have info from every landing made. It will have far more experience than any pilot can have. It will learn from successes and failures. We’re talking about A.I. here, NOT pre-programmed computers.
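The simplest possible form of "info from every landing made" is an incremental statistic that never needs to store the past landings. The sketch below shows a running-mean update; the class name, the flare-height parameter and the numbers are all invented for illustration, and real fleet learning would aggregate far richer data than a single mean:

```python
class FleetEstimate:
    """Online mean of one observed landing parameter across a fleet."""

    def __init__(self):
        self.n = 0
        self.mean = 0.0

    def update(self, observation):
        # Incremental mean update: each new landing nudges the estimate
        # without needing to keep any previous observation in memory.
        self.n += 1
        self.mean += (observation - self.mean) / self.n
        return self.mean

est = FleetEstimate()
for flare_height_ft in (30.0, 28.0, 32.0, 30.0):  # made-up observations
    est.update(flare_height_ft)
print(est.n, est.mean)   # 4 landings, mean 30.0
```

The design point is that the update is cheap and order-independent, which is why fleet-wide aggregation of this kind scales to "every landing made".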

Drivers do anticipate what other drivers do. True. When machines are ubiquitous we won’t need anticipation. They’ll all know where each other is going. Knowing a vehicle is intending to move into an entrance means not stopping in front of it.
No need for traffic lights: vehicles can approach a junction and all adjust speed to flow and merge. No egos getting in the way here!
Knowing a vehicle is about to enter a narrow road from the east, a vehicle in the west would wait. No need for line of sight - all will have GPS, of course.
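The "no traffic lights" idea above is usually framed as slot reservation: each vehicle announces its intended arrival time at the junction and is granted the earliest non-conflicting crossing slot. The sketch below is a deliberately centralised toy (the slot length, vehicle names and the single booking list are invented simplifications; real proposals are distributed):

```python
CROSSING_TIME_S = 2.0  # invented: time a vehicle occupies the junction

def reserve(requested_arrivals):
    """Greedily assign non-overlapping crossing slots, earliest first.

    requested_arrivals: {vehicle_name: requested arrival time in seconds}
    Returns {vehicle_name: granted start time}.
    """
    booked = []    # (start, end) slots already granted
    schedule = {}
    for vehicle, t in sorted(requested_arrivals.items(), key=lambda kv: kv[1]):
        start = t
        # Push the slot back until it clears every existing booking.
        for s, e in sorted(booked):
            if start < e and start + CROSSING_TIME_S > s:
                start = e
        booked.append((start, start + CROSSING_TIME_S))
        schedule[vehicle] = start
    return schedule

print(reserve({"north": 0.0, "east": 1.0, "south": 5.0}))
```

With those requests, "east" is pushed back behind "north" while "south" keeps its requested time, which is the speed-adjust-and-merge behaviour the post describes, minus the egos.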

Not tomorrow, but many of us will see it in our life times.

Isn’t this what Stephen Hawking and others are talking about? Isn’t this where we’re going? Not just simple machines designed to follow yellow lines painted on roads?

i reckon the big problem will be due to the fact that the uk, eire and a cpl of other countries all drive on the opposite side of the road to everywhere else.
why dont they just make all the trucks and buses drive on the left, then after about 5 years if alls well, make all the cars drive on the left as well.
that might work?

dieseldog999:
i reckon the big problem will be due to the fact that the uk, eire and a cpl of other countries all drive on the opposite side of the road to everywhere else.
why dont they just make all the trucks and buses drive on the left, then after about 5 years if alls well, make all the cars drive on the left as well.
that might work?

Yep, don’t wanna put all our eggs in one basket do we? That’d be silly. Best bring new changes in gradually. Always struck me that boats can be dangerous if they sink at sea. If these big boats stuck much closer to the coast there wouldn’t be so far for people to swim ashore?