
(CNBC)   Jack Dorsey warns that AI will replace software engineers, unclear on whether this will improve coding   (cnbc.com)
723 clicks; posted to Business and Geek on 24 May 2020 at 12:35 PM




 
2020-05-24 9:33:44 AM  
Software won't be relevant any more?  That work is getting farmed out to the low wage countries, Jack.  And you know it.
 
2020-05-24 10:18:26 AM  
I remember reading that same argument in an issue of Compute! in the early 1980s.  Pretty soon computers would write their own programs with program-writing programs, and programmers will be obsolete.  Just as soon as Apple declares bankruptcy and everyone switches to Dvorak keyboards.

I think this ignores how shallow AI really is.  AI structures like CNNs and deep learning systems are very low-level feature-processing systems, and they have to be fit to solve a larger problem like a gear has to be fit into a clock.  They do not start working on their own, and they do not solve general problems.

Most notably, they are terrible at exactly the one thing programmers have to do:  guarantee correctness for all corner cases that do not appear in the typical inputs and outputs.
 
2020-05-24 10:25:28 AM  

Xcott: I remember reading that same argument in an issue of Compute! in the early 1980s. Pretty soon computers would write their own programs with program-writing programs, and programmers will be obsolete


My COBOL professor told me in 1980 that programming would simply die out.  Machines would do it all.

It doesn't look that way to me.
 
2020-05-24 10:25:43 AM  

Xcott: I remember reading that same argument in an issue of Compute! in the early 1980s.  Pretty soon computers would write their own programs with program-writing programs, and programmers will be obsolete.  Just as soon as Apple declares bankruptcy and everyone switches to Dvorak keyboards.

I think this ignores how shallow AI really is.  AI structures like CNNs and deep learning systems are very low-level feature-processing systems, and they have to be fit to solve a larger problem like a gear has to be fit into a clock.  They do not start working on their own, and they do not solve general problems.

Most notably, they are terrible at exactly the one thing programmers have to do:  guarantee correctness for all corner cases that do not appear in the typical inputs and outputs.


AI is a lie. Expert system or machine learning is more appropriate.

AI is the term you use when you want media attention.
 
2020-05-24 10:29:30 AM  
Software is stupid, and so is Jack Dorsey.

I have been a programmer for over 40 years.  I've heard about Fifth Generation Languages, intelligent compilers, learning systems, and all kinds of 'replace you' technologies for most of that time.  How much has come to fruition?  Very little.

Computers do *exactly* what they're told to do, and *nothing* else.  They do not see side-effects that could turn into beneficial discoveries -- they see errors and data that must be discarded or repaired.

Jackie boy is talking about automatic code generators displacing entry level workers.  If the CEO of a high-tech startup can't see his way past that, he should go ahead and retire now.  Automatic code generators will produce high-quality, well-documented code that does exactly what the specs said.  Know where that leaves the average programmer?  Happy.

I spend the vast majority of my time finding simple errors in deep code.  An inverted IF statement, for example, or a + that should have been a -.  Sometimes it's wrong in the spec, but most of the time it's a simple misunderstanding or just a plain old brain fart.  Irrespective of the cause, these things take a great deal of time to find, and I'm not cheap.
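To make that concrete, here is a contrived sketch of the one-character bugs in question; the function names are invented for illustration:

```python
def apply_discount(price: float, discount: float) -> float:
    """Return price after a fractional discount, e.g. 0.15 for 15% off."""
    # The classic one-character bug: '+' where '-' was meant, so every
    # "discounted" price silently goes UP instead of down.
    # return price * (1 + discount)      # buggy version
    return price * (1 - discount)        # intended version

def is_adult(age: int) -> bool:
    # The inverted-IF variant: a stray 'not' flips the whole condition.
    # return not (age >= 18)             # buggy version
    return age >= 18                     # intended version

assert apply_discount(100.0, 0.15) == 85.0
assert is_adult(21) and not is_adult(12)
```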

Automatic code generators will, in all probability, significantly reduce the amount of time I and others like me spend doing this sort of debugging.  Whatever will I do with all my free time?  Might I spend it doing things the computer can't do -- things that require imagination and the ability to think abstractly?  The code is not the Program any more than the nails and wood are the home.

It means the OS and language wars are finally over.  It means that colleges and universities can stop teaching useless things like language and OS syntax, and focus on logic and algorithmic development.  It is the evolution of the programmer from code monkey to Engineer, and it is time.

If Jack Dorsey can't see this, he doesn't deserve his position.  CEOs, especially tech CEOs, must have vision, and statements such as these demonstrate his vision is severely limited.
 
2020-05-24 10:37:08 AM  

unixpro: I spend the vast majority of my time finding simple errors in deep code


Programming has evolved into a two step process: choosing and then assembling the vast array of tools at our disposal, and then figuring out why they don't work.

The second part is always harder.
 
2020-05-24 10:38:16 AM  

Gubbo: AI is a lie. Expert system or machine learning is more appropriate.


And a lot of their impressive results really come from the giant dataset rather than from algorithmic magic.

When IBM's Watson won Jeopardy, I noticed that most of the Jeopardy questions could be solved by typing them into Google and parsing through the first few hits.  You could make a decent "Watson" just by taking a snapshot of the Web with Google's indexing of it.  That isn't "artificial intelligence" but rather "big data."

Sometimes there's barely any distinction between AI and big data.  In face identification, for example, we grab some very obvious and shallow face features, and compare them to feature vectors in a database.  In some cases (like with local binary pattern histograms) there is no AI-like "training" at all:  you just turn the images into histograms and dump the histograms into a big directory and choose the best match.
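A minimal sketch of that LBP-histogram approach, assuming 8-bit grayscale numpy images at least 3x3 in size; real systems pool histograms over a grid of cells rather than the whole face:

```python
import numpy as np

def lbp_histogram(img: np.ndarray) -> np.ndarray:
    """8-neighbor local binary pattern codes, pooled into a 256-bin histogram."""
    c = img[1:-1, 1:-1]                       # center pixels
    code = np.zeros_like(c, dtype=np.uint8)
    # Offsets of the 8 neighbors, clockwise from top-left; each comparison
    # against the center contributes one bit of the pattern code.
    offsets = [(-1, -1), (-1, 0), (-1, 1), (0, 1),
               (1, 1), (1, 0), (1, -1), (0, -1)]
    for bit, (dy, dx) in enumerate(offsets):
        nb = img[1 + dy:img.shape[0] - 1 + dy, 1 + dx:img.shape[1] - 1 + dx]
        code |= (nb >= c).astype(np.uint8) << bit
    hist = np.bincount(code.ravel(), minlength=256).astype(float)
    return hist / hist.sum()                  # normalized histogram

def best_match(probe: np.ndarray, gallery: dict) -> str:
    """Nearest neighbor by L1 distance between histograms: no training step."""
    h = lbp_histogram(probe)
    return min(gallery, key=lambda name: np.abs(gallery[name] - h).sum())
```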
 
2020-05-24 10:54:15 AM  

unixpro: I've heard about Fifth Generation Languages, intelligent compilers, learning systems, and all kinds of 'replace you' technologies for most of that time.  How much has come to fruition?  Very little.


Username checks out.  I point out to the students today that when I learned how to use a *nix shell, the students hadn't been born yet, and when those shell tools were first created I hadn't been born yet, and yet here we are still using this shell.  Then I show them job postings that list bash as one of the must-have skills.

However, I disagree with this sentiment:

It means the OS and language wars are finally over.  It means that colleges and universities can stop teaching useless things like language and OS syntax, and focus on logic and algorithmic development.  It is the evolution of the programmer from code monkey to Engineer, and it is time.

In my opinion, programming is a lot like creative writing:  people suck at it unless they spend a lot of time writing as well as debugging.  We will never separate the logic and problem solving aspect of CS from the actual writing of code, any more than we can separate rehearsal from music performance.

Students do have to understand the tools and processes that are used to manage very large projects, and why and how that scaffolding helps, but that sits atop reading and writing that in turn requires extensive practice.

On a smaller scale, students need that practice to get through college anyway.  They want to work on independent studies and research projects, or they have to finish OS or capstone projects, and a lot of those boil down to "I need someone to write code to do X."  I frequently advise teams of CS and computer engineering students, and the CS kids focus on algorithms and theory while the CE kids only take programming and data structures in C.  The CE kids produce working software, and the CS kids mostly tell me that this is trivial and have nothing at the end of the semester.
 
2020-05-24 12:14:02 PM  

Gubbo: Xcott: I remember reading that same argument in an issue of Compute! in the early 1980s.  Pretty soon computers would write their own programs with program-writing programs, and programmers will be obsolete.  Just as soon as Apple declares bankruptcy and everyone switches to Dvorak keyboards.

I think this ignores how shallow AI really is.  AI structures like CNNs and deep learning systems are very low-level feature-processing systems, and they have to be fit to solve a larger problem like a gear has to be fit into a clock.  They do not start working on their own, and they do not solve general problems.

Most notably, they are terrible at exactly the one thing programmers have to do:  guarantee correctness for all corner cases that do not appear in the typical inputs and outputs.

AI is a lie. Expert system or machine learning is more appropriate.

AI is the term you use when you want media attention.


And digital machines will never be synthetically conscious. Nothing in nature is digital. Math can't deal with infinity; it is the descriptive language of the universe but is not actually the universe. Oxford quantum physics professor Andrew Steane wrote in his paper about quantum information systems at https://arxiv.org/pdf/quant-ph/9708022.pdf :

"The new version of the Church-Turing thesis (now called the 'Church-Turing Principle') does not refer to Turing machines. This is important because there are fundamental differences between the very nature of the Turing machine and the principles of quantum mechanics. One is described in terms of operations on classical bits, the other in terms of evolution of quantum states. Hence there is the possibility that the universal Turing machine, and hence all classical computers, might not be able to simulate some of the behavior to be found in Nature. Conversely, it may be physically possible (i.e. not ruled out by the laws of Nature) to realize a new type of computation essentially different from that of classical computer science. This is the central aim of quantum computing."
 
2020-05-24 12:17:35 PM  
Will the AI also be equally condescending when you ask a simple question on a support forum?
 
2020-05-24 12:17:41 PM  

Xcott: unixpro: I've heard about Fifth Generation Languages, intelligent compilers, learning systems, and all kinds of 'replace you' technologies for most of that time.  How much has come to fruition?  Very little.

Username checks out.  I point out to the students today that when I learned how to use a *nix shell, the students hadn't been born yet, and when those shell tools were first created I hadn't been born yet, and yet here we are still using this shell.  Then I show them job postings that list bash as one of the must-have skills.

However, I disagree with this sentiment:

It means the OS and language wars are finally over.  It means that colleges and universities can stop teaching useless things like language and OS syntax, and focus on logic and algorithmic development.  It is the evolution of the programmer from code monkey to Engineer, and it is time.

In my opinion, programming is a lot like creative writing:  people suck at it unless they spend a lot of time writing as well as debugging.  We will never separate the logic and problem solving aspect of CS from the actual writing of code, any more than we can separate rehearsal from music performance.


Fair enough, but (1) I'm talking about an ideal, not the actual, and (2) we don't require music students to know how to build a piano, or even how to tune one.  As someone who has hired and worked with lots of entry-level and college intern developers, I know that the schools are not teaching useful things like threading and interprocess communication.  I need them to understand what these are and how to use them to solve their problems, but I do not need them to know the seven layers of the OSI model to do that.

Every major language supports sockets and threading in one way or another, but the details of the language are unimportant.  Heck, you won't even be using the same language 5 years from now.  Why focus on the syntax rather than the abstract?  Teach them what threads are for from an architectural point of view.  Teach them how to construct a good messaging protocol, not how to construct a socket().
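For example, in the sketch below the transferable ideas are the length-prefixed framing protocol and the thread-per-client architecture; the socket calls themselves are the incidental syntax. A sketch, not production code:

```python
import socket
import struct
import threading

def send_msg(sock: socket.socket, payload: bytes) -> None:
    # Framing protocol: 4-byte big-endian length, then the payload.
    # This is the architectural idea; it looks the same in any language.
    sock.sendall(struct.pack(">I", len(payload)) + payload)

def recv_exactly(sock: socket.socket, n: int) -> bytes:
    buf = b""
    while len(buf) < n:
        chunk = sock.recv(n - len(buf))
        if not chunk:
            raise ConnectionError("peer closed mid-message")
        buf += chunk
    return buf

def recv_msg(sock: socket.socket) -> bytes:
    (length,) = struct.unpack(">I", recv_exactly(sock, 4))
    return recv_exactly(sock, length)

def serve(listener: socket.socket) -> None:
    # Thread-per-client: an architectural decision, not a syntax detail.
    while True:
        conn, _addr = listener.accept()
        threading.Thread(target=echo_client, args=(conn,), daemon=True).start()

def echo_client(conn: socket.socket) -> None:
    with conn:
        while True:
            try:
                send_msg(conn, recv_msg(conn))   # echo each framed message back
            except ConnectionError:
                break
```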

There will always be a place for the language specialist, just like there is for the firmware developer, but the majority of us would benefit by relegating the minutia to automation.

My first programs were on punch cards, in FORTRAN IV with WATFOR/WATFIV.  Big red and white book with problems most modern CS students would recognize, starting with "Hello, World."  That's the kind of stuff that a code generator can do nicely.  These are nails, or musical notes.  We're teaching how to make a nail or string a piano wire when what we need to be discussing is how to use the nail to build the house, or the note in the symphony.

Your CS kids are right, but they're also wrong.  They're right that it's trivial, but at the moment they have no alternative so it must be done.  Reality is what reality is, and until we get good code generators everyone must learn how to make nails.
 
2020-05-24 12:26:48 PM  

I_Am_Weasel: Will the AI also be equally condescending when you ask a simple question on a support forum?


yes bc AI will also be a sad, socially isolated endomorph.
 
2020-05-24 12:47:44 PM  
An A.I. might be able to write the code for the behind-the-scenes stuff that makes a piece of software function, but it would be useless where real creativity is required.  Also, I suspect that more often than not A.I.-written code might be functional but would leave many places for refinement and improvement.  Also: bugs.
 
2020-05-24 12:51:11 PM  
Back when everyone in the US started sheltering in place, their shopping patterns changed dramatically.  That change jacked up the AI algorithms for Amazon and many others.

Society would be better off if we could put Dorsey, Musk, Bezos, and Zuckerberg into a rocket and launch it into the sun.
 
2020-05-24 1:03:43 PM  

unixpro: Software is stupid, and so is Jack Dorsey.

I have been a programmer for over 40 years.  I've heard about Fifth Generation Languages, intelligent compilers, learning systems, and all kinds of 'replace you' technologies for most of that time.  How much has come to fruition?  Very little.

Computers do *exactly* what they're told to do, and *nothing* else.  They do not see side-effects that could turn into beneficial discoveries -- they see errors and data that must be discarded or repaired.

Jackie boy is talking about automatic code generators displacing entry level workers.  If the CEO of a high-tech startup can't see his way past that, he should go ahead and retire now.  Automatic code generators will produce high-quality, well-documented code that does exactly what the specs said.  Know where that leaves the average programmer?  Happy.

I spend the vast majority of my time finding simple errors in deep code.  An inverted IF statement, for example, or a + that should have been a -.  Sometimes it's wrong in the spec, but most of the time it's a simple misunderstanding or just a plain old brain fart.  Irrespective of the cause, these things take a great deal of time to find, and I'm not cheap.

Automatic code generators will, in all probability, significantly reduce the amount of time I and others like me spend doing this sort of debugging.  Whatever will I do with all my free time?  Might I spend it doing things the computer can't do -- things that require imagination and the ability to think abstractly?  The code is not the Program any more than the nails and wood are the home.

It means the OS and language wars are finally over.  It means that colleges and universities can stop teaching useless things like language and OS syntax, and focus on logic and algorithmic development.  It is the evolution of the programmer from code monkey to Engineer, and it is time.

If Jack Dorsey can't see this, he doesn't deserve his position.  CEOs, especially tech CEOs, must have vision, and statements such as these demonstrate his vision is severely limited.


But none of that grabs headlines & publicity, or makes luddites raise their fists & share on Facebook, so it is of no interest as a click-generating article.

/Folks round there don't take too kindly to "logic" and "reason"
 
2020-05-24 1:21:10 PM  

unixpro: Software is stupid, and so is Jack Dorsey.

...Computers do *exactly* what they're told to do, and *nothing* else.  They do not see side-effects that could turn into beneficial discoveries -- they see errors and data that must be discarded or repaired

...

Years ago when I had begun assembling my own desktops and troubleshooting them, I remember reading something similar on a troubleshooting site. But it went a little further and to paraphrase, "computers only do what the programmers have told them to do. Troubleshooting is easier when you realize that you have to find out how the computer was told to do something by the programmer".

At least, that works for me because when a computer does something stupid, I realize it is the coders who wrote something that (inadvertently) caused the issue. I wish I could find the article which explains it much better than I have.
 
2020-05-24 1:42:26 PM  
200ms sprints?
 
2020-05-24 1:52:52 PM  

skinink: unixpro: Software is stupid, and so is Jack Dorsey.

...Computers do *exactly* what they're told to do, and *nothing* else.  They do not see side-effects that could turn into beneficial discoveries -- they see errors and data that must be discarded or repaired...

Years ago when I had begun assembling my own desktops and troubleshooting them, I remember reading something similar on a troubleshooting site. But it went a little further and to paraphrase, "computers only do what the programmers have told them to do. Troubleshooting is easier when you realize that you have to find out how the computer was told to do something by the programmer".

At least, that works for me because when a computer does something stupid, I realize it is the coders who wrote something that (inadvertently) caused the issue. I wish I could find the article which explains it much better than I have.


737MAX is a good example of the result of software written by people who did not understand how airplanes work and airplanes flown by pilots that didn't understand how the software worked.
 
2020-05-24 1:54:46 PM  
Why do people always forget that computers are only as intelligent as the people who made them?
 
2020-05-24 1:55:43 PM  
I work on both production support and developing updates for an enterprise level billing system for insurance companies.

No machine learning algorithm is ever going to predict, let alone code for, how users will use the tools made available. What's that worn joke about building a foolproof system?

As long as humans are interfacing with the software, you are going to want humans developing it.
 
2020-05-24 2:07:13 PM  

Gubbo: Xcott: I remember reading that same argument in an issue of Compute! in the early 1980s.  Pretty soon computers would write their own programs with program-writing programs, and programmers will be obsolete.  Just as soon as Apple declares bankruptcy and everyone switches to Dvorak keyboards.

I think this ignores how shallow AI really is.  AI structures like CNNs and deep learning systems are very low-level feature-processing systems, and they have to be fit to solve a larger problem like a gear has to be fit into a clock.  They do not start working on their own, and they do not solve general problems.

Most notably, they are terrible at exactly the one thing programmers have to do:  guarantee correctness for all corner cases that do not appear in the typical inputs and outputs.

AI is a lie. Expert system or machine learning is more appropriate.

AI is the term you use when you want media attention.


[image]
 
2020-05-24 2:07:38 PM  
India takes my job more than anything else.  They are cheaper and no one cares if the software actually works 100% as long as the visible part of the job is completed.  They want it fast and ready now.  They can't wait 6 months for a development cycle.  Good thing I have a dual degree and can do other things, oh wait, that's being outsourced too.  Back to landscaping, since all the illegal immigrants are being deported.
 
2020-05-24 2:08:25 PM  
[image]


"I always do that ... I always mess up some mundane detail"
 
2020-05-24 2:10:44 PM  

johnphantom: Math can't deal with infinity,


Um, that's like saying "English doesn't have a word for 'purple'".

Mathematics has many devices for dealing with infinity, developed over centuries, and they address infinite concepts very rigorously.  We can "deal with infinity" as well as we can deal with numbers or deal with space.
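For instance, limits, convergent and divergent series, and Cantor's cardinal arithmetic are all rigorous machinery for infinity:

```latex
\lim_{n \to \infty} \sum_{k=1}^{n} \frac{1}{2^k} = 1,
\qquad
\sum_{k=1}^{\infty} \frac{1}{k} = \infty,
\qquad
|\mathbb{N}| = \aleph_0 < 2^{\aleph_0} = |\mathbb{R}|.
```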
 
2020-05-24 2:18:59 PM  

unixpro: Software is stupid, and so is Jack Dorsey.

I have been a programmer for over 40 years.  I've heard about Fifth Generation Languages, intelligent compilers, learning systems, and all kinds of 'replace you' technologies for most of that time.  How much has come to fruition?  Very little.

Computers do *exactly* what they're told to do, and *nothing* else.  They do not see side-effects that could turn into beneficial discoveries -- they see errors and data that must be discarded or repaired.

Jackie boy is talking about automatic code generators displacing entry level workers.  If the CEO of a high-tech startup can't see his way past that, he should go ahead and retire now.  Automatic code generators will produce high-quality, well-documented code that does exactly what the specs said.  Know where that leaves the average programmer?  Happy.

I spend the vast majority of my time finding simple errors in deep code.  An inverted IF statement, for example, or a + that should have been a -.  Sometimes it's wrong in the spec, but most of the time it's a simple misunderstanding or just a plain old brain fart.  Irrespective of the cause, these things take a great deal of time to find, and I'm not cheap.

Automatic code generators will, in all probability, significantly reduce the amount of time I and others like me spend doing this sort of debugging.  Whatever will I do with all my free time?  Might I spend it doing things the computer can't do -- things that require imagination and the ability to think abstractly?  The code is not the Program any more than the nails and wood are the home.

It means the OS and language wars are finally over.  It means that colleges and universities can stop teaching useless things like language and OS syntax, and focus on logic and algorithmic development.  It is the evolution of the programmer from code monkey to Engineer, and it is time.

If Jack Dorsey can't see this, he doesn't deserve his position.  CEOs, especially tech CEOs, must have vision, and statements such as these demonstrate his vision is severely limited.


Yeah, as another experienced developer, I also really agree with this post.

Most of my time writing software...is spent figuring out what I want it to do, not actually typing code. Any tool that can write more of the code for me automatically...just increases the power of what I can do, but it doesn't figure out if I'm solving the right problem or not.

I do agree that a lot of the more junior or otherwise mediocre developers may need to worry about their jobs, if they aren't able to adapt to solving bigger problems than just "typing out code". Those kinds of people will probably be needed less and less in the future, and will probably be paid less and less, too. I do think that "code monkeys" need to worry.

But as a more senior dev, I'm happy to get those people out of the industry. Most of the time they are actually a productivity drain on me because I have to hand-hold them or spend a lot of time finding their mistakes, refactoring their poorly planned out code, etc.

I'm still skeptical of "AI" as a total replacement for programmers, though. That claim has been made before and it hasn't really panned out. I do think it will make more inroads in certain types of areas (optimization, classification, etc...problems that are amenable to big data). I don't think it will replace the entire job.
 
2020-05-24 2:21:22 PM  

unixpro: Fair enough, but (1) I'm talking about an ideal, not the actual, and (2) we don't require music students to know how to build a piano, or even how to tune one.


I'm not sure exactly where that analogy lands, but neither do we require CS students to know how to build a computer, or do much of anything at the hardware level.  A typical CS degree has a single course on architecture, and it's mostly at a level of "here's how it works from 10,000 feet up, featuring the parts that will matter to you as a software person."

Writing code is much more like playing a piano rather than building or tuning one, and both involve mastery through time spent in practice.  Indeed, music programs give us a very important analogy for CS programs.  Early CS programs were put together by math and science faculty, and were very coding-heavy; later they were taken over by CS faculty who wanted to focus on the abstract and fascinating stuff more, viewing coding as tech-college stuff.  That's fine, but without lots of coding it runs the risk of creating students who learn ideas without technical mastery, ideas that they can't apply or even grasp at a concrete and practical level.
 
2020-05-24 2:23:37 PM  

DeArmondVI: I work on both production support and developing updates for an enterprise level billing system for insurance companies.

No machine learning algorithm is ever going to predict, let alone code for, how users will use the tools made available. What's that worn joke about building a foolproof system?

As long as humans are interfacing with the software, you are going to want humans developing it.


What I'm imagining is a world where a PM just maintains a giant XML or JSON file that specifies the intended behavior of the software, then the AI figures out how to actually implement it.

That said, for all of our bullshiat, the ML systems I've seen in real life are nowhere near that level.
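A toy sketch of what such a behavior spec might look like; the format and every field name here are invented, not any real tool's:

```python
import json

# Hypothetical spec a PM might maintain; all names below are made up.
SPEC = json.loads("""
{
  "feature": "password_reset",
  "given": {"account_exists": true, "token_age_minutes_max": 30},
  "then": {"send_email": true, "invalidate_old_sessions": true}
}
""")

def token_is_valid(age_minutes: float) -> bool:
    # Even here, someone must pin down the details the spec leaves open:
    # is the boundary inclusive, which clock do we trust, what units?
    # That pinning-down is, of course, programming.
    return age_minutes <= SPEC["given"]["token_age_minutes_max"]

assert token_is_valid(30)        # boundary read as inclusive
assert not token_is_valid(30.5)
```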
 
2020-05-24 2:26:15 PM  

Xcott: unixpro: Fair enough, but (1) I'm talking about an ideal, not the actual, and (2) we don't require music students to know how to build a piano, or even how to tune one.

I'm not sure exactly where that analogy lands, but neither do we require CS students to know how to build a computer, or do much of anything at the hardware level.  A typical CS degree has a single course on architecture, and it's mostly at a level of "here's how it works from 10,000 feet up, featuring the parts that will matter to you as a software person."

Writing code is much more like playing a piano rather than building or tuning one, and both involve mastery through time spent in practice.  Indeed, music programs give us a very important analogy for CS programs.  Early CS programs were put together by math and science faculty, and were very coding-heavy; later they were taken over by CS faculty who wanted to focus on the abstract and fascinating stuff more, viewing coding as tech-college stuff.  That's fine, but without lots of coding it runs the risk of creating students who learn ideas without technical mastery, ideas that they can't apply or even grasp at a concrete and practical level.


Also agree with this a lot. I've run into this more and more in new young software dev hires. A lot of them understand a lot of big concepts like AI, machine learning, etc...but cannot actually mechanically put a program together without lots of handholding and assistance. All that high level knowledge is useless and can't be applied if they haven't also put in the thousands of hours of practice required to grasp all the "nuts and bolts".

The best devs are the ones who both understand the big picture concepts and are still hands-on enough to actually build their concepts and get all the details right.
 
2020-05-24 2:29:24 PM  

zang: DeArmondVI: I work on both production support and developing updates for an enterprise level billing system for insurance companies.

No machine learning algorithm is ever going to predict, let alone code for, how users will use the tools made available. What's that worn joke about building a foolproof system?

As long as humans are interfacing with the software, you are going to want humans developing it.

What I'm imagining is a world where a PM just maintains a giant XML or JSON file that specifies the intended behavior of the software, then the AI figures out how to actually implement it.

That said, for all of our bullshiat, the ML systems I've seen in real life are nowhere near that level.


And that is...programming. It's still a language used to specify computer behavior...it's just a declarative language rather than an imperative one. That is not at all a new concept (see, e.g., Prolog, or even SQL). You can call it a "PM"'s job, or you can call it programming, but if you are creating specification then you are developing software. The language used to specify it doesn't change that.
 
2020-05-24 2:30:04 PM  
We may very well end up with systems writing systems. It just means programmers will move on to writing the systems that verify the behavior of the systems the system wrote.

This doesn't sound cheaper to me.
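One plausible shape for those verification systems is property-based testing. A sketch using the hypothesis library, with generated_sort standing in for machine-written code:

```python
from collections import Counter
from hypothesis import given, strategies as st

def generated_sort(xs):
    # Stand-in for code "the system wrote"; we verify behavior, not authorship.
    return sorted(xs)

@given(st.lists(st.integers()))
def test_generated_sort(xs):
    out = generated_sort(xs)
    # Property 1: output is ordered.
    assert all(out[i] <= out[i + 1] for i in range(len(out) - 1))
    # Property 2: output is a permutation of the input.
    assert Counter(out) == Counter(xs)
```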
 
2020-05-24 2:31:48 PM  
Speaking as a software engineer and now a project manager, if the AI writes exactly to spec then it won't do anything the customer or the product manager/sales wants.  This sounds corny, but creating software, like any product, is collaborative.  There are so many things that spec writers don't think of.

Jack is just thinking that since 2/3rds of Twitter is just bots retweeting other bots that he can have bots grabbing bits of code from other bots to create a cohesive product.

Bots will be writing all the code around the time bots will be replacing carpenters.
 
2020-05-24 2:36:02 PM  
All engineers in air-gapped government work, raise your hands! You're immune, kids! But you have to work in the office during pandemics. Trade-offs.
 
2020-05-24 2:37:51 PM  
Weren't they saying that about IDEs having wizards and whatnot?

/   Complete and utter twaddle.
//  What AI is going to write a driver module?
/// Who's going to trust an AI to make software for a pacemaker?
 
2020-05-24 2:38:46 PM  

Rapmaster2000: Speaking as a software engineer and now a project manager, if the AI writes exactly to spec then it won't do anything the customer or the product manager/sales wants.  This sounds corny, but creating software, like any product, is collaborative.  There are so many things that spec writers don't think of.

Jack is just thinking that since 2/3rds of Twitter is just bots retweeting other bots that he can have bots grabbing bits of code from other bots to create a cohesive product.

Bots will be writing all the code around the time bots will be replacing carpenters.


Beyond just that (and I agree), there is also the problem that most non-devs don't actually know what they want from software. They mostly just have a vague idea in their head of what they want but they don't know any of the details. And even if they do know what they want they lack the language skills to be able to precisely specify and communicate what they mean. Most are not even used to thinking in a systematic or precise way.

That's part of why I'm very skeptical that AI can replace human development...most non-technical people are not great at saying what they want.
 
2020-05-24 2:39:44 PM  

unixpro: Might I spend it doing things the computer can't do -- things that require imagination and the ability to think abstractly?



Not if you're stuck in one of those loud-ass open offices you won't.
 
2020-05-24 2:44:59 PM  

unixpro: If Jack Dorsey can't see this, he doesn't deserve his position. CEOs, especially tech CEOs, must have vision, and statements such as these demonstrate his vision is severely limited.


The vision of what this does to share prices and investor confidence is extremely clear. That's the only thing a CEO needs to be able to manage.

In any case, ML systems aren't going to replace programmers any more than they're going to replace anything else. At most, they're force multipliers, and at worst the entire industry is snake oil, and the last one is mostly true.
 
2020-05-24 2:47:46 PM  

Mnemia: zang: DeArmondVI: I work on both production support and developing updates for an enterprise level billing system for insurance companies.

No machine learning algorithm is ever going to predict, let alone code for, how users will use the tools made available. What's that worn joke about building a foolproof system?

As long as humans are interfacing with the software, you are going to want humans developing it.

What I'm imagining is a world where a PM just maintains a giant XML or JSON file that specifies the intended behavior of the software, then the AI figures out how to actually implement it.

That said, for all of our bullshiat, the ML systems I've seen in real life are nowhere near that level.

And that is...programming. It's still a language used to specify computer behavior...it's just a declarative language rather than an imperative one. That is not at all a new concept (see, e.g., Prolog, or even SQL). You can call it a "PM"'s job, or you can call it programming, but if you are creating specification then you are developing software. The language used to specify it doesn't change that.


I think we can both agree that when a PM hands me a 1-pager spec for a feature, that's not programming.  Suppose they hand the same spec to an AI - is that programming?  I would say it's not.  Now if the spec was written in a standardized form that was more machine readable, but it conveyed exactly the same information, you're calling that programming?
 
2020-05-24 2:50:10 PM  

t3knomanser: unixpro: If Jack Dorsey can't see this, he doesn't deserve his position. CEOs, especially tech CEOs, must have vision, and statements such as these demonstrate his vision is severely limited.

The vision of what this does to share prices and investor confidence is extremely clear. That's the only thing a CEO needs to be able to manage.

In any case, ML systems aren't going to replace programmers any more than they're going to replace anything else. At most, they're force multipliers, and at worst the entire industry is snake oil, and the last one is mostly true.


Yeah. It's mostly just working at a higher level of abstraction, which might mean that people can create larger scale things. But that actually might require more expertise, not less, because it's also more powerful and more dangerous.

I think some of these CEO types are just blinded by their wet dream of not needing to pay smart people anything anymore. That biases their thinking about it. My experience with tools and frameworks that work at higher levels of abstraction has been that although they may "save time"...they don't actually eliminate the need for expertise. They just move that need to a different problem. They're just bigger building blocks.
 
2020-05-24 2:52:54 PM  
Good luck getting those AIs to understand 30-year-old legacy code; they would probably commit suicide, or genocide, trying to figure that stuff out.
 
2020-05-24 2:55:02 PM  

zang: Mnemia: zang: DeArmondVI: I work on both production support and developing updates for an enterprise level billing system for insurance companies.

No machine learning algorithm is ever going to predict, let alone code for, how users will use the tools made available. What's that worn joke about building a foolproof system?

As long as humans are interfacing with the software, you are going to want humans developing it.

What I'm imagining is a world where a PM just maintains a giant XML or JSON file that specifies the intended behavior of the software, then the AI figures out how to actually implement it.

That said, for all of our bullshiat, the ML systems I've seen in real life are nowhere near that level.

And that is...programming. It's still a language used to specify computer behavior...it's just a declarative language rather than an imperative one. That is not at all a new concept (see, e.g., Prolog, or even SQL). You can call it a "PM"'s job, or you can call it programming, but if you are creating specification then you are developing software. The language used to specify it doesn't change that.

I think we can both agree that when a PM hands me a 1-pager spec for a feature, that's not programming.  Suppose they hand the same spec to an AI - is that programming?  I would say it's not.  Now if the spec was written in a standardized form that was more machine readable, but it conveyed exactly the same information, you're calling that programming?


Essentially, yes.

It's not like computers actually execute languages like C#, or Java, or even assembly language, directly. All of those things are just abstractions that let people interact with the machines at a level that is closer to human comprehension, in order to specify what we want them to do. A standardized, machine-readable specification language is a programming language, for all intents and purposes. Like I said, not all languages are command-based. The example I gave (SQL)...lets you specify what you want back from a database. The database engine then figures out what algorithms to apply to efficiently retrieve that. It's not really different from what you are saying. The hard part is figuring out what you want, not how to do it.
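The SQL point in one runnable sketch, via Python's built-in sqlite3: the query states what result is wanted, and the engine's planner decides how to compute it.

```python
import sqlite3

db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE orders (id INTEGER, customer TEXT, total REAL)")
db.executemany("INSERT INTO orders VALUES (?, ?, ?)",
               [(1, "ada", 40.0), (2, "bob", 15.0), (3, "ada", 60.0)])

# No loops, no algorithm spelled out -- only the desired result set.
rows = db.execute("""
    SELECT customer, SUM(total) AS spent
    FROM orders
    GROUP BY customer
    HAVING SUM(total) > 50
""").fetchall()
print(rows)  # [('ada', 100.0)]
```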
 
2020-05-24 3:10:57 PM  
Just so it's clear...I'm not saying that a high-level specification language is necessarily a programming language in the traditional, Turing-complete sense. I'm using the term more broadly in the sense of "a language humans use to communicate their desires to a machine".
 
2020-05-24 3:18:23 PM  

Chief Superintendent Lookout: Back when everyone in the US started sheltering in place, their shopping patterns changed dramatically. That change jacked up the AI algorithms for Amazon and many others.


And? Human decision-making is equally jacked up. Maybe worse because people cling to biases when the inputs change.
 
2020-05-24 3:18:49 PM  

Xcott: johnphantom: Math can't deal with infinity,

Um, that's like saying "English doesn't have a word for 'purple'".

Mathematics has many devices for dealing with infinity, developed over centuries, and they address infinite concepts very rigorously.  We can "deal with infinity" as well as we can deal with numbers or deal with space.


Then explain what Professor Steane wrote.
 
2020-05-24 3:23:42 PM  
When failure to use the test environment gets really serious
 
2020-05-24 3:25:56 PM  

zang: Mnemia: zang: DeArmondVI: I work on both production support and developing updates for an enterprise level billing system for insurance companies.
...
I think we can both agree that when a PM hands me a 1-pager spec for a feature, that's not programming.  Suppose they hand the same spec to an AI - is that programming?  I would say it's not.  Now if the spec was written in a standardized form that was more machine readable, but it conveyed exactly the same information, you're calling that programming?


As far as I know, chip design typically does exactly this (obviously far, far more than 1 page).  The code given can be used to emulate the chip, and the chip designers' job is to convert that into something that can be synthesized into a chip.  At least the original (and open) languages were the same, although don't expect to use any object oriented features in your synthesizable code.  The big advantage is that you can then go and verify that the chip returned can do exactly what the emulated chip is supposed to do, and do it within the timing limits (the hard part).

I'd assume the software analogue would be to hand an "AI" some working software and tell it to start refactoring.  Give yourself the gift of premature optimization when the real issue is non-scalable architecture...
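The software analogue of verifying a synthesized chip against its emulated model is differential testing: hammer the refactored code and a trusted reference with the same random inputs and demand identical answers. A sketch, with popcount as a stand-in workload:

```python
import random

def reference_popcount(x: int) -> int:
    # Slow but obviously-correct reference model (the "emulator").
    return bin(x).count("1")

def fast_popcount(x: int) -> int:
    # "Optimized" implementation whose behavior must match the reference
    # (the classic 32-bit SWAR bit-counting trick).
    x = x - ((x >> 1) & 0x55555555)
    x = (x & 0x33333333) + ((x >> 2) & 0x33333333)
    return (((x + (x >> 4)) & 0x0F0F0F0F) * 0x01010101 >> 24) & 0xFF

# Same random inputs into both; any mismatch is a refactoring bug.
for _ in range(10_000):
    v = random.getrandbits(32)
    assert fast_popcount(v) == reference_popcount(v), v
```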
 
2020-05-24 3:28:02 PM  

grimlock1972: An A.I. might be able to write the code for the behind-the-scenes stuff that makes a piece of software function, but it would be useless where real creativity is required.  Also, I suspect that more often than not A.I.-written code might be functional but would leave many places for refinement and improvement. Also: bugs.


What, in your opinion, is the cause of creativity?
 
2020-05-24 3:29:39 PM  

mcreadyblue: skinink: unixpro: Software is stupid, and so is Jack Dorsey.

...Computers do *exactly* what they're told to do, and *nothing* else.  They do not see side-effects that could turn into beneficial discoveries -- they see errors and data that must be discarded or repaired...

Years ago when I had begun assembling my own desktops and troubleshooting them, I remember reading something similar on a troubleshooting site. But it went a little further and to paraphrase, "computers only do what the programmers have told them to do. Troubleshooting is easier when you realize that you have to find out how the computer was told to do something by the programmer".

At least, that works for me because when a computer does something stupid, I realize it is the coders who wrote something that (inadvertently) caused the issue. I wish I could find the article which explains it much better than I have.

737MAX is a good example of the result of software written by people who did not understand how airplanes work and airplanes flown by pilots that didn't understand how the software worked.


Not so on the first part... MCAS, feature-wise, works just fine. The failures there were getting proper (or any) training to the pilots, and the lack of use of the secondary AoA sensor.
 
2020-05-24 3:32:28 PM  

Marcus Aurelius: Software won't be relevant any more?  That work is getting farmed out to the low wage countries, Jack.  And you know it.


GOOD! I am so tired of code jockeys acting as if they're superior. Thanks for your Pepe and Groyper 4chan crap! It's sooooo hilarious watching The Normies squirm. Cruelty is the point! Let's hang out at cruel dot com and rotten dot com and then talk about Fortran! Bunch of malevolent slimebags, the lot of them. I already LAUGH at the code jockeys who once made half-a-million bucks a year from spitting out a few lines of code and are now making less than school teachers. Good. Farking. Riddance.

Get a job!

/ A real job!
 
2020-05-24 3:44:56 PM  

kkinnison: Good luck getting those AIs to understand 30-year-old legacy code; they would probably commit suicide, or genocide, trying to figure that stuff out.


Yeah...this is another really important point. There is a world of difference between shiny greenfield development with all the latest fancy tools, and the work that a vast number of software developers actually do every day (which is the maintenance of creaky old legacy software and systems). Lots of things that would be "easy" in a more modern environment are very hard in the environments where people actually work. We haven't even killed off COBOL or Visual Basic, for farks sake. These things have a tendency to stick around, complicating things, for much longer than intended. It doesn't help that large businesses frequently underestimate the impact that legacy issues have.
 
2020-05-24 3:46:51 PM  
Sooner or later, sure, we'll probably have self-programming programs.

Guess I'd better just continue to make my money now.
 
Displayed 50 of 125 comments


This thread is closed to new comments.



