
(Telegraph)   How to code, step one; learn to ^c^v   (telegraph.co.uk)
    More: Advice, algebra, baking sodas, maths, digitizations, calculus  

6578 clicks; posted to Geek » on 09 Dec 2012 at 5:25 PM




Archived thread

 
2012-12-10 02:17:08 AM  

Jay CiR: Heh,

I started teaching myself PHP and ASP just because I had some web-based ideas, and I wanted to figure out how to make them happen. Since then, I've been working at coding more efficiently, whenever I can. It's been a great learning experience, but, having started in, and working only within, a web browser-only medium (I already had several years of HTML under my belt when I started with PHP and ASP), and speaking to other developer friends about some of the work they do (Ruby on Rails, for one, comes up a lot), I feel like I'm missing out on something (actually, a lot more than just "something").

Any recommendations for a curious beginner like myself, looking to break out of web-oriented programming?


Learn one high level (C) and one low level language (assembly, be it ARM or x86 or whatever). Get a solid foundation in data structures and algorithms first - where you really understand what the system is doing. Then learn a language with object oriented extensions (e.g. C++).

Remember, object oriented programming is a design methodology, not a language. You can do object oriented programming in pretty much any language. I also wouldn't recommend starting out with C++. It has all kinds of potential ugly habit generators (e.g. try/catch/throw blocks, declaring variables wherever, etc...).
 
2012-12-10 02:34:38 AM  

Doc Ok: I was a self-taught programmer during high school, and wrote and sold (small) business software. I thought I was the hottest stuff ever. Then I started information science / computer science at university, and realized that I was total shyte before.

So if anybody tells me that being a good programmer doesn't require any formal education or knowledge, I nod politely and ignore them afterwards.


It doesn't require a formal education, but that certainly helps. I think the problem with 'self-taught' is that people read the beginners books and learn how to program, but never learn how to be a programmer. It's basically the same way that you can learn to write, but that is a long way from learning to be a writer. If literacy were all it took to be an author, we'd all be Hemingways. Likewise, being able to read and write code is the first step on the path to being a programmer, not the last. I think too many people don't see that, they reach a point where they think they're "done".

Further education will help, or many, many years of trial, error, and introspection (i.e. the wisdom to realize when you've coded yourself into a corner, and the imagination to see how you should have done things differently). But there are still good self-taught options, you just need to be committed to self-improvement.
 
2012-12-10 02:49:56 AM  

Jay CiR: Any recommendations for a curious beginner like myself, looking to break out of web-oriented programming?


One reason I like the Microsoft .NET Framework is that it runs in a whole load of ways. You can build console applications, desktop applications, services, web services, and websites, and it has a load of supporting libraries for each. I can apply my C# skills to any of them, or import one of my own libraries (e.g. for validation) from a web project into a service.

Sure, it means you are basically stuck on Microsoft kit, but for me, that's a small price to pay for all the goodies I get.
 
2012-12-10 03:10:12 AM  

Happy Hours: Swoop1809: Rockstone: Swoop1809: Hopefully I transition back into development in the near future, preferably not in COBOL which is what I am doing business analysis for, once our services are re-developed into Java.

People keep telling me we don't use COBOL anymore, but I know that isn't true.

COBOL is still everywhere large amounts of data need processing. Banks, UPS, Amazon all have mainframes that utilize COBOL. It isn't going away in the near future either. The company I work for is phasing it out because no one my age (23) learns it anymore, we learn object oriented languages, and the old guard developers are getting expensive. Most COBOL developers we have are contractors or offshore.

Interesting - I've never touched COBOL and haven't even been aware of it being used anywhere I've worked in the last decade or so.

I have a very bad impression of COBOL programmers. It's probably not fair for me to think of them all that way, but the ones I did deal with in the past were absolute morons and lazy as well.


The reason most COBOL programmers are morons is that they either don't want to put the effort into learning a harder language, or they've burnt so many bridges that employers only keep them around because they know a dinosaur language. COBOL is way simpler than QBasic and anyone can learn COBOL... nobody wants to, though.
 
2012-12-10 03:11:24 AM  

Jay CiR: Heh,

I started teaching myself PHP and ASP just because I had some web-based ideas, and I wanted to figure out how to make them happen. Since then, I've been working at coding more efficiently, whenever I can. It's been a great learning experience, but, having started in, and working only within, a web browser-only medium (I already had several years of HTML under my belt when I started with PHP and ASP), and speaking to other developer friends about some of the work they do (Ruby on Rails, for one, comes up a lot), I feel like I'm missing out on something (actually, a lot more than just "something").

Any recommendations for a curious beginner like myself, looking to break out of web-oriented programming?


GIANT WALL OF TEXT WARNING.

(Keep in mind that I'm a college senior with a couple internships, so 1) the older guys might have better advice, and 2) this all comes with a big "In my limited experience" caveat)

My experience has been that there are 2 camps of programming languages (and I'm generalizing like MAD). Scripty (of which web is a subset, so this is your LISP, Python, Perl, Ruby (on rails), JS, HTML, etc.), and Heavy (mostly languages that evolved from C, so C/C++, JAVA, C#, etc). Scripty is usually an interpreted language that runs inside of something else, is extremely flexible, weakly and dynamically-typed, and very agile. Heavy is usually a compiled language, with odd constraints because of that. Heavy also lends itself well to big, carefully-architected projects based around OOP. They're also way less flexible (Try doing a Ruby Hash in C++, for example). They both work, and they both have different metaphors, and canonical ways of writing code.

You already have a fairly strong basis in the scripty stuff, but most of the not-web stuff is written in heavy languages. So (and keeping in mind that I'm giving you the same advice as a freshman, who knows nothing about coding, so you may already know some of this):

0) Get out of the IDEs for at least steps 1-3.

Pull up a Linux VM (Mint is quite good and easy to learn), write in Emacs/Vim (And there will be an UNHOLY flamewar about which of these to use. I use vim/gvim because my mentor used vim. If you use vim, pull up vimtutor in the command line FIRST to learn WTF you're doing. Either way, my personal advice is to make your desktop background and screensaver lists of useful commands for your particular editor until you've learned all the commands), compile with command line gcc/g++ and makefiles, and run from the command line. When you start using malloc()/free() or new/delete, run everything in Valgrind to make sure you have no memory leaks. Do it yourself, so you know what's going on.

1) Learn a couple of heavy languages.

C/C++ would be my preferred one. C++ syntax sucks a lot, but having to explicitly use pointers and references vs. values, overload operators when necessary, and manage your own memory is really critical to understanding what is going on at a low level, and JAVA/C# abstract a lot of that stuff away as part of being managed code (esp. the all-important memory management).

Write some pure C on the side to do some procedural stuff (It really is a different paradigm than OOP), realize just how dangerous pure C is, and how awesome C++ with the standard library (esp. C++ 11 with all the new stuff and extra standard libraries as part of the language) really is (std::string vs. null-terminated char *, vector vs. array., templating vs. unsafe void * casting, etc).

As part of this, learn how to use const properly in C++. There's like 4 places const can go in a function definition, and it's REALLY, REALLY HARD to kinda use const, because the compiler tends to throw a fit (that you often can't do a thing about or even figure out what it's yelling at you about in the first place), so you either don't use it at all, use it correctly from day 1 of your project, or devolve into a giant muddle of calls to const_cast.

2) Learn pointers. Know Pointers. Love Pointers. (Start using C here)

I had 15 different interviews this fall ask me the question about how to reverse a singly-linked list (which is really easy if you get pointers, and really insane if you don't). Learn what the difference is between Clock * c1ptr = c2ptr, and Clock c1 = c2; Be able to think at multiple levels of abstraction easily. Be able to avoid segfaults (and remember that segfaults are a good thing. The alternative is messing up your heap and stack, and you NOT KNOWING, and getting very, very subtle bugs as a result).

You either can do this or you can't. If you can't do this, give up and go back to web programming, because you'll never be a good "heavy programmer" since EVERYTHING is based around this.

While you're doing this, learn Angry C (also known as "WTF" C). Know all the stupid tricks about Operator Precedence and how to use them. Know why "while(*dest++ = *src++);" copies a null-terminated c-string. Use the ternary operators. Even better, use nested ternary operators (1 == x ? foo() : 2 == x ? bar() : baz();). Put assignment in a while loop's condition (And then after you do that by accident a few times ('=' vs '=='), ask yourself why I did 'value' == 'variable'). Use case fallthrough in interesting ways. Try and do as much as possible in one line of code. (Don't keep doing any of this beyond this step though. Even if it works, your fellow programmers will lynch you. Think of it as a rarely-used tool in the toolkit. Just because you CAN use the comma operator or Duff's Device doesn't mean you SHOULD.).

I'd also write a couple of type-independent data structure implementations using void * just to learn how to deal with casting and the utter lack of type-safety.

3) Learn algorithms and data structures. (Start using C++ here)

If at all possible, do this in a formal setting, like a college course where you have to implement various data structures on your own.

For algorithms, LEARN BIG-O. Learn why O(1) vs. O(lg n) vs O(n) vs. O(????) matters. Know the Big-O time and space complexity of every algorithm you run across. Know how to at least kinda guess at the Big-O of an algorithm. KNOW ALL THE SORTS (lot of interview questions here. Know quicksort, merge sort, insertion sort at minimum, and be able to crank out code on command). Learn how to optimize your code for time and/or memory.

For data structures, write your own singly-linked list, doubly-linked list, vector, map, set, unordered_map (also known as hashtable), and deque classes in C++ with pointers and manual memory. Make them all templated, and make them all have iterators. Learn their Big-O's for lookup, insertion, and removal, and know what that makes them good for. EVERY SINGLE INTERVIEW I had at companies that I would have wanted to work at had AT LEAST 1 question that was "Tell me about Data Structure X". The company that I accepted at had a 2-hour interview (as part of an all-day interview) that was nothing but me and a guy with a PhD in data structures shooting the breeze, and filling up 2 separate whiteboard walls with scribbles, notes, thoughts, etc. Data structures are utterly critical.

4) Learn OOP.

Inheritance; multiple (and diamond. Even though this is frowned upon, if you know how to make it not-screwy, it's a valuable, if seldom-used tool) inheritance; composition vs. inheritance; public, protected, private; abstract classes; virtual functions; dynamic casting, all that good stuff; This is the toolkit that you'll be applying when you...

5) Learn software architecture.

You probably already know this from your web experience to a certain extent. Learn the design patterns you don't know (Singleton, Factory, and MVC are probably your big ones here.). Know when to use them, what they're good for, etc. Know how to decouple a class from another class, and have discrete pieces of functionality in separate places in the code. BEST CODE IS NOT FASTEST CODE. BEST CODE IS MAINTAINABLE AND EXTENSIBLE CODE. Also, at some point in here, learn the various std::???_ptr classes. They're really cool, really powerful, really dangerous if used improperly, and this is about the point where you'll really stop having a single point in the code where you can say "Yeah, I'm done with that pointer, and it can be deleted there".

6) Possibly at some point, learn about multi-threading and/or assembly.

For multithreading, Locks, deadlock, livelock, the up and downsides of pure and readwrite locks. I wouldn't necessarily recommend writing your own threading library like I had to for class, but just learning the basic concepts, and how to use the canonical library for your preferred language probably isn't a bad idea.

I've honestly never, ever had to learn or use actual assembly. I had a class with a minimally Turing-complete assembly language for which I had to write several C implementations that implemented various versions of the architecture, and they taught us about pipelining (aka: Why Lots of Jumps are Bad), and some of the basics of architecture and microarchitecture, but you're probably going to be working at a high enough level that it's not really a huge concern. Now if you want to write compilers, go the embedded systems route, or do serious, serious optimization work, learn it and learn it well, but there are millions of programmers who have never used (or even heard of) assembly, and are totally fine, good, competent programmers.

7) When you're done with C++, go over and learn either C# or Java.

They remove a lot of stuff from C++, make the syntax a lot nicer, and add a couple things I really like. They also:
*Make everything an object, and pass around references, so that there are no pointers, period.
*Don't require manual memory deallocation, so that you can new() to your heart's content.
*Remove multiple class inheritance, but add these things called Interfaces, which are basically fully abstract classes.

This does some odd things to the metaphors that C++ was allowing you to use ("I am a" vs. "I can do"), which does some weird things to "the correct way of writing code" and knowing what these changes are is critical. It's a subtly different way of thinking. Nothing like the huge transition you're making from the scripty side, but it's there.

Resources:

The definitive C89 book: The C Programming Language, 2nd Edition (called Kernighan and Ritchie): Wikipedia Amazon

The definitive C++98 Book: The C++ Programming Language (3rd Edition) by Bjarne Stroustrup: Wikipedia Amazon

Note that new editions of the above books are coming out soon, because we just got new versions of the languages. (C11, and C++11)

The definitive Design patterns book: Design Patterns: Elements of Reusable Object-Oriented Software (also called Gang of 4): Wikipedia Amazon

My class's algorithm book: Introduction to Algorithms, CLRS: Wikipedia Amazon

The class I took that was really good at teaching C/C++/software architecture (and is a requirement to get looked at by Google) was EECS 381 at the University of Michigan. I believe this link is open to everybody, and for the next week or so, all the projects are up, so save it all to your computer (The professor puts the projects up as the semester goes on). Project 1 is C, Project 2 is C++ without the standard library (You reimplement std::string and std::list), Project 3 is the standard library, and "algorithm" (You're not allowed to use loops at all), Project 4 is architecture, project 5 is smart pointers that auto-delete memory, project 6 is the final exam. Also, since I took it, I believe that he rewrote the course for C++11 (Link)
 
2012-12-10 03:49:44 AM  
I tried to learn C with this and all I ended up doing was flying around with the jetpack.
http://geekynicki.edublogs.org/files/2009/09/ceebot.png
 
2012-12-10 04:26:11 AM  

JustSurfin: But - COBOL remains popular because it supports a "decimal" data type so financial calculations actually come out right. Try representing 0.33 in binary.


Essentially every programming language has support for fixed precision arithmetic; it may not be part of C89 but it's something you could copy and paste into your project without any hassle. And for applications like you present here, where the precision is small and consistent, it's pretty trivial to just use ints internally and scale the input/output.

No, as others have mentioned, the reason COBOL is still in use is simply because early adopters of computer systems (finance, insurance, etc.) wrote programs that they've never replaced. There's still work in COBOL, but not for new systems -- it's 100% maintenance of ancient tech.

COBOL isn't better for the job, and in a lot of ways it's worse. For example, on most medical/dental claims there can be a maximum of 8 services per claim; more than that and they have to start a second claim. That's a limitation straight out of 1962 when some COBOL programmer wrote a fixed-position flat file with 1 claim per line and slots for exactly 8 service items. COBOL encourages that kind of programming; you don't need a database because COBOL treats all files as databases, but it treats them as really terrible single-table databases. In the intervening years some companies have wedged in DB2 or somesuch in place of actual files (though not all of them; I have clients with millions of medical claim records in such files) but the limitation remains because the program still thinks it's dealing with the same fixed-position files someone with a poor grasp of data management invented 50 years ago.
 
2012-12-10 04:57:13 AM  
Coding is simple and consists of only the following activity: pressing keys on a keyboard.
Of course, the trick is pressing them in the right order.

Or is it number arranging? In the end, a program is (and/or becomes, once compiled or processed) a long string of numbers. It seems that the actual arranging is the tedious job so we use high-level languages to describe how we want the numbers arranged.

Also, do programs really exist when they're not running?
 
2012-12-10 05:45:19 AM  

meyerkev: GIANT WALL OF TEXT WARNING.


Your comment is interesting to me, but I'm fairly drunk - I'm going to open it in a new tab and maybe read it tomorrow.

At first I was going to make some snarky comment like there are 2 types of programmers (Me and everyone else), but you might have something worth saying.
 
2012-12-10 06:05:41 AM  

midigod: FTA: "So in most cases you can see that the hard maths (the physical and geometry) is either done by a computer or has been done by someone else. "

Wow, what great career advice. "Don't worry, someone else will do the hard part." Holy crap.


Isn't that basically the mantra of every web developer? :P
 
2012-12-10 06:42:03 AM  
MrEricSir: SineSwiper: If you don't need portability, STOP USING JAVA!

Portability isn't a good reason to use Java. Not even in the top 10.


Seriously. Who is using Java on the server mainly for portability reasons? 

/funnily enough - given Java's reputation - I sometimes prefer to use it on the web for performance reasons
 
2012-12-10 07:19:19 AM  

Roman Fyseek: hershy799: only ran in serial.

My office runs this massively parallel computing thingy. I was tearing my hair out over how the server knew when to feed more data to all the processes and when I finally asked the guy that wrote it, I felt stupid.

(hint: the server doesn't feed to the processes. The processes ask for more when they're finished with their current batch)

It's little shiat like that that will drive coders nuts and leads to wildly inefficient programming. Seriously, teaching yourself to code is easy. Learning to code right requires a qualified instructor.

Or, being Babbage or something.


Or a logical brain with forethought.
 
2012-12-10 07:23:14 AM  

Professor Science: learn to ^c^v

Only be sure always to call it please "research."


Imitation is the sincerest form of programming.
 
2012-12-10 07:32:50 AM  

rosebud_the_sled: //SQL is not programming


I agree, but there are some SQL statements of such incredible length and complexity that I have dealt with that it's pretty close once in a while.

meyerkev: GIANT WALL OF TEXT WARNING.

(Keep in mind that I'm a college senior with a couple internships, so 1)the older guys might have better advice, and 2)this all comes with a big "In my limited experience" caveat)


A lot of that stuff you mentioned is unnecessary depending on what type of programming job you're going after. There are thousands of business logic programming jobs out there that not only will you not be asked in an interview about pointers or data structures, but the interviewers themselves will find it hard to recall how to manage a linked list. C# and VB.Net type jobs with a lot of db calls are what I spent years doing and it was mostly pretty straight forward. OOP is used, but often not extensively and at times its usage is contrived in business apps.

Really, for most businesses I worked with, they never touched C++. The time it took for development was too costly, with almost no payoff for the type of applications they wanted.
 
2012-12-10 07:33:08 AM  
CSB time: I was working on a project in the recent past in which a previous coder (I make a distinction here between coder and programmer) had copy/pasted the same code 40 times within the same function, making small changes each time. I about fell out of my chair when I saw it. So I made a few tweaks to the underlying data structure, rolled the whole thing up into a nice little loop, and reduced the function size by about 2000 lines. Of course, this is the same project where we were forbidden to use pointers (at first), so there was a lot of passing large data structures around as function arguments. And they wondered why the code was so slow...

You see that kind of thing a lot in military applications, where many of the folks come from a background in ADA and have a hard time transitioning to C/C++ - in another project, we were pressured by the customer to hire a bunch of contractors, because for some reason headcount = progress. We wound up getting about two good programmers out of it, and the rest just caused more problems than they solved.

\ Had a professor once who was fond of saying "CPP does not stand for copy/paste programming!"
\\ Also, "Don't try to outsmart the compiler"
 
2012-12-10 07:34:22 AM  

Sum Dum Gai: Doc Ok: I was a self-taught programmer during high school, and wrote and sold (small) business software. I thought I was the hottest stuff ever. Then I started information science / computer science at university, and realized that I was total shyte before.

So if anybody tells me that being a good programmer doesn't require any formal education or knowledge, I nod politely and ignore them afterwards.

It doesn't require a formal education, but that certainly helps. I think the problem with 'self-taught' is that people read the beginners books and learn how to program, but never learn how to be a programmer. It's basically the same way that you can learn to write, but that is a long way from learning to be a writer. If literacy were all it took to be an author, we'd all be Hemingways. Likewise, being able to read and write code is the first step on the path to being a programmer, not the last. I think too many people don't see that, they reach a point where they think they're "done".

Further education will help, or many, many years of trial, error, and introspection (i.e. the wisdom to realize when you've coded yourself into a corner, and the imagination to see how you should have done things differently). But there are still good self-taught options, you just need to be committed to self-improvement.


I'm self-taught. Over a period of 15 years, and then 15 years of work experience. I've read, absorbed and applied hundreds of books. Worked with and learned from dozens of coders. Especially the bad ones.

Self-taught is possible. However I don't recommend it!
 
2012-12-10 07:41:27 AM  

MayoSlather: blah


Oops. Proper line spacing would have made that easier to read.

/It's early
 
2012-12-10 07:45:41 AM  

JustSurfin: Happy Hours: jimmiejaz: 10 REM "Another green for me"
20 PRINT "Green"
30 for t=1 to 300
40 CLS
50 next t
60 goto 20
1000 END

Let's rewrite that using copy/paste methods:

10 REM "Another green for subby"
20 PRINT "Green"
30 CLS
40 CLS
50 CLS
60 CLS

You get the idea.

I guess the line numbering in BASIC really farks with copy/paste

Let me rewrite that in COBOL for you.... Naw it would take a whole page of code.

But - COBOL remains popular because it supports a "decimal" data type so financial calculations actually come out right. Try representing 0.33 in binary.

  

If float math doesn't meet that requirement there is always bit shifting.
 
2012-12-10 08:53:45 AM  

farkeruk: Do you know how to tell which plugins are worth using?


This is a skill unto itself. Every time I present a tool to my team, it's at the tail end of 40+ hours of research, spiking 2-3 different technologies. Then I try to make a change to the proof of concept, and see how easy that is. Then I put it on my personal site and see if it works from my phone, then leave it up for a couple of weeks to see if it crashes. Then I present it.

A good modern developer is a good integrator. Sometimes the solution isn't code, it's a tool, or service or library or platform. The best code is the code you don't write at all!
 
2012-12-10 09:03:35 AM  
Back when I used to teach programming, I used to tell my students, "Bad software lives forever." There will never, ever be enough time or money to go back and do it right once that crappy initial version gets into production.

It was also one of my big reasons for hating Visual Basic. Time was (back when we wore onions on our belts) that you had to be a decent programmer even to write bad software. Then Visual Basic came along, and suddenly almost anyone could crank out bad software. The utterly predictable result of this was a landscape that was littered with terrible, awful, horrible software.

I once worked for a money management firm that managed billions--literally billions--of dollars of other people's money with apps that were Visual Basic 6.0 on top of Sybase/Unix. That damned front end was so brittle and fragile that you couldn't write a simple enhancement without the side effects breaking something completely unrelated. And they were stuck with it, because nobody was going to shell out the money to replace it.
 
2012-12-10 09:34:34 AM  
hello world?
 
2012-12-10 09:35:19 AM  

Cybernetic: Back when I used to teach programming, I used to tell my students, "Bad software lives forever." There will never, ever be enough time or money to go back and do it right once that crappy initial version gets into production.

It was also one of my big reasons for hating Visual Basic. Time was (back when we wore onions on our belts) that you had to be a decent programmer even to write bad software. Then Visual Basic came along, and suddenly almost anyone could crank out bad software. The utterly predictable result of this was a landscape that was littered with terrible, awful, horrible software.

I once worked for a money management firm that managed billions--literally billions--of dollars of other people's money with apps that were Visual Basic 6.0 on top of Sybase/Unix. That damned front end was so brittle and fragile that you couldn't write a simple enhancement without the side effects breaking something completely unrelated. And they were stuck with it, because nobody was going to shell out the money to replace it.


Very true. However, VB 5/6 were always flawed, and prone to breaking on non-programmer-related issues. Sometimes function calls on basic controls just didn't work right, or sometimes updates to the operating system would screw third party controls or custom .dll's. If network admins weren't communicating, they could be a development team's worst nightmare. It was a fragile development platform that produced fragile software. Those were different times though, instability was common in Windows-based client/server development, and the fact that VB 5/6 produced unstable software wasn't a surprise.
 
2012-12-10 09:44:44 AM  
There's a big difference in knowing how to code and being a programmer. I know the Chemical Element chart but that does not mean I am ready to be a chemist.
 
2012-12-10 09:46:50 AM  

traylor: My current title is "chief software engineer", I'm leading a small group of developers. 12 years of experience in image processing and neural networks (artificial intelligence if you like), many developers passed under my umbrella and I have to say, the worst kind of them were those who were good at teh maths. Why? Because they were not creative. They were thinking in equations and not in code. When they implemented a solution for a problem they could not see the shortcomings, they lacked any critical thinking because they had the mathematical proof that they had done everything correctly. They lacked the ability to make one step further and make the solution any better than it was written in the research papers and studies.

Many of them had the habit to copy-paste some code from teh net, which was also a sign of the lack of critical thinking. It may be a good practice when you are implementing industry standards, but not when you need to deeply understand the code, and not when you are expected to make something that is better than what is out there.

Of course it is an advantage if you are good at the maths. But if it is all what you've got, you should not be coding. And forget ^c^v, it will only bring problems on the long term.


Then they actually weren't good at math. The way we teach math in our country is not based on problem-solving (which is how we teach CS), so people can get straight A's in undergrad math without ever straying from the algorithms/methods given in class. Take someone who is ACTUALLY good at math and they will be a superb programmer.
 
2012-12-10 10:01:32 AM  
What this world needs is more people who think that coding is easy, and go into the industry with no real aptitude for it.

I call this "job security", as in contrast to these people, my middling skills make me look like a super star!
 
2012-12-10 10:04:49 AM  

lc6529: There's a big difference in knowing how to code and being a programmer. I know the Chemical Element chart but that does not mean I am ready to be a chemist.


What are you talking about? The project manager's kid coded a complete UI in 2 days. If he can do it, why can't you?
 
2012-12-10 10:11:04 AM  

lc6529: There's a big difference in knowing how to code and being a programmer. I know the Chemical Element chart but that does not mean I am ready to be a chemist.


Exactly... this article should be called "how to become that guy on the dev team that everyone hates".
 
2012-12-10 10:16:27 AM  

Dinjiin: JustSurfin: That was fun. Java punk. You want my job, you got a lot of studying to do.

The stupid thing is, since Java is derived from the C programming language, it wouldn't be hard for him to learn other derivatives such as ANSI-C, C++, C♯, Scala or Perl. Biggest issue would be the jump from object-oriented programming to procedural programming in ANSI-C and Perl.

If you want an alien landscape, try going from C to LISP. Holy fark, that was different.


Been there, done that.
 
2012-12-10 10:17:45 AM  

rosebud_the_sled: //SQL is not programming


Funny, I know a lot of programmers that can't write SQL for shiat.
 
2012-12-10 10:28:53 AM  

DeaH: As a person who writes those requirements...


Well, then I gotta ask: why can't the customers just take the specifications directly to the software people, huh?
 
2012-12-10 10:36:43 AM  

burndtdan: rosebud_the_sled: //SQL is not programming

Funny, I know a lot of programmers that can't write SQL for shiat.


*raises hand*

I admit, I'm farking terrible at it. Anything more complex than a join or two and my brain turns to mush. I really don't get it. Thank Bog for entity framework and linq.
 
2012-12-10 10:47:36 AM  
The article is a bit misleading: it talks about learning to code, then seems to imply that programming itself is just a lot of copying and pasting. So where does the code you are copying come from? An actual programmer with years of experience? And does this mean that a newbie armed with a self-help book on programming and a text editor is suddenly ready to write a complete web-based transaction system?

Does the newbie even understand what he/she just copied and pasted?

I know how to change the oil in my car, does this mean I can now go be a mechanic?

Let's see an article entitled "How to be a manager". Step 1 - Learn how to golf......
 
2012-12-10 11:19:11 AM  

vossiewulf: mcmnky: *facepalm*

Logic is Math. Rantity rant rant...

I didn't say NO math, I said no math beyond third grade. Or maybe fifth grade. I know a zillion very logical thinkers who manage to do so without the use of differential equations; I guess you are unlucky. And you don't need trigonometry to profile code and, based on experience and knowledge of what it's doing and what it's supposed to do, know whether it's fast enough or as fast as it could be, and then drill down on the slow functions and improve them.

Dunno guy, I manage a large group working on code that generates revenue with numbers starting with a b, everyone is quite pleased with the quality and efficiency, and none of my guys needs to take a cosine of anything. The BI guys, that's a different story.

If you're working on code that's supposed to analyze complex data, involves physics, or any other number of math-intensive applications, yes you need good math guys. But as I said originally, there are large sectors of the coding world where that isn't the case.


*sigh*

Trigonometry is good for graphics, and algebra is good for understanding basic equation functions, but what every coder needs to know is discrete structures. Knowledge of recursion, NP-completeness, induction, etc. is vitally necessary for every coder. All the coders I have ever encountered who could not even begin to explain the most basic finite mathematics were the worst coders I ever had the misfortune to work with.
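For what it's worth, here's a minimal illustration (in Python, function names my own) of why that discrete-math intuition matters at the keyboard: the same recurrence written naively is exponential time, while memoizing it makes it linear, and you only see that if you can reason about the recursion tree.

```python
from functools import lru_cache

def fib_naive(n: int) -> int:
    # Recomputes the same subproblems over and over: exponential time.
    if n < 2:
        return n
    return fib_naive(n - 1) + fib_naive(n - 2)

@lru_cache(maxsize=None)
def fib_memo(n: int) -> int:
    # Identical recurrence, but each subproblem is solved exactly once.
    if n < 2:
        return n
    return fib_memo(n - 1) + fib_memo(n - 2)

print(fib_naive(10))  # 55
print(fib_memo(30))   # 832040 -- instant; the naive version would crawl
```

Same answer either way; the difference is whether you can see the shape of the computation before you run it.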
 
2012-12-10 11:22:22 AM  

lewismarktwo: I tried to learn C with this and all I ended up doing was flying around with the jetpack.
http://geekynicki.edublogs.org/files/2009/09/ceebot.png


Bwhahahahahaha.

Now build a real robot with real motors, tires, a drive train and shiat. You will quickly find out that a pretty little turn(45), move(50) is more like turn(37), move(23) when applied to the real world.
 
2012-12-10 11:33:20 AM  
burndtdan
rosebud_the_sled:
//SQL is not programming

Funny, I know a lot of programmers that can't write SQL for shiat.


In a way, from a theoretical perspective, one could say that SQL is closer to math than to programming.

I'm also one of those programmers who isn't too good with SQL, because I usually don't need a lot of it.
In most projects I've worked on, there's a lot of very simple CRUD stuff, i.e. most database stuff boils down to "select a,b,c from d where id=foo" or you don't even have to deal with SQL for simple persistence stuff like that anymore.

I mean, should I need stuff like CUBE, ROLLUP or complicated subselects, I can look them up again and make things work.
But it will take a bit of time and fiddling around compared to e.g. some data analyst who works with databases a lot.


As the CS professor specializing in databases put it during the 30 minute long part about basic database stuff in one of my oral exams:
"I can see that you will arrive at the proper solution in finite time. However, fiddling around with SQL is neither the point nor very interesting, and we've got a lot of ground to cover and not much time. Now lets talk a bit about scheduling and serialization..."
 
2012-12-10 11:43:55 AM  
Once you learn the basics, learn how to manage data between threads. A good 90% of crashes these days happen because programmers don't properly handle data that passes between threads. The other 10% are memory leaks.

/i should know i work on code that was written by people who can't handle passing data between threads
 
2012-12-10 01:11:18 PM  

The Voice of Doom: burndtdan
rosebud_the_sled: //SQL is not programming

Funny, I know a lot of programmers that can't write SQL for shiat.

In a way, from a theoretical perspective, one could say that SQL is closer to math than to programming.

I'm also one of those programmers who isn't too good with SQL, because I usually don't need a lot of it.
In most projects I've worked on, there's a lot of very simple CRUD stuff, i.e. most database stuff boils down to "select a,b,c from d where id=foo" or you don't even have to deal with SQL for simple persistence stuff like that anymore.


I've known some programmers who are VERY good at SQL, and have a really deep understanding of things like query optimization. When you need someone like that (usually because your database performance is crap and is bogging down the entire system), they're worth their weight in gold.
 
2012-12-10 01:30:14 PM  

JustSurfin: Then he asked me how many programming languages I've used (I've been in the biz since the late 70s). I was still listing languages 5 minutes later when his eyes glazed over and his brain rebooted.


Remind me to kill myself if I ever get stuck next to you in a plane.
 
2012-12-10 01:51:59 PM  

Cybernetic: Back when I used to teach programming, I used to tell my students, "Bad software lives forever." There will never, ever be enough time or money to go back and do it right once that crappy initial version gets into production.


It was only a few years ago that I was still finding bad fixes for Y2K bugs. I wanted to kill someone. You just replaced the Y2K bug with a Y2.1K bug, you motherfarker. You know there's a solution to this problem that will work no matter what year it is, don't you?

"Oh come on, our software isn't going to be running 100 years from now."
"Wanna bet? You see how this company treats IT"
"Okay, but I won't be alive then"
"But your children will be maintaining your code! Think of the children!"
 
2012-12-10 01:54:13 PM  

Happy Hours: I guess the line numbering in BASIC really farks with copy/paste


Not all versions of BASIC use line numbers. Microsoft BASIC for the Amiga used labels like you use in Assembly, so you'd have "goto here:" and "gosub there:" commands sprinkled in your code.
 
2012-12-10 02:02:24 PM  

farkeruk: Sure, it means you are basically stuck on Microsoft kit, but for me, that's a small price to pay for all the goodies I get.


As long as you don't call Windows-specific DLLs, you can supposedly get a lot of C♯/.NET programs to work with Mono under BSD and Linux.
 
2012-12-10 02:25:47 PM  
I'm on my last 2 weeks of Intro to Java for my pre-reqs, before I move into my first year of CS. I finished Intro to C++ last semester. Needless to say, the comments have definitely been eye-opening. Thanks, guys.
 
2012-12-10 03:49:30 PM  

un4gvn666: I'm on my last 2 weeks of Intro to Java for my pre-reqs, before I move into my first year of CS. I finished Intro to C++ last semester. Needless to say, the comments have definitely been eye-opening. Thanks, guys.


Keep one thing in mind as you roll down the CS path. If your school's anything like my alma mater, the CS guys will look down on the Management Information Systems guys in the College of Business, for a variety of reasons.

Just remember, once you graduate, if you want the latest and greatest toys, there's something to be said for having someone around who's a geek, but who can also make the business case for the latest and greatest tech. Someone who can interpret code, and also speak MBA.
 
2012-12-10 04:07:14 PM  
True story....

A department had a series of 'weekly reports' they'd generate and send up to the senior staff. It started off as a small Excel file, but over the years got bigger and bigger. Eventually it took two people most of every Monday to generate this Excel file. They did everything by hand and they always seemed to screw up something (that always went unnoticed).

They had an intern come in who had a vague idea of what VBA was. The kid sat down with them and wrote some really, really ugly VBA macros that did the same thing. He worked on it for the better part of a month. His script took 15 minutes to run and took over the keyboard, but in 15 minutes the report was done, and it was always done correctly.

Fast forward 8 months. They wanted a change made. The intern was gone so they sent in an official development task (like I said, big company so everything needed to be a formal request to be approved and prioritized). The internal development team cited 10 standards that the VBA script didn't comply with. The code was copy/pasted/hacked together. That wasn't good enough. They were going to design a PROPER system. Lord knows, it's not like anyone has ever sat down and decided to make a 'reporting system' before.

They spent months, many months, designing this system. It was going to be the end-all, be-all of reporting (again, what an original idea!). I might be wrong on the details, but I think it was an SQL database they put together to populate the data, then an application (I think in VB) that would parse the data feed into the database, then they created a task in the scheduling system that would run the import job and alert them if it failed, then they wrote stored procedures to perform the calculations the Excel file used to do, then they wrote pages of Crystal Reports that would pull the data from the database.

Naturally, they made it so the reports looked nothing like the original report, and the higher-ups didn't like it. And bugs and mistakes were made along the way, and releases were pushed out and sent to the QA department and all that jazz. Good times.

Eventually, they had it all working, including the small change the intern could have done in 4 hours. It took them over a year and endless man-hours. But it was done 'right' by real 'Software Engineers'. And life was good. The new system had some problems: before, if the CSV file showed up late, they'd just start the report late. Now they had to get someone with access to the job scheduling system to kick off the import job after its initial attempt had timed out. And it was fairly common for there to be bad data in the data file; instead of '71.81' they might get a value of '7181'. Before, they'd just fix it. But now they couldn't. If they knew SQL (and none of them did) they could have updated it in the database, but 'officially' they needed to submit a trouble ticket to the DBA team, who could update it on their behalf.....but that could easily take 4-5 hours before someone would look at it.

But okay, still, it was a working system.

Then, one day, for reasons unknown to me, the senior VP asked for the .xls file. I guess he'd done it years ago and wanted to see the original Excel file and do some work with the numbers. Nobody had mentioned that the new system wouldn't generate a finished, formatted Excel file anymore. And that's what he wanted.

And even though it had been months since he could do that, once he knew he couldn't, it was a high-priority issue. I kid you not, the development team said, 'Oh yeah, we can do that' and started working on a batch job that would run after the import job, pull data from the SQL database, and use some Excel interop library to generate an Excel file.

I've never tried to create an Excel file from inside VB, but I guess spitting out data was easy enough. Even though they'd calculated all the values inside the stored procedures, the VP wanted to see the formulas in the Excel file - so they actually re-created all of the logic in the Excel file! Again! Only now it was in a VB program that created the Excel file.

What wasn't easy enough was getting all the cells colored the correct way and alignments and resizing and all the crap the ladies used to do by hand each Monday morning. I guess they got 'most' of it; but couldn't do all of it and ended up using a VBA macro to do it. Seriously. So, in the end.....

They had a task scheduling system that would kick off the data import VB application that would feed the data into an SQL database that the Crystal Reports client would use to generate the report from the SQL stored procedures, which would be compared against the Excel file that the Excel file creation tool would generate. Then, when the numbers matched (and they always matched, since they were generated the same way), someone would run the VBA formatting macro on the Excel file so they could include the Excel file along with the nearly identical-looking Crystal Reports-generated PDF (I think it was PDF).

That was a very depressing place to work. But hey, at least we didn't have any cut-and-paste code cowboys doing simple things effectively.
 
2012-12-10 04:08:26 PM  
tl;dr - people suck
 
2012-12-10 04:10:39 PM  
I guess I am a little late to the party, but dear god no, that is horrible advice. Most people's programming would be infinitely better if they simply disallowed copy and paste. If you ever find yourself copying and pasting code (either yours or someone else's), stop and rethink the situation. Maybe you should put the code into a function to avoid duplication. Maybe create a library, if it is someone else's code. Consider abstraction or inlining, or any number of other constructs before blindly adding copypasta to code.

More often than not, copying and pasting leads to the propagation of bad or erroneous code. Or using code that is not fully understood.
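A sketch of the refactor being recommended, in Python with made-up function names: the pasted snippet becomes one shared function, so a later fix lands everywhere at once instead of silently missing a copy.

```python
# Before: the same cleanup pasted into two call sites, so a bug fix
# applied to one copy silently misses the other.
def load_billing_name(raw: str) -> str:
    return raw.strip().lower().replace("  ", " ")

def load_shipping_name(raw: str) -> str:
    return raw.strip().lower().replace("  ", " ")

# After: the duplicated snippet extracted into a single function.
def normalize_name(raw: str) -> str:
    """Trim, lowercase, and collapse doubled spaces -- in one place."""
    return raw.strip().lower().replace("  ", " ")

def load_billing_name_v2(raw: str) -> str:
    return normalize_name(raw)

def load_shipping_name_v2(raw: str) -> str:
    return normalize_name(raw)

print(load_billing_name_v2("  Ada  Lovelace "))  # ada lovelace
```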
 
2012-12-10 04:12:52 PM  

EngineerAtWork: I guess I am a little late to the party, but dear god no, that is horrible advice. Most people's programming would be infinitely better if they simply disallowed copy and paste. If you ever find yourself copying and pasting code (either yours or someone else's), stop and rethink the situation. Maybe you should put the code into a function to avoid duplication. Maybe create a library, if it is someone else's code. Consider abstraction or inlining, or any number of other constructs before blindly adding copypasta to code.

More often than not, copying and pasting leads to the propagation of bad or erroneous code. Or using code that is not fully understood.


what if I'm copying and pasting my function?
 
2012-12-10 04:16:35 PM  
Cybernetic
I've known some programmers who are VERY good at SQL, and have a really deep understanding of things like query optimization. When you need someone like that (usually because your database performance is crap and is bogging down the entire system), they're worth their weight in gold.


Ah, for me that's not "good at SQL", that's "good at databases" :-P

/well, ok, depending on the bottleneck, it's very likely both
 
2012-12-10 04:30:13 PM  

treesloth: DeaH: As a person who writes those requirements...

Well, then I gotta ask: why can't the customers just take the specifications directly to the software people, huh?


I use the requirements to write your specs.
 
2012-12-10 04:35:01 PM  

redmid17: what if I'm copying and pasting my function?


The entire function, or the function call? If you are copying the entire function, I would stop and ask why. Maybe that should be encapsulated within another method.

If it is just the function call, I would still type it out. That way you can catch any errors (and avoid propagating errors) in the parameter list. If you are copying it because only one parameter is changing, why not determine the parameter ahead of time and call the function? Either way, copying and pasting should be a warning flag. Granted, there may be times when it is appropriate. But you should always stop, take a breath, and ask yourself if there isn't a more efficient, less error-prone way.

Taking a few extra minutes to code can save a few hours of debugging.
 