Fine, go on: explain lvalue and rvalue semantics to me. Go. You will very quickly be confronted with too many advanced topics. Even the STL requires a wide range of knowledge. It makes no sense at all to learn that as a first programming language. Thinking about how you structure your data, what your data actually is, and so on is very important. If you just start with some arbitrary language without that, you will in the best case turn into a stupid CRUD monkey. That is what most programmers are. I myself have wasted too much time like that, and I regret it very much.
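To be concrete about what "advanced topics" means here: even explaining which function overload gets picked already drags in value categories. A minimal C++ sketch (my own illustration, nothing more):

```cpp
#include <string>
#include <utility>

// Two overloads: one binds to lvalues, the other to rvalues.
std::string classify(const std::string&) { return "lvalue"; }
std::string classify(std::string&&)      { return "rvalue"; }
```

A named variable selects the lvalue overload; a temporary or the result of `std::move` selects the rvalue one — and that is before you even touch move constructors, xvalues, or reference collapsing.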
I'd say that you should learn some language first so you can pick up the basics. Then you can go deep into whatever language you like. It's like trying to learn Hammer: you need to know how to place brushes before you can do lighting.
I tried learning Scheme, but I didn't want to bother trying to find an interpreter that could easily run on Windows that doesn't use Emacs. Still, I think learning something like C/C++ is important. I feel like something like Scheme teaches you how to program in a theoretical sense, teaching about algorithmic complexity and the mathematics of programming, while C/C++ really helps to teach how a computer actually works and how to program in a more practical sense. Pointers, arrays, memory management and so forth. If I could find a simple Scheme interpreter, then I'd learn Scheme on my own while learning C++ in my university classes.
This is what we used at UW: http://racket-lang.org/ With the DrRacket editor: http://docs.racket-lang.org/drracket/index.html
I took a class in Java my junior year of high school. However, my teacher made it clear that it would be much easier to learn Java if you had learned C++ first.
Then realize that you didn't need to learn pointers? Go do assembly if you want to learn about memory management. C# and Java are mostly identical, but pick Java if you are looking for a programming job. If you want to enjoy yourself while programming, try Ruby.
A lot of interview questions are data structures and string manipulation that you need to know pointers to solve.
I was assuming the job involved Java. A Java programming test involving pointers would be retarded. If it is C#, you'll sometimes run into some C/C++ functionality.
You're a hopeless case. I learned Java when I was ~14. C# 1.0 is only marginally different from Java. C# 2.0 adds generics. C# 3.0 adds monads (LINQ). 4.0 or 5.0 adds async. Your whole "dilemma" shows how fucking lazy you are. Your comment about Haskell (it would be the same about OCaml, Scala, Lisp and many more) shows how little you know. A normal programmer who has never even heard of C# can switch from Java to C# within two weeks. Switching to a whole other paradigm takes a lot longer, and it's really funny to see you downplaying it just because it would take real effort to learn. BTW, the most popular and modern variant of Lisp is called JavaScript.
Yes, real Java programmers never use JNI. In the mythical and isolated Java land. The reality is that questions about pointers require abstract, precise thinking. I can ask a really short question about a programming problem that requires the use of pointers, and you'll have to work quite a bit to solve it properly.
In Haskell you have real lambdas: {code} player_join = \sender e -> putStrLn "Hello" {code} OO doesn't exist there, but OO is a disease in itself. HL2 uses interfaces like IClientRenderable and so on instead of pure braindead OO. C++ is also a disease. Its original sin is being designed by Stroustrup in a retarded way. Then lots of shit got added to it, and lots of shit keeps getting added to it regularly. There are also lots of autistic-like retards who preach "true C++" (because the language allows you to write the same thing in so many different ways). They grow up when they have to deal with real software, but that's also when they shut the fuck up. Last of all, event-driven code is the new goto. If you aren't (I don't care about feelings) familiar with at least one language, then you don't land the job. If you aren't familiar enough with data structures not to write braindead code, then you get thrown out one way or the other. My interview question was quite simple: write a function in C that reverses the order of letters in a sentence (without changing the order of the words). I wouldn't trust anyone who can't do that given 30 minutes and a whiteboard.
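For reference, one way to solve it — a sketch in C++ (reading the question as: reverse the letters within each word, in place, so the word order stays put):

```cpp
#include <algorithm>
#include <cctype>
#include <string>

// Reverse the letters within each word of `s`, leaving word order
// (and the spacing between words) unchanged.
void reverse_letters(std::string& s) {
    std::size_t i = 0;
    while (i < s.size()) {
        // Skip any whitespace before the next word.
        while (i < s.size() && std::isspace(static_cast<unsigned char>(s[i]))) ++i;
        // Find one past the last character of the word.
        std::size_t j = i;
        while (j < s.size() && !std::isspace(static_cast<unsigned char>(s[j]))) ++j;
        // Reverse that word in place.
        std::reverse(s.begin() + i, s.begin() + j);
        i = j;
    }
}
```

Called on "hello world" this yields "olleh dlrow".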
Fuck JIRA and its story points. Also, I can't answer the C question, but then again I'm not a C programmer, so that's no surprise. That doesn't mean I'm incompetent, though; it just means I know other languages. So many replies in this thread are about programming 'enlightenment', saying that you need to know pointers and memory management. I don't see the problem in picking any language that can handle the application requirements, as long as it does it well (Ruby/Clojure/Go/Dart/JS/ASP.NET/Python for web; Java/C#/C/C++/VB.NET/Rust for desktop applications; Python/Lua for short tasks; all them fancy maths languages for genetic algorithms and machine learning). People seem to defend their languages a lot, and after 5 years of intensively following whatever comes out of San Francisco I've come to the conclusion that most people on the internet are vocal idiot minorities (mootant excepted). Be creative and try to make something; learn by doing, not by picking a language and studying it. If you're applying for a web development job, then who the fuck cares about pointers. If you're making a game engine, then OK, you might want to learn them.
Well, this thread was originally about what you should do to prepare for a CS degree, not what is needed to get a job as a coder.
It was meant to be written in C because I was interviewing for a C++ position, but you could write it in Java/C#/C++ or whatever. Instead of a pointer meaning int *ptr or void *ptr, I was simply using int indices as pointers. After all, C/C++ pointers are just indices that are multiplied by the size of one element and counted as an offset from the start of whatever.
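That index/pointer equivalence can be checked directly. A small sketch (the helper name `at_via_offset` is my own, not from the thread) that fetches an array element by computing base + index * element size in raw bytes:

```cpp
#include <cstddef>

// Fetch arr[i] by hand: take the base address as bytes, move
// i * sizeof(int) bytes forward, and read an int from there.
// This is exactly the "index times element size, offset from
// the start" view of a pointer.
int at_via_offset(int* arr, std::size_t i) {
    char* base = reinterpret_cast<char*>(arr);
    return *reinterpret_cast<int*>(base + i * sizeof(int));
}
```

`arr + i`, `&arr[i]`, and the byte computation above all name the same address; the compiler just does the scaling for you.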
That I want to learn one at a time? Sorry, but I'm not the type of person to learn two languages at once. However the results come out, I'll learn the runner-up language afterwards, so in this case it seems C# first, Java second. The age at which you learned something is irrelevant.
FALSE. The human brain is analogous to a city: sections built many years ago tend to become old and condemned, and are then torn down or turn into ghettos if they are not used and maintained. The same can be said of your memory. Then there is the time period in which things were learned. Things learned in school will almost certainly be forgotten quickly if they were not interesting to you (although that may be true of anything learned, really).
Yes yes, I understand how synapses work. Every time a synapse is stimulated it is reinforced, synapses which are not stimulated end up decomposing or whatever, and around age 12 your frontal lobe undergoes a stage of development that lasts until you are 18, which is when you become able to make proper decisions. What I meant by "the age is irrelevant" is that I don't give a shit that he learned Java at 14; that is, whatever he has to say to me about that is irrelevant to me.