1) Stuff I'd learn by reading Hacker News, after a few months
2) Stuff I'd learn on the job, after a few months
I'm not downplaying how important this is to know, but it's like converting a set of free blog posts into a $5000 university course.
Is that an efficient use of student time or money? Probably not. They're better off getting a summer internship and getting paid to learn agile (and learning it better! working on an agile team will teach a lot more than listening to a lecture), or reading the anecdotes while procrastinating on real homework and posting in forums like this one.
It's okay that courses at a university do little to prepare people for some realities. Courses are designed to teach big-O, differential equations, database architectures, the math of machine learning, and similar sorts of things with enough theoretical depth that you CAN'T just pick them up elsewhere.
If a software engineer enters industry without knowing linear algebra, odds are they'll never be a data engineer on a serious ML project, let alone work on serious ML themselves.
TL;DR: The content looks useful, but it's a counterproductive use of tuition dollars and student time.
> If a software engineer enters industry without knowing linear algebra, odds are they'll never be a data engineer on a serious ML project, let alone work on serious ML themselves.
Linear algebra is a lot to learn[1], but I think it would be possible to pick up the material on the job, given enough time dedicated to it (like a month, not a week).
Similarly for pre-calculus (high school math) and calculus.
I guess it does add up, though, and if you're missing 5-6 of these fundamental building blocks and have to fit a year's worth of math learning in between work, meetings, and other life obligations, it can be tough.
Linear algebra is a few semesters, not a month. There is deep intuition behind the concepts, and it takes a lot of time to pick up. The core problem is that there are no short-term gains: dabbling in linear algebra is not a skillset that improves your employability or broadens the projects you can work on.
Being able to develop TensorFlow, optimize 3D rendering for NVIDIA, process images for Adobe, or design a control system for a robot IS an employable skillset, and it makes for much more interesting work. Picking that up is why I'd send my kid to a school like CMU.
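To give a hedged sense of what that deep intuition buys you (my own toy example, not anything from the course being discussed): once you internalize that repeatedly applying a matrix pulls almost any vector toward its dominant eigenvector, techniques like PageRank or PCA stop looking like magic. A minimal NumPy sketch of that fact:

    import numpy as np

    # Toy example: power iteration converges to the dominant eigenvector.
    rng = np.random.default_rng(0)
    A = rng.standard_normal((4, 4))
    A = A @ A.T  # symmetrize so the eigenvalues are real and non-negative

    v = rng.standard_normal(4)
    for _ in range(100):
        v = A @ v                 # apply the matrix...
        v /= np.linalg.norm(v)    # ...and renormalize

    eigvals, eigvecs = np.linalg.eigh(A)
    dominant = eigvecs[:, np.argmax(eigvals)]
    print(abs(v @ dominant))      # ~1.0: v points along the true eigenvector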
Similarly for precalc and calc. It's not hard to know what a derivative and an integral are. The mathematical depth is in being able to do something interesting with them.
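To make that concrete with a standard example (my choice, not anything specific from the course): the definition of a derivative is one line,

    f'(x) = \lim_{h \to 0} \frac{f(x+h) - f(x)}{h}

but doing something interesting with it, e.g. minimizing least squares, takes real fluency:

    f(x) = \tfrac{1}{2} \lVert A x - b \rVert^2
    \nabla f(x) = A^{\top} (A x - b)
    \nabla f(x) = 0 \;\Rightarrow\; A^{\top} A x = A^{\top} b

Getting from the first line to the normal equations needs the calculus and the linear algebra at the same time, which is exactly the kind of consolidation a definition alone doesn't give you.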
If college gets you that background, you can pick up agile and test infrastructure on the job.
> It's not hard to know what a derivative and an integral are. The mathematical depth is in being able to do something interesting with them.
That's a very good point. In some sense, learning the material at level N is only done when you use and consolidate that knowledge at the subsequent level N+1, and maybe even N+2.
Indeed, it would be fair to say that a student passing the final exam on topic X, even with a good grade, only truly understands 30-40% of the material. Only when the student has to apply this knowledge in later courses is the understanding complete.
Perhaps the best strategy for the independent learner is to learn topic X and immediately follow up with applications of X. That's what people recommend anyway, but I always thought of it as a suggestion, a nice-to-have; in light of your comment, I'm thinking maybe it's a requirement.
University is important for foundations. You can learn MySQL on the job; you can learn relational databases at university. The reverse does not work well: MySQL at university is superficial, since you never experience it in a realistic production scenario, and relational database foundations have no immediate use on the job, so why bother with them instead of delivering customer value?
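A toy sketch of that split (my own example; it uses Python's built-in sqlite3 rather than MySQL purely to stay self-contained): the part that transfers between jobs is the relational model itself, keys, normalization, joins, not any one vendor's syntax or administration quirks.

    import sqlite3

    # Invented schema for illustration. The foundation on display is
    # normalization (orders reference customers by key instead of
    # duplicating names) and the join that recombines them.
    con = sqlite3.connect(":memory:")
    con.executescript("""
        CREATE TABLE customers (id INTEGER PRIMARY KEY, name TEXT);
        CREATE TABLE orders (
            id INTEGER PRIMARY KEY,
            customer_id INTEGER REFERENCES customers(id),
            total REAL
        );
        INSERT INTO customers VALUES (1, 'Ada'), (2, 'Grace');
        INSERT INTO orders VALUES (10, 1, 9.99), (11, 1, 4.50), (12, 2, 20.00);
    """)

    # The same query works essentially verbatim in MySQL, Postgres, or any
    # other relational engine; only the surrounding tooling changes.
    for name, total in con.execute("""
        SELECT c.name, SUM(o.total)
        FROM customers c JOIN orders o ON o.customer_id = c.id
        GROUP BY c.id
    """):
        print(name, total)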