2012/11/25

Radical Change in Higher Education - CS for Everyone!

I was pointed to a blog post titled Radical Change in Higher Education: Money, Sports and Computers. The author expresses distrust of Massive Open Online Courses (MOOCs) like Udacity, then goes on to present her ideas for deeper, more radical change.

There are things I kinda agree with but don't foresee happening, such as ending college sports as they exist now and creating minor leagues in their stead. I never really thought it made sense for institutions built on intellect to associate so tightly with teams built on physical prowess. The minor leagues of baseball exist because baseball is a summer sport and college students are generally off over the summer, so collegiate baseball can't attract the same audiences. Summer break exists because summer was when people were needed back home on the farm, and with so few people now involved in agriculture, that's no longer a design requirement. So the fundamental change I would make, trimesters, would tend to make collegiate baseball more viable and minor league baseball less viable. Ah well.

Unless I misunderstood something, I believe that my school's sports program is self-supporting. I think that if a university's program is not self-supporting, or if there isn't something else the school gets from it that justifies it (and for the life of me, I can't think of anything, but I've never been a team sports guy), the school would be better off without it.

I see the point of "replacing" collegiate sports with fitness and wellness programs, but honestly, every personal improvement in wellness I have ever experienced has come from working on my own, not with a group. I think the group dynamic messes it up, but that might just be me.

The last point, the one I'll quote, is one I both strongly agree and strongly disagree with.
Computer Science: CS should be required. For everyone.  Can you be a historian today without using a computer?  An artist?  A salesperson?  Anything?  Shouldn’t we aspire to turn out a new generation of educated men and women who have more than a surface knowledge of how the blasted things work, since their success in no small part will depend on that knowledge?
I hold a CS degree. I work with computers during the day and play with them at night. My day work involves the use of computers in science, and I've been saying this for years: today, all science is computer science, because the research that could've been done without computers has already been done. I think the same is becoming true in other fields. Between Processing, Tod Machover's work, and work with genetic algorithms in composition, there's precedent for the use of computational tools in the arts. I think you can still be a historian without using a computer for much more than email and word processing, but I've heard of historians making more interesting use of it. First, there's the wider dissemination of contemporaneous source material, but beyond that, many are beginning to see digitized libraries as a Big Data source, where you can graph the rise and fall of ideas and people by the number of times they occur in the text.
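
To make that last idea a little more concrete, here's a minimal sketch of the kind of counting involved. The tiny corpus, the term, and the mentions_per_year helper are all made up for illustration; a real project would read from a digitized library rather than a hard-coded dictionary.

```python
# A rough sketch of graphing the "rise and fall" of an idea: count how often
# a term shows up in dated, digitized texts. The corpus and term are invented
# for illustration.
import re

corpus = {
    1851: "the telegraph changed everything and the telegraph was new",
    1901: "the telephone and the telegraph now share the same poles",
    1951: "television arrives while the telegraph quietly fades away",
}

def mentions_per_year(texts, term):
    """Return {year: number of whole-word occurrences of term}."""
    pattern = re.compile(r"\b" + re.escape(term) + r"\b")
    return {year: len(pattern.findall(text.lower()))
            for year, text in texts.items()}

print(mentions_per_year(corpus, "telegraph"))
# {1851: 2, 1901: 1, 1951: 1}
```

Swap in real scanned texts and plot the result, and you have the crudest possible version of what those historians are doing.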

I'd throw in the idea that this goes down to the level of skilled labor. Adam Davidson writes in the New York Times Magazine about the "Skills Gap", saying that machinist training starts with using tools to cut metal but quickly moves on to computer-aided methods.

So, yes, I agree wholeheartedly that there's value in a great many fields in embracing the computer revolution. I'm all for teaching programming to everyone. I'm just not sure that Computer Science is really where you want that to happen.

Computer Science is different. Computer Science, properly considered, is a branch of mathematics built around complexity. Yes, students go into Computer Science as a step toward becoming programmers, but that means there's a great deal of knowledge they gain and never use, and a great deal of knowledge they don't gain until they get on the job and find they need it. I still feel undertrained in the use of version control, for example. Those students would be better served by a program built around Software Engineering. That term is problematic too: there is no mechanism for becoming a licensed software engineer, even though a license is required to call yourself an engineer in other fields, and many of the greatest advances in computing come from people with the most tenuous claim to the title. Linus Torvalds was a college student when he started developing Linux.

Consider the case of databases. There is one set of skills that users (be they artists, historians, scientists, programmers, machinists...) use to pull out what they need, another set of skills that programmers use to design the tables and data structures that others will use, and another set of skills that programmers use to build the database engines themselves. The first set is something I would wholeheartedly encourage everyone to learn. The second set is a little less useful unless you're stepping up to start collecting your own data; I'm learning it a little at a time, and finding holes in my knowledge every time I create a set of tables. The third set, the skills that people at Oracle, on Microsoft's SQL Server team, or involved in PostgreSQL develop, I don't know enough about to really describe, but it's largely about making these things fast down at the hardware level, so several thousand transactions a second can go through under load.
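
To make that three-way split concrete, here's a toy sketch using Python's built-in sqlite3 module; the observations table and its rows are invented for the example. The CREATE TABLE half is the second skill set, the SELECT at the end is the first, and the third set is everything SQLite itself does under the hood to make queries like this come back quickly.

```python
# Toy illustration of the first two database skill sets, using Python's
# built-in sqlite3 module. The table and rows are invented; the third skill
# set (the engine itself) is what SQLite quietly does for us here.
import sqlite3

conn = sqlite3.connect(":memory:")

# Second skill set: designing the tables that will hold the data.
conn.execute("""
    CREATE TABLE observations (
        id       INTEGER PRIMARY KEY,
        site     TEXT NOT NULL,
        recorded TEXT NOT NULL,   -- ISO date
        value    REAL NOT NULL
    )
""")
conn.executemany(
    "INSERT INTO observations (site, recorded, value) VALUES (?, ?, ?)",
    [("north", "2012-11-01", 3.0),
     ("north", "2012-11-02", 4.0),
     ("south", "2012-11-01", 1.5)],
)

# First skill set: asking the data a question, whoever you happen to be.
for site, average in conn.execute(
    "SELECT site, AVG(value) FROM observations GROUP BY site ORDER BY site"
):
    print(site, average)
# north 3.5
# south 1.5
```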

Thing is, while the last category is closest, none of it is really Computer Science. I think forcing this association between computer use and Computer Science doesn't help any party involved.

5 comments:

  1. Only about 10% of college athletic departments are self-supporting.

  2. Saying that CS should be required is like saying that a degree in automotive mechanics should be required because everyone needs to drive a car to do their job.

    Better computer literacy is a start, but for what value of better?

  3. Let the record show I've never been a supporter of college athletics. First time through school, I went to one football game and one basketball game. Second time, I was dragged by my wife to one midnight practice, and I thought that was the dumbest thing.

  4. I think that "computer user" can and should sit at a slightly higher level than "browser user". It shouldn't have to get too far into the level of analyzing core dumps and Big O notation.

    (I think, as unique distinguishers go, "Computer Science" == "Algorithm Analysis", and everything else in the CS curriculum is to give the student tools to get there. Which is a wide distraction from the point, I think.)

    I think that tools that make it easier to "mash up" web data into useful things are a first priority. For example, connecting map directions and the Registry of Historic Places so I can plan side-trips around a trip to someplace. There are things like Field Trip (https://play.google.com/store/apps/details?id=com.nianticproject.scout&hl=en), but that's a Google 20% project. Someone who isn't a bright Google employee should be able to put together A, B and C and find every diner with boysenberry pie and every presidential birthplace within 40 miles of a straight run from Lafayette, IN to Newport, RI. Or other geographical interests and cultural priorities.

    That's more or less Tim Berners-Lee's Semantic Web, except he takes it to the stage where our things make those associations for us. I think there have to be a few generations -- "Geeks do these things the hard way", "Tools to do these things are made available to non-geeks", "Non-geeks make use of said tools", "Tools are made invisible with smart tech so non-geeks don't even have to think about it" -- before it's a real thing, and I think we're very early in that process.

  5. All that being said, I think the process is moving forward.

    For example, consider WWII movies since the rise of the web. From 1946 to 1996, very few would've understood the role of cryptography in WWII, and I think the only mention in film is Hal Holbrook's character in Midway. Now you have U-571, Windtalkers, Enigma (which all should see) and All The Queen's Men (which I will not see) and others dealing with the subject, all made possible because encryption in web browsers made the topic more understandable to the public at large.

    Not that many people use encryption yet....
