what's the big deal with parallelism?

jason87x

I was in a computer architecture class last semester, and the last chapter was about all this multiprocessor stuff. It's quite confusing and I didn't really learn it that well.

What's the big deal about it? Do multiple cores really provide the speedup they promise? Is task level parallelism a good idea, or do the separate processes need to like communicate with each other a lot?

In that textbook I also read the section about GPUs. It was mostly nvidia buzzwords and didn't explain anything very well. Sounds like a lot of proprietary voodoo that only the top maybe five computer architecture people in the world really understand. Sounds like nvidia is also way ahead of any other cpu/gpu maker, including intel amd or even the government maybe lol if they do that sort of thing.

And how does vector processing play into things? How many applications really rely on vector processing? I know a lot of this is geared at graphics and sometimes sound, but does it have much benefit for normal processing? Will it change how programming is done like in a serious way?

I guess this is kind of like the change from learning calculus in a scalar way (Calc 1 and early Calc 2) to transitioning to a vector way of thinking (vector calculus). The vector version still feels more complicated to me: I understand calculus from a scalar perspective, but I've forgotten a lot of those vector calc formulas and how they relate back to the scalar world. I wish the way math is taught were more generalized to account for vectors and matrices (arbitrary R^n).
 
Do multiple cores really provide the speedup they promise?

Yes, assuming the programs are actually written to take advantage of multiple cores. The catch is that the speedup is limited by whatever fraction of the work has to stay serial.
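To put a rough number on that: the textbook rule of thumb is Amdahl's law. If p is the fraction of the work that can run in parallel and n is the number of cores, then

Speedup(n) = 1 / ((1 - p) + p/n)

So a program that's 90% parallel gets about 1 / (0.1 + 0.9/4) ≈ 3.1x on 4 cores rather than 4x, and that serial 10% caps it at 10x no matter how many cores you throw at it.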

Is task level parallelism a good idea, or do the separate processes need to like communicate with each other a lot?

It really depends on how things are programmed. Task-level parallelism works well when the tasks are mostly independent; if they constantly have to synchronize or pass data back and forth, the communication overhead can eat most of the gain.
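As a toy example of the "mostly independent" case (my own sketch in plain Java, not from any particular textbook): each task crunches its own chunk of work, and the only coordination is collecting the results at the end.

import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.Callable;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;

public class IndependentTasks {
    public static void main(String[] args) throws Exception {
        int cores = Runtime.getRuntime().availableProcessors();
        ExecutorService pool = Executors.newFixedThreadPool(cores);

        // Four tasks that share nothing and never talk to each other while running.
        List<Callable<Long>> tasks = new ArrayList<Callable<Long>>();
        for (int i = 0; i < 4; i++) {
            final int id = i;
            tasks.add(new Callable<Long>() {
                public Long call() {
                    long sum = 0;
                    for (long j = 0; j < 50000000L; j++) {
                        sum += j ^ id;   // stand-in for real, independent work
                    }
                    return sum;
                }
            });
        }

        // The only coordination is collecting the answers at the end.
        for (Future<Long> result : pool.invokeAll(tasks)) {
            System.out.println(result.get());
        }
        pool.shutdown();
    }
}

Tasks like these scale almost linearly with cores; the moment they start locking shared data structures between themselves, that stops being true.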

In that textbook I also read the section about GPUs. It was mostly nvidia buzzwords and didn't explain anything very well. Sounds like a lot of proprietary voodoo that only the top maybe five computer architecture people in the world really understand.

Care to elaborate? I can't really help based on that description.

Sounds like nvidia is also way ahead of any other cpu/gpu maker, including intel amd or even the government maybe lol if they do that sort of thing.

Not really. AMD/ATI and nVIDIA are pretty evenly matched, with the performance crown frequently passing back and forth. Intel's integrated graphics are a fail, though.

And how does vector processing play into things? How many applications really rely on vector processing? I know a lot of this is geared at graphics and sometimes sound, but does it have much benefit for normal processing?

Vector processing tends to be used for more specialized tasks, graphics rendering being the obvious one, along with things like audio/video work and scientific number crunching; most everyday code doesn't lean on it directly.
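To put it slightly less vaguely: "vector processing" just means one instruction operating on a whole lane of values at once (SSE/AVX on x86, NEON on ARM). The classic candidate is a loop like the one below; whether it actually gets compiled down to SIMD instructions depends on the compiler or JIT and the hardware, so treat this as a sketch of the idea rather than a promise.

// Element-wise scale: the same operation applied to every element, with no
// dependency between iterations - exactly the shape vector hardware likes.
// A vectorising compiler (or HotSpot's JIT) can then handle several floats
// per instruction instead of one at a time.
static void scale(float[] dst, float[] src, float k) {
    for (int i = 0; i < src.length; i++) {
        dst[i] = src[i] * k;
    }
}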

Will it change how programming is done like in a serious way?

Not really. I don't think there will be a time (at least in the near future) when most programs take advantage of it.

I guess this is kind of like the change from learning calculus in a scalar way (Calc 1 and early Calc 2) to transitioning to a vector way of thinking (vector calculus). The vector version still feels more complicated to me: I understand calculus from a scalar perspective, but I've forgotten a lot of those vector calc formulas and how they relate back to the scalar world. I wish the way math is taught were more generalized to account for vectors and matrices (arbitrary R^n).

Can't really weigh in here. I'm still in pre-calc.
 
Agree with pretty much all that's said above - just one thing I'd offer a slightly different perspective on:

Will it change how programming is done like in a serious way?
It may well do. People now (including me) are seriously looking at writing their programs so they work concurrently, taking advantage of multiple cores - and after a lot of research no-one's found a way to say "here you are compiler, take this single-threaded program and safely optimise it for x cores." The prevailing view at the moment is that, in general, it just can't be done.

If this is the case then people do need to change how they program. There are languages where people have to code completely differently - it's a steep learning curve, but you get the advantage that these languages really do use every core they can get as well as they possibly can. Check out occam-pi for instance: it works by composing a load of parallel processes and having them all communicate over channels. It's surprisingly intuitive once you get your head around the concepts, but it still feels like a very old, clunky language to me.
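If you want the flavour of that without learning occam-pi, here's a very rough Java analogue I knocked together (my own sketch, not occam and not the JCSP library): a BlockingQueue stands in for a channel, and the two "processes" share nothing except that channel.

import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.BlockingQueue;

public class ChannelSketch {
    public static void main(String[] args) throws Exception {
        // The "channel": the producer and consumer share nothing else.
        final BlockingQueue<Integer> channel = new ArrayBlockingQueue<Integer>(16);

        Thread producer = new Thread(new Runnable() {
            public void run() {
                try {
                    for (int i = 0; i < 10; i++) {
                        channel.put(i);      // blocks if the channel is full
                    }
                    channel.put(-1);         // crude end-of-stream marker
                } catch (InterruptedException e) {
                    Thread.currentThread().interrupt();
                }
            }
        });

        Thread consumer = new Thread(new Runnable() {
            public void run() {
                try {
                    int v;
                    while ((v = channel.take()) != -1) {   // blocks until a value arrives
                        System.out.println("got " + v);
                    }
                } catch (InterruptedException e) {
                    Thread.currentThread().interrupt();
                }
            }
        });

        producer.start();
        consumer.start();
        producer.join();
        consumer.join();
    }
}

In occam-pi the channels and parallel composition are built into the language rather than bolted on like this, which is exactly why it feels so different to write.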

The other possibility is that we see a shift to functional languages being the norm, or at least languages with more of a functional element to them (just like we saw a big shift from procedural languages being the norm to OO languages being the norm). See Scala and F#, which are essentially functional-flavoured counterparts to Java and C# respectively. (This is probably why Microsoft got in on the game with F#; they didn't want to be left behind!) Functional languages are inherently much better at concurrency because they avoid shared mutable state and lean on recursion instead of mutating loops (the purest functional languages like Haskell don't even have conventional loops!). They do however require a complete shift in mindset from OO (most OO programmers avoid recursion like the plague unless an obvious case requires it) - this may well happen, but a lot of people will get upset in the process!
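To make the mindset shift concrete, here's a toy sketch in Java (obviously the wrong language for it, but it shows the shape): the loop version works by repeatedly mutating an accumulator, while the recursive version never changes anything after it's created, which is exactly the property that makes purely functional code easy to run in parallel.

// Imperative style: works by repeatedly mutating a local accumulator.
static int sumLoop(int[] xs) {
    int total = 0;
    for (int x : xs) {
        total += x;
    }
    return total;
}

// Functional style: no mutation at all; the answer is built purely from the
// inputs. (Fine as an illustration, but Java has no tail-call optimisation,
// so don't actually do this on a huge array.)
static int sumRec(int[] xs, int i) {
    if (i == xs.length) {
        return 0;
    }
    return xs[i] + sumRec(xs, i + 1);
}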

Personally I believe that, should multi-core processors continue to grow in core count (sounds like a stupid point, but at this rate of growth it's going to be a while until we see 128 / 256 core home processors!), the above shift in paradigms will happen. Until then, though, the change we're most likely to see is the one happening at the moment: easier-to-use concurrency libraries being added to existing languages and used more often. Take Java for instance: Java 7 will bring a new fork/join framework allowing us to run very fine-grained concurrent tasks. That's being added to the java.util.concurrent package, which itself arrived in Java 5, and though I can't remember what they were I'm sure Java 6 had some additions as well. You have to code differently to use these, but it's generally not too hard to grasp while still offering the performance benefits you need.
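For the curious, the fork/join stuff looks roughly like this (a sketch against the Java 7 API, so don't hold me to the exact details): you keep splitting the work until the chunks are small enough to do directly, fork one half, compute the other yourself, then join.

import java.util.concurrent.ForkJoinPool;
import java.util.concurrent.RecursiveTask;

// Sums an array by recursively splitting it; the pool work-steals the
// sub-tasks across however many cores it has.
class SumTask extends RecursiveTask<Long> {
    private static final int THRESHOLD = 10000;
    private final long[] data;
    private final int from, to;

    SumTask(long[] data, int from, int to) {
        this.data = data; this.from = from; this.to = to;
    }

    protected Long compute() {
        if (to - from <= THRESHOLD) {
            long sum = 0;
            for (int i = from; i < to; i++) sum += data[i];
            return sum;
        }
        int mid = (from + to) / 2;
        SumTask left = new SumTask(data, from, mid);
        SumTask right = new SumTask(data, mid, to);
        left.fork();                        // run the left half asynchronously
        long rightResult = right.compute(); // do the right half ourselves
        return left.join() + rightResult;   // wait for the left half and combine
    }

    public static void main(String[] args) {
        long[] data = new long[1000000];
        for (int i = 0; i < data.length; i++) data[i] = i;
        long total = new ForkJoinPool().invoke(new SumTask(data, 0, data.length));
        System.out.println(total);
    }
}

The nice thing is you code the split/combine logic and the pool handles scheduling, so the same code scales from 2 cores to however many you have.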

That said, the reason a lot of applications are coded to use just one core is that they aren't power-hungry enough to warrant using the others. Take Quelea for instance (I released it yesterday, so it's in my head at the moment!). I wrote it from scratch in Java and there's not a single concurrency library in sight. Yet. But the worst it's doing is a bit of image IO (literally reading an image from a file), and the limiting factor there is likely to be disk speed rather than anything else. So quite frankly, what's the point of spending twice as long writing it to split it over 4 cores? In the future though I'm planning to have it support video; if that happens then yes, I'll be seriously looking at multithreading it.

However, an application I wrote last year for a company was a server-side beast that ran on a machine with 4 rather powerful cores and had to process huge amounts of text to extract useful information. In that case, yes, I did multithread it, and doing so reduced the running time dramatically.

So the answer? Yes, programming will change, and to a certain extent it already has. How it changes though is the big question - will people continue to use languages they know and love and just use concurrency libraries as best they can, or will they go out of their way to learn a new language better suited to the task and accept the potentially steep learning curve? As a side note, here's a tip for anyone looking to graduate in the next few years (but only once you've learnt another language to a decent level already!) - learn a functional language like F#, Erlang or Scala (it's the concepts that matter, so don't worry about which language you pick). At worst you've got another language people might want under your belt, and at best you're going to be in high demand in a few years when no-one else has the skills companies are after.
 