
  CMU removed OOP from its CS curriculum because it's "unsuitable for a modern CS curriculum."
.
Unregistered

. said:
Asok said:No, they don't. Ivory tower retards.



Exactly. There is academia, and then there is the real world with real needs.


The "real world" is subject to change. Education is not about training you in the details of the real world as it exists today. It's about enabling you to play a role in shaping the future - i.e., what today's real world will evolve into.

Today's real world is: a bunch of spreadsheets hacked together with VB6 at some small business; a massive COBOL-based internal application at some big bank; etc etc. Today's real world is the way it is not because that's the best way to do things. Eventually everything will change - some things more quickly than others.
.
Unregistered

shockeye said:Great move, can't wait to see Java killed even as a teaching language.



Java is a fantastic language.

Banking, servers, appliances, internet, you name it, run on Java.

Having said that, I could do all of that with Lisp in thousands upon thousands fewer lines of code, but there is absolutely no industry support for functional languages outside of the recent hoopla over Facebook's use of Erlang.

.
Unregistered

shockeye said:Great move, can't wait to see Java killed even as a teaching language.


OOP issues aside, Java is a shit teaching language because of all the overhead and folderol you have to wade through to get even basic things running. In teaching core concepts like algorithms and data structures, you want a language that is as close as possible to pseudo-code (or English), such as Python.
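
To make that concrete - a minimal sketch, my own example rather than anything from CMU's materials - even "read two numbers and print their sum" in Java demands a class, a static main, and a Scanner before any actual logic shows up:

    import java.util.Scanner;

    // Hypothetical beginner exercise: read two integers, print their sum.
    public class Sum {
        public static void main(String[] args) {
            Scanner in = new Scanner(System.in);  // ceremony just to read stdin
            int a = in.nextInt();
            int b = in.nextInt();
            System.out.println(a + b);
            in.close();
        }
    }

The Python equivalent is two or three lines, none of them ceremony.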
.
Unregistered

. said:
shockeye said:Great move, can't wait to see Java killed even as a teaching language.



Java is a fantastic language.

Banking, servers, appliances, internet, you name it, run on Java.

Having said that, I could do all of that with Lisp in thousands upon thousands fewer lines of code, but there is absolutely no industry support for functional languages outside of the recent hoopla over Facebook's use of Erlang.


More banking and big-industry apps run on COBOL.
.
Unregistered

The first and most important thing to understand about CS is that the machine is totally superfluous. Everything can be done with pen and paper. Trevs tend to think there's some kind of "magic" in the machine. That's why they think someday machines will "become intelligent" or that we could live inside the machine "as a simulation" (which is every bit as retarded as saying I could live on a piece of paper as a description). Genuinely understanding computation means having such silly notions knocked out of you. Dijkstra had it right: first-year CS students shouldn't be allowed to touch a computer.
.
Unregistered

. said:The "real world" is subject to change. Education is not about training you in the details of the real world as it exists today. It's about enabling you to play a role in shaping the future - i.e., what today's real world will evolve into.

Today's real world is: a bunch of spreadsheets hacked together with VB6 at some small business; a massive COBOL-based internal application at some big bank; etc etc. Today's real world is the way it is not because that's the best way to do things. Eventually everything will change - some things more quickly than others.



You're arguing against points that I never made. Good for you, though: "the world changes." Where did you glean that life-changing knowledge?


.
Unregistered

. said:More banking and big-industry apps run on COBOL.




So? This doesn't change anything that I said.
.
Unregistered

. said:The first and most important thing to understand about CS is that the machine is totally superfluous. Everything can be done with pen and paper. Trevs tend to think there's some kind of "magic" in the machine. That's why they think someday machines will "become intelligent" or that we could live inside the machine "as a simulation" (which is every bit as retarded as saying I could live on a piece of paper as a description). Genuinely understanding computation means having such silly notions knocked out of you. Dijkstra had it right: first-year CS students shouldn't be allowed to touch a computer.



:facepalm:

db
fool's gold prospector

2070 posts

Read the fine print, they're still teaching OOP.

First year: imperative programming (C), functional programming, and data structures and algorithms to knock their socks off

Second year:
Yes, now we'll take a moment to learn... what is it, "Jah-Vah", it's quite amusing
\
:snob:
.
Unregistered

db said:Read the fine print, they're still teaching OOP.

First year: imperative programming (C), functional programming, and data structures and algorithms to knock their socks off

Second year:
Yes, now we'll take a moment to learn... what is it, "Jah-Vah", it's quite amusing
\
:snob:


It's optional.
.
Unregistered

. said:The first and most important thing to understand about CS is that the machine is totally superfluous. Everything can be done with pen and paper. Trevs tend to think there's some kind of "magic" in the machine. That's why they think someday machines will "become intelligent" or that we could live inside the machine "as a simulation" (which is every bit as retarded as saying I could live on a piece of paper as a description). Genuinely understanding computation means having such silly notions knocked out of you. Dijkstra had it right: first-year CS students shouldn't be allowed to touch a computer.


What physical property makes brains different from silicon-based computers as a substrate underlying intelligence?
.
Unregistered

db said:Read the fine print, they're still teaching OOP.

First year: imperative programming (C), functional programming, and data structures and algorithms to knock their socks off

Second year:
Yes, now we'll take a moment to learn... what is it, "Jah-Vah", it's quite amusing
\
:snob:



I'm OK with this curriculum. One class to introduce object-oriented technologies would be worthwhile, simply as an elective.

There will always be uses for imperative and deterministic languages for real-time applications.

The newer languages will need to cope with extremely multicore-heavy hardware, making context switching less painful and parallelism seamless.

Then there will be the move to functional languages for general-purpose development in web, applications, and so forth.

Finally, legacy support for Java and C++ code will continue for decades.

Everything has its place, everything has its use.
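
On the "parallelism seamless" point, something like Java's parallel streams is one stab at it - a sketch I'm adding purely for illustration (assumes a Java 8+ runtime; nobody in the thread named it):

    import java.util.stream.LongStream;

    // Sum a billion longs across all available cores.
    // .parallel() is the only change from the sequential version; the
    // common fork/join pool handles the splitting and the core count.
    public class ParallelSum {
        public static void main(String[] args) {
            long sum = LongStream.rangeClosed(1, 1_000_000_000L)
                                 .parallel()
                                 .sum();
            System.out.println(sum);  // 500000000500000000
        }
    }

Whether that counts as "seamless" is the open question - the moment the work isn't embarrassingly parallel, you're back to reasoning about contention.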
.
Unregistered

. said:What physical property makes brains different from silicon-based computers as a substrate underlying intelligence?



It's not a question of a physical difference. Silicon-based computers are designed to a particular specification and it's only because they are so designed and because that specification exists that we can say they're computers. There's absolutely no sense in which the brain can similarly be said to be a computer. There's no specification we can point to that would allow us to interpret something the brain does as a computation. The entire notion is absurd because a thing's being a computation is a matter of definition. We can only use a computer to, say, add two numbers together because we have defined certain aspects of its operation as representing the numbers, the addition, the result, etc.
shockeye
Robotron Rescue Llama

1521 posts

. said:What physical property makes brains different from silicon-based computers as a substrate underlying intelligence?



Density, bussing, and switching.
.
Unregistered

. said:
. said:What physical property makes brains different from silicon-based computers as a substrate underlying intelligence?



It's not a question of a physical difference. Silicon-based computers are designed to a particular specification and it's only because they are so designed and because that specification exists that we can say they're computers. There's absolutely no sense in which the brain can similarly be said to be a computer. There's no specification we can point to that would allow us to interpret something the brain does as a computation. The entire notion is absurd because a thing's being a computation is a matter of definition. We can only use a computer to, say, add two numbers together because we have defined certain aspects of its operation as representing the numbers, the addition, the result, etc.


If the human brain does not engage in computation, then how does it produce the apparent outcomes of computation? Magic?
.
Unregistered

. said:If the human brain does not engage in computation, then how does it produce the apparent outcomes of computation? Magic?



What "apparent outcomes of computation"?
.
Unregistered

shockeye said:
. said:What physical property makes brains different from silicon-based computers as a substrate underlying intelligence?



Density, bussing, and switching.


I don't disagree that there may be something special about brains - something in their physical makeup - which allows them to support consciousness and which silicon-based computers lack. However, as of now, nobody knows what that is - or even whether it exists at all.
.
Unregistered

. said:
. said:If the human brain does not engage in computation, then how does it produce the apparent outcomes of computation? Magic?



What "apparent outcomes of computation"?


Say I ask you to add 2314 and 27 in your head, and you reply "2341". What is that if not a computation?
.
Unregistered

. said:Say I ask you to add 2314 and 27 in your head, and you reply "2341". What is that if not a computation?



It's a computation but the assumption that the brain performs the computation is wrong. The problem is actually with the assumption that a computer can add two numbers together. As I said before, it cannot. We use a computer to add two numbers together. A computer (in the sense of a device) is a tool for helping us complete computations. So the fact that we can perform computations without a computer doesn't imply that there's a computer inside our head. We always perform the computations.
db
fool's gold prospector

2070 posts

. said:I'm OK with this curriculum. One class to introduce object-oriented technologies would be worthwhile, simply as an elective.

There will always be uses for imperative and deterministic languages for real-time applications.

The newer languages will need to cope with extremely multicore-heavy hardware, making context switching less painful and parallelism seamless.

Then there will be the move to functional languages for general-purpose development in web, applications, and so forth.

Finally, legacy support for Java and C++ code will continue for decades.

Everything has its place, everything has its use.



There'll always be threads and semaphores. The language of choice isn't the only issue: if the parallel application isn't seamlessly parallel, it may be the OS's fault, or the VM's fault, or the programmer's fault.
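
For the record, a minimal semaphore sketch in Java - my own toy example, nothing from the curriculum: ten threads contend for three permits, and whoever can't get one blocks until somebody releases.

    import java.util.concurrent.Semaphore;

    public class Permits {
        // Three permits: at most three workers in the guarded section at once.
        static final Semaphore gate = new Semaphore(3);

        public static void main(String[] args) {
            for (int i = 0; i < 10; i++) {
                final int id = i;
                new Thread(() -> {
                    try {
                        gate.acquire();        // block until a permit frees up
                        try {
                            System.out.println("worker " + id + " holds a permit");
                            Thread.sleep(100); // simulate work
                        } finally {
                            gate.release();    // always hand the permit back
                        }
                    } catch (InterruptedException e) {
                        Thread.currentThread().interrupt();
                    }
                }).start();
            }
        }
    }

Run it and the workers enter in bursts of at most three - and how those bursts get scheduled is, as above, partly the OS's and the VM's call.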
.
Unregistered

. said:
. said:Say I ask you to add 2314 and 27 in your head, and you reply "2341". What is that if not a computation?



It's a computation but the assumption that the brain performs the computation is wrong. The problem is actually with the assumption that a computer can add two numbers together. As I said before, it cannot. We use a computer to add two numbers together. A computer (in the sense of a device) is a tool for helping us complete computations. So the fact that we can perform computations without a computer doesn't imply that there's a computer inside our head. We always perform the computations.


Again, you are begging the question.

You are assuming that only humans (and other animals?) have the capacity to "interpret" the output of a computational process. What specifically gives us this special ability? What specifically do other complex arrangements of matter lack?
.
Unregistered

. said:Again, you are begging the question.

You are assuming that only humans (and other animals?) have the capacity to "interpret" the output of a computational process. What specifically gives us this special ability? What specifically do other complex arrangements of matter lack?



It's definitional. There is no computation except what we define, so it inherently "belongs" to us. Here's an analogy: An intelligent alien species communicates in a language that uses piles of stones. They visit Earth one day and the President takes them to the beach. The stones on this particular beach, through pure chance, happen to have fallen into arrangements that appear vulgar to the aliens. They're absolutely appalled by the foul language strewn all around them and it sparks an interstellar war. Why do only the aliens have the capacity to see the insults in the piles of stones?
.
Unregistered

. said:
. said:Again, you are begging the question.

You are assuming that only humans (and other animals?) have the capacity to "interpret" the output of a computational process. What specifically gives us this special ability? What specifically do other complex arrangements of matter lack?



It's definitional. There is no computation except what we define, so it inherently "belongs" to us. Here's an analogy: An intelligent alien species communicates in a language that uses piles of stones. They visit Earth one day and the President takes them to the beach. The stones on this particular beach, through pure chance, happen to have fallen into arrangements that appear vulgar to the aliens. They're absolutely appalled by the foul language strewn all around them and it sparks an interstellar war. Why do only the aliens have the capacity to see the insults in the piles of stones?


In other words, meaning is relational - which of course it is. The same sequence of sounds can mean different things in different languages, for example. But the fact that humans assign certain meanings to a computer's output does not entail that the computer itself is incapable of interpretation. The point you are making is really orthogonal to the issue of whether machines can think, as it applies equally to any situation in which the mind interprets an aspect of its environment - whether that aspect is man or machine.
.
Unregistered

. said:Modular organization of code != OOP.

In fact, they are at cross purposes.



Shut the fuck up and let people who are programmers talk.
.
Unregistered

. said:
. said:Modular organization of code != OOP.

In fact, they are at cross purposes.



Shut the fuck up and let people who are programmers talk.


:rolleyes: If you think OOP is modular, you're a moron.
