Beginning programming question

I'm going to stop here and thank everyone for their thoughts and advice. As I expected, these were among the most open-minded and thoughtful suggestions I've seen anywhere on the Web. When I see similar questions asked elsewhere, they almost always descend, as Tom suggested, into a kind of religious war.

For the record, my son did use Maple in college, and he has access to it now, at work. The commercial statistics programs seem to cover the math he needs in this work, however, so he doesn't have much need for a Mathematica-type program now. He probably will when he starts his master's degree program but I'm sure he'll have access to it at school.

To clarify, he does analyses of health care programs -- private, Medicare, and Medicaid -- as a component of reports and higher analyses that go mostly to federal agencies and Congressional committees. (This is a non-partisan policy institute/think tank, so they're working on contracts issued by the government and by insurance companies, not on lobbying projects.) Most of the data he digs up is in the form of Excel spreadsheets and databases, often with many thousands of records. He runs into the same issue that I frequently encounter in things we discuss here, and in my article research: Most of the data available in this world was prepared for some purpose other than the one you have for it, so it has to be filtered, reorganized, normalized, etc., before it can be used. Since much of it is collected by state agencies, he often has to combine 50 different data sources into one file, and they're all different.

He does the arithmetic parts of that in Excel. Then he imports it (usually) into SAS, where he applies higher statistical methods. The trick to making this work well, aside from having good ideas and insights about how to normalize and adjust the raw data to produce the value you're looking for, is to automate as many tasks as you can.
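To make the "automate as many tasks as you can" point concrete, here is a minimal Python sketch (Python comes up later in this thread) of the kind of normalization step that otherwise gets repeated by hand: merging per-state files that use different column names into one canonical table. The column aliases and sample data are hypothetical, just for illustration.

```python
import csv
import io

# Hypothetical: each state agency names the same fields differently.
ALIASES = {
    "enrollees": "enrollment", "total_enrolled": "enrollment",
    "cost_usd": "cost", "total_cost": "cost",
}

def normalize_header(name):
    """Map a state-specific column name onto the canonical schema."""
    key = name.strip().lower().replace(" ", "_")
    return ALIASES.get(key, key)

def combine(files):
    """Read many CSV sources and yield rows under one canonical schema."""
    for state, f in files.items():
        for row in csv.DictReader(f):
            clean = {normalize_header(k): v for k, v in row.items()}
            clean["state"] = state
            yield clean

# Two "state files" with mismatched headers:
nj = io.StringIO("Enrollees,Cost USD\n1200,34000\n")
pa = io.StringIO("total_enrolled,total_cost\n900,21000\n")
rows = list(combine({"NJ": nj, "PA": pa}))
print(rows[0]["enrollment"], rows[1]["enrollment"])  # → 1200 900
```

The same pattern scales from two sources to fifty; the drudge work is all in building the alias table once.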

He's using Windows products, which often have to run all night to give him a result. There are some minicomputers in the place running on Unix, but they're reserved for other kinds of computational tasks.

I'm not going to pre-judge for him which way he should go with this. I'm compiling your suggestions for his consideration. I expect that, in the end, he'll be influenced by the programmers at work and what they encourage him to use. But his own learning needs are a part of it, too.

Thanks again. You've all been very helpful.

Reply to
Ed Huntress

It is more like a missing link between assembly language and a high-level language.

Things like C++ have too many layers of abstraction, and result in massively bloated programs, so I tend to avoid that.

Given his intended use, I think that C (for all that I use it a lot) is not the best choice. Among old languages, FORTRAN has massive math libraries which could help.

Or for something somewhat newer and *very* math focused, APL is a likely choice -- though it does use a weird character set to represent all the math operations.

It is infamous for being a write-only language. You can do amazing things in a one line program, and weeks later not be able to figure out how you did it -- but the program still works.

I've not taken the time to learn it, however -- I'm not that strong in math.

If he were interested in artificial intelligence, the best language is probably lisp -- or at least used to be.

Pascal is a good language to start with, actually, because it makes it very difficult to write poorly-structured programs. However, most implementations of it also make it rather difficult to write complex programs which deal with strings a lot. (I wrote a membership database program in it when I was learning it, and when I ported the basics of that program to C, it was *much* easier.)

BTW -- with Linux systems, you can usually get gcc (the GNU C Compiler), which also includes a couple of versions of FORTRAN and possibly even Ada (a language written for the DoD, patterned after Pascal but designed for writing serious application programs, not for teaching as Pascal was).

However what you *don't* get with that Fortran is the ton of math libs -- which are usually sold to mainframe users at serious prices. You'll get a reasonable subset, but nothing like the massive collection which is out there in the mainframe world.

Enjoy, DoN.

Reply to
DoN. Nichols

Thanks for the tips, Don. FWIW, Fortran is what I learned in college. Actually, I took a course, but never really learned it, because we had exactly two computers on campus, both of which were IBM 360s. I think it was Tuesdays that I had to turn in my punch cards at the computer center. On Monday, I'd get my output -- which almost always had an error or two.

You could go a month or more getting one program to run right. That meant four tries. It's a wonder we learned anything then, huh?

Reply to
Ed Huntress

C isn't an exact subset of C++. It comes pretty close, and it comes closer yet if the C code in question is written with good style. Certainly if you say "All C++ compilers will happily compile C that is _well written_ to _modern coding standards_", then you'll be much closer to the mark.
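For a concrete taste of the gap, here's a small function that is perfectly legal C but that a C++ compiler rejects. It mirrors the kinds of incompatibilities Stroustrup documents; the snippet itself is just an illustration, not from his writeup.

```c
#include <stdlib.h>

/* Legal C that a C++ compiler rejects: malloc's void* converts to
   int* implicitly in C (an error in C++), and 'new' is an ordinary
   identifier in C but a reserved keyword in C++. */
static int c_only_demo(void) {
    int *p = malloc(3 * sizeof *p);  /* no cast: fine in C, rejected by C++ */
    if (p == NULL)
        return -1;
    int new = 7;                     /* 'new' as a variable name: C only */
    p[0] = new;
    int result = p[0];
    free(p);
    return result;
}
```

Well-styled modern C avoids both habits anyway, which is why clean C usually sails through a C++ compiler.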

Here's the story from Stroustrup's mouth, with examples:

formatting link

Reply to
Tim Wescott

I remember those days. I took it again many years later and was able to use a compiler that ran on a PC.

Reply to
ATP

...

I may have been one of the first computer hackers. I noticed a doctoral candidate submit a stack of punch cards, two full boxes, to the computer operator for the IBM 360. I was punching up a large stack for my GPSS (General Purpose Simulation System) class. That person came back in only two hours!

The next day, I struck up a conversation and asked to see her header cards; I figured I must have had something wrong. Anyway, I started using those mysterious codes and getting my runs back in two hours. I aced that class and was even recognized by the prof as writing some of the best code he'd seen. Karl

Reply to
Karl Townsend

Ed, that reminds me of my experience with Fortran. I was working for a service bureau in Portland, Oregon. I had taken a Fortran class at the community college, so was given the task of converting IBM 1130 Fortran to IBM 360 Fortran to run on a Model 40 machine.

Huge difference in the versions and the way each handled the instructions. The customer was using the programs to schedule the loading of logs into ships for shipment to Asia. Had to load properly to keep from upsetting the ship's balance in the water.

Finally got the reports to satisfy the customer, but when the 360 would print a line on the printer, all the lights on the console would turn on for about 3 seconds, then one line would print! The Model 40 ran DOS with three partitions. All would stop while the line of print was assembled. What a POS that Fortran was when it came to I/O!

We never took on another Fortran conversion and never had another customer try to use it.

Paul

Reply to
KD7HB

:) Perl is the most awesome "git-r-done" language: it gets a job done quickly, and has facilities to do stuff robustly. I can still easily read my Perl stuff from years ago.

i
Reply to
Ignoramus20691

By the way, I bought a book for my 9 year old son to teach him programming. It is called "Hello World!" and it uses Python.

So far, he seems to like it.

So, I need to learn Python too. Any suggestions for a good Python book for programmers? Something that would not explain in depth what an if statement is, just how to use one.

thanks

i
Reply to
Ignoramus20691


Seriously Paul? Three seconds to print a single line? There MUST have been something else wrong. I had Fortran programs generate boxes of scrap paper pretty damned quickly on a 360.

When I asked a neighbor's kid (three years old at the time) what her father did at work, she said, "I think he makes paper for me to draw on." Her dad was a programmer on 360s, and he brought home boxes of scrap.

Reply to
rangerssuck

Ed, is this about learning about programming? or wrangling computers?

In the Windows environment, for actual programming, it's still ASM for me. Beyond that, there's VBA (Visual Basic for Applications), the macro language for all the MS apps, and Visual BASIC programming.

May I offer that learning CAD would provide a personal range of operation that programming skills alone do not.

Reply to
CaveLamb

Agreed.

Reply to
CaveLamb

Well, that was a LONG time ago. Certainly more than 40 years! Maybe it just seemed like 3 seconds. I know no other programs performed that way.

Paper? Yes, I brought home boxes of paper, too. Gave away to friends and neighbors. Still was creating boxes of paper up into the 1990s. I still have some very large program listings that I am proud of.

One is a 360 assembly program that used direct calls to DOS for I/O functions. I was trying to access the console typewriter and other devices. Another program listing is for a Burroughs computer that let me edit and compile programs and control the computer remotely from a terminal. All the other programmers were still using punched cards. Nowadays, it's hard to figure out how the programs worked and what the code is doing.

Paul.

Reply to
KD7HB

He's headed for a PhD in economics, with a master's in math along the way. His responsibilities at work are evolving. But his interest now is in learning basic programming, with an eye to learning something useful for his work. He doesn't really know what he's going to do with it. He does NOT want to be a professional programmer, although knowing the basics probably will help him.

I think he's picking up some VBA in his work with Excel. I recommended it to him last summer, when he was beginning to require lots of macros to automate Excel processes.

He has little interest in CAD, though. He watched me fiddling with it for years, when I wrote the CAD/CAM columns for a couple of manufacturing magazines, and later, when Ashlar was my client.

Reply to
Ed Huntress

Heh, heh....I wasn't devious enough to do things like that. At least, not then. d8-)

That whole experience turned me off about programming. It wasn't until I got my first Apple II+ and started programming in BASIC that I regained any interest.

Then I got a RS M100 laptop and really started to have fun. I wrote a merchandise-distribution program for my wife -- she was a fashion buyer for 26 retail stores -- in assembly, which saved her and her staff 30 hours or so per week. So the company's IT department asked if they could convert it for use on their minicomputers. I said sure, and gave it to them.

But I had written it for the 80C85, using all of the commands, and they couldn't make head nor tail out of it. My wife continued to run it on the RS M100.

Reply to
Ed Huntress

Google 'Python tutorial'.

Have Fun! Rich

Reply to
Rich Grise

A real programmer learns algorithms; translating those into workable code then becomes the task. Knuth's set of books on the art of computer programming is a rather dense start to learning different ways of doing things in the computer world, and there are other books. Or he could take some elementary college-level courses. It's too bad most of the thrust in training these days is on a specific language rather than general knowledge. I've translated the same basic concepts into a half-dozen or more different languages and assemblers. There's nothing as constant as change in the computer world; if you know the underlying structure you want to translate, picking up a language to do it is kind of minor. Right now, the hot stuff is .NET, Java, maybe Python for beginners, along with a lot of web-oriented stuff. Ten years from now it'll be a bunch of things totally different.

If dealing with a lot of data, learning database techniques can be a plus. You can only go so far with Excel and the like. SQL is the key here.
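As a sketch of what "SQL is the key" looks like in practice, here's a tiny example using Python's built-in sqlite3 module. The table and column names are made up for illustration.

```python
import sqlite3

# In-memory database; a real project would use a file or a server.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE claims (state TEXT, program TEXT, cost REAL)")
con.executemany(
    "INSERT INTO claims VALUES (?, ?, ?)",
    [("NJ", "Medicare", 120.0), ("NJ", "Medicaid", 80.0),
     ("PA", "Medicare", 200.0)],
)

# The kind of grouping and filtering that gets painful in a spreadsheet:
rows = con.execute(
    "SELECT state, SUM(cost) FROM claims GROUP BY state ORDER BY state"
).fetchall()
print(rows)  # → [('NJ', 200.0), ('PA', 200.0)]
```

The same SELECT works against a few dozen rows or a few million, which is exactly where Excel runs out of steam.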

Stan

Reply to
stans4

Quick programs, bigger than a little scripting, would favor Python and maybe APL (Aplus is an available free variant). If the programs start to get big, an IDE (integrated development environment) and C or a variant (C++) as the programming language are suitable. If his applied math program is any good, it'll cover this decision, though; there'll be requirements.

Java and VB are owner-controlled, which scares me. I've done years of work and then had the underlying language changed or withdrawn.

Reply to
whit3rd

I made my living programming fortran for some years, back in the 1970s. IBM fortran was pretty good, but was industrial-strength complex. (Simpler than PL/1, though, but that's a whole other story.) I/O can be pretty complex. I never understood fortran I/O until some years later when I took Assembly Language, and saw how the kernel I/O services really worked. Only then did it become apparent how the fortran I/O must work.

Your boss blew it. He should have found someone who really knew fortran and its dialects, especially IBM's. One community college course isn't nearly enough, and it's no surprise that you couldn't figure it out having had only that one entry-level course.

Joe Gwinn

Reply to
Joseph Gwinn

While this is true, it's far too deep in the details to matter in the present discussions.

In a sense, and to overstate the case, it's a bit like saying that US English and UK English are not quite identical, so let's learn Chinese instead. (Only to later discover that Chinese has hundreds of dialects.)

The newer C and C++ compilers tend to be more rigorous about enforcing the rules of the language than those of the 1970s, which means that perfectly functional old code is often rejected by new compilers. There is a big debate in the programming community over whether this is a good thing or a bad thing. I'm in the just-fix-the-errors-and-get-on-with-it school, but if the old codebase is valuable and huge, people just disable the warnings. I think all modern compilers have a single switch to return to the laxity of yore.

Joe Gwinn

Reply to
Joseph Gwinn
