JS-Kit/Echo comments for article at http://smallestminority.blogspot.com/2009/11/my-take-on-warmergate.html (70 comments)

  Tentative mapping of comments to original article, corrections solicited.

jsid-1259599263-616937  ben at Mon, 30 Nov 2009 16:41:03 +0000

While it is true that scientists and engineers (not including software engineers) are not software engineers, and our code "looks" an awful lot like the clunker down there, that doesn't mean it doesn't crunch the numbers correctly.

I write an awful lot of computer code, and the software guy who looks at it doesn't like the programming style, etc., but he has no clue what it does or what the "right answer" is that comes from the math. The math under the hood is beyond your typical software engineer. The code just doesn't look very fancy, nor does it have a compelling user interface. But it does what it's supposed to do.


jsid-1259599732-616938  Mastiff at Mon, 30 Nov 2009 16:48:52 +0000

Ben,

Have you looked at the code in question? This is not about scientists in general, but the CRU code in particular.

No, it apparently does not do what it is supposed to do. The private log of "Harry," the programmer assigned to maintain the code at one point, goes on at teeth-gnashing length about how the whole thing is a useless kluge that cannot even reproduce its own results.

Where I am studying, our computational social science department is among the world leaders in computer simulations, particularly for Agent-Based Models. If a model cannot reproduce its own results, it is fatally flawed crap.

There is a good discussion going on at ESR's blog, Armed and Dangerous, involving both sides of the question. You should check it out.


jsid-1259600055-616939  Kevin Baker at Mon, 30 Nov 2009 16:54:15 +0000

Define "crunch(ing) the numbers correctly."

That seems to be the issue with the CRU programs - they cannot replicate their own findings. They cannot explain what they did, or why they did it. They cannot even put in data they collected and run simulations that reproduce known results - e.g. Kevin Trenberth's "The fact is we can't account for the lack of warming at the moment, and it's a travesty that we can't."

So far the code has "done what it's supposed to do" - it's produced graphs that are designed to frighten the public and the politicians into "DOING SOMETHING!"

Now we find out that the graphs are made up from data tortured to produce a desired outcome.

In the world of AGW, that's "crunching the numbers correctly."


jsid-1259600536-616940  Adam at Mon, 30 Nov 2009 17:02:16 +0000

"In the world of AGW, that's 'crunching the numbers correctly.'"

Maybe they're just asking themselves the right questions...


jsid-1259601584-616942  Unix-Jedi at Mon, 30 Nov 2009 17:19:44 +0000

http://www.dailymail.co.uk/sciencetech/article-1229857/How-16-ships-create-pollution-cars-world.html

17 supertankers can out-pollute every single car in existence. (Just 17. Hundreds are actually plying the oceanways.)

Obviously, we must switch to electric cars!


jsid-1259602263-616944  perlhaqr at Mon, 30 Nov 2009 17:31:03 +0000

UJ: Man, that article hurt to read.

Even that's an underhanded design to make car fuel more expensive. If the tankers bringing the oil across have to burn the expensive stuff, instead of the cheap sludgy stuff, the expensive stuff will be... even more expensive.

*sigh*


jsid-1259603169-616945  Britt at Mon, 30 Nov 2009 17:46:09 +0000

Even that's an underhanded design to make car fuel more expensive. If the tankers bringing the oil across have to burn the expensive stuff, instead of the cheap sludgy stuff, the expensive stuff will be... even more expensive.

____________

But to the people in favor of it, that's not a bug, that's a feature!

These people are trying to wipe out the middle class. They want to be aristocrats, and they want everyone else to be peasants. That's the goal: lawyers, media types, and the right kinds of scientists and academics guiding things for the benefit of the benighted masses. The masses then work to sustain the anointed in the style to which they are entitled.

These people cannot stand the fact that the world does not compensate them to the demands of their own ego. Their sin is envy, and pride: the fatal conceit not only that they can run everyone's lives, but that they have a divine right to run everyone's lives. All for "the greater good," of course. Hubris, in a word.

Is this nemesis for them? Or is this one more troublesome data point to slide under the rug?


jsid-1259604029-616947  Kevin Baker at Mon, 30 Nov 2009 18:00:29 +0000

Uh, I thought that all that "lung-clogging sulphur pollution" was supposed to contribute to global cooling, and dumping it over the ocean wasn't such a bad thing.


jsid-1259604435-616949  Sarah at Mon, 30 Nov 2009 18:07:15 +0000

I can ditto what Ben said. Like poor "Harry," I use Fortran 77 for my research, and my programs are new code on top of old code with subroutines all over the place, and it more or less looks like the homegrown motorcycle. A lot of research programs in the physical sciences are like that. Why? First of all, because they work (unlike Harry's code). Second, because to rewrite the code (which already works) in a slick new way would constitute an entire thesis project, and most of us are strongly motivated to just get on with the research and beat the half-dozen other physicists who are out to scoop our results.

Define "crunch(ing) the numbers correctly."

I can't speak for Ben, but here's how I know that the code I use crunches numbers correctly. I feed it several test cases with known outcomes, and see if it returns exactly those outcomes. If it does, I assume it's working properly. I also send a subset of my data to colleagues who use completely different codes and ask them to run it through. If the results are reasonably close to mine, I assume it's working properly.
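
A minimal sketch of that kind of known-outcome check (hypothetical names and values; Python here rather than the Fortran actually in use):

    def check_known_cases(run_code, cases):
        # Each case pairs an input with the outcome we already know is right.
        for inputs, expected in cases:
            result = run_code(inputs)
            assert result == expected, (inputs, result, expected)

    # e.g. check_known_cases(lambda xs: sum(xs) / len(xs), [([2.0, 4.0], 3.0)])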


jsid-1259604537-616950  Adam at Mon, 30 Nov 2009 18:08:57 +0000

"I feed it several test cases with known outcomes, and see if it returns exactly those outcomes. If it does, I assume it's working properly."

As a software developer, all I have to say to that is, "Ouch" :)


jsid-1259605035-616951  Unix-Jedi at Mon, 30 Nov 2009 18:17:15 +0000

Kevin:

I thought that all that "lung-clogging sulphur pollution" was supposed to contribute to global cooling, and dumping it over the ocean wasn't such a bad thing.

Funny how that works, innit?

I had some data to prove it, but I threw it all away.


jsid-1259607772-616952  Matt at Mon, 30 Nov 2009 19:02:52 +0000

Vindication. Looking at the science, and the data, and what other non-politicized scientists were saying, it was clear that AGW was junk science.


jsid-1259608399-616953  Mastiff at Mon, 30 Nov 2009 19:13:19 +0000

That said, cargo ships are ripe for innovation, apparently. I recall that there is some interesting work being done with massive parasails; let me see if I can dig it up...

Ah, here we go.

Plus, why aren't cargo ships being fitted with nuclear reactors? /silly question


jsid-1259608780-616954  Kevin Baker at Mon, 30 Nov 2009 19:19:40 +0000

Mastiff, now THAT is COOL. A 3.5 year payback is pretty long, though.

And I'd be all for fitting out cargo ships with nukes. All those ex-Navy guys need work too!


jsid-1259608893-616955  Kevin Baker at Mon, 30 Nov 2009 19:21:33 +0000

I can't speak for Ben, but here's how I know that the code I use crunches numbers correctly. I feed it several test cases with known outcomes, and see if it returns exactly those outcomes. If it does, I assume it's working properly.

Yes, but Sarah, billions of dollars and reams of legislation don't depend on your software working "properly."

And at least yours gives repeatable results.


jsid-1259615614-616959  Sarah at Mon, 30 Nov 2009 21:13:34 +0000

As a software developer, all I have to say to that is, "Ouch"

Is that a good ouch or a bad ouch?

Yes, but Sarah, billions of dollars and reams of legislation don't depend on your software working "properly."

If my work had the potential to influence policy, and I had the kind of research funding the climate dips have, I'd damn well hire someone to rework the code. And I'd make my data and code publicly available.

And at least yours gives repeatable results.

When I was a grad student, my thesis advisor warned me to save all of my data and all versions of my codes, because someday I might be called on to show how I got my results. Silly me, I assumed all scientists operated this way.


jsid-1259615959-616960  ben at Mon, 30 Nov 2009 21:19:19 +0000

Kevin, I was only arguing that one need not be a "software engineer" to write good number crunching code, not that the code in question was good or bad.

That said, when I write, or re-write code, I try to use proper unit-tests where I can, so that I can check that I get expected results for known cases. Of course, I don't check for an exact match on the results, but a decent match that takes into account machine precision etc.
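
A toy example of that sort of tolerance-aware unit test (the routine and numbers are hypothetical; the point is comparing within floating-point tolerance rather than demanding an exact match):

    import math
    import unittest

    def rms(values):
        # Hypothetical number-crunching routine under test.
        return math.sqrt(sum(v * v for v in values) / len(values))

    class TestRms(unittest.TestCase):
        def test_known_case(self):
            # Known answer: rms of (3, 4) is sqrt(12.5); allow for machine precision.
            self.assertAlmostEqual(rms([3.0, 4.0]), math.sqrt(12.5), places=12)

    if __name__ == "__main__":
        unittest.main()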

Looks like these fools weren't even doing that. Not so good.

On the other hand, my original code for a certain project didn't have any known results to test against. The best I could do in this case was to test all the subroutines and libraries against results from Matlab. But the overall result was from my own algorithms... difficult to come up with a reliable test case. All subsequent code and changes are tested against the previous results that I have accepted as good.


jsid-1259616126-616961  Adam at Mon, 30 Nov 2009 21:22:06 +0000

"As a software developer, all I have to say to that is, 'Ouch'

Is that a good ouch or a bad ouch?"

Initially a bad ouch, but reading your comments after that, I'll retract it.

However, proper quality assurance and testing in programming and software development is very much in a state of "do as I say and not as I do."


jsid-1259616211-616962  Matt at Mon, 30 Nov 2009 21:23:31 +0000

I work in finance, so repeatable results are the order of the day. I'd better get expected results from known input cases. People tend to get a bit miffed when their account balances don't line up or money is missing from their accounts.

It is, as they say, real money. Knowledge of this tends to sharpen your attention to detail rather well. Millions (and in some cases, billions) of dollars rest upon me doing my job right. So I do.

It is tragic that such attention to detail didn't occur within the CRU with similar amounts of money at stake.


jsid-1259616470-616963  Sarah at Mon, 30 Nov 2009 21:27:50 +0000

Initially a bad ouch, but reading your comments after that, I'll retract it.

Uh, thanks. :) I'm curious to know, from a developer's point of view, why it was initially bad, though.


jsid-1259617475-616964  Adam at Mon, 30 Nov 2009 21:44:35 +0000

"I'm curious to know, from a developer's point of view, why it was initially bad, though."

From an ivory tower perspective of software? Because testing against a series of known cases isn't generally considered sufficient for something to be stable / accurate - only for the cases presented. Of course, it's generally accepted that if you knew *all* of the cases you wouldn't have bugs to begin with, but hey...

From a more practical perspective, my reading of that sounded (to me, and this was clearly inaccurate) a bit like, "eh, it ran for a few things, it'll run for them all."


jsid-1259617556-616965  ben at Mon, 30 Nov 2009 21:45:56 +0000

I just read this in that article:

After the code reaches its semi-complete form, it is handed over to Quality Assurance which is staffed by drooling, befanged, malicious sociopaths who live for nothing more than to take a programmer’s greatest, most elegant code and rip it apart and possibly sexually violate it. (Yes, I’m still bitter.)


Does that bring anyone to mind, Sarah? :)


jsid-1259617613-616966  Kresh at Mon, 30 Nov 2009 21:46:53 +0000

"It is tragic that such attention to detail didn't occur within the CRU with similar amounts of money at stake."

That's why they did it the way they did. It's not about honest science, it's about the money. Always follow the money, isn't that how it goes?

Greed > Science, it appears.


jsid-1259619400-616967  Sarah at Mon, 30 Nov 2009 22:16:40 +0000

Adam,

I see. A software developer I ain't, so this is interesting.

My program is designed to make measurements of astrophysical spectra. I take several spectra and measure features by hand and also create several mock spectra with known parameters, including those with highly exaggerated or non-existent features designed to test the limits of the program. These are fed through my program, and if the results are consistent with the known parameters, I'm confident that it's working fine. That said, I go through and hand-inspect every single one of the spectra processed by my code for quality assurance.

And speaking of quality assurance...

Does that bring anyone to mind, Sarah?

ROFL! It certainly does. (Ben's referring to my husband, who is a QA engineer, and a damn good one at that.)


jsid-1259626714-616968  Russell at Tue, 01 Dec 2009 00:18:34 +0000

I heart QA.

I am a software engineer, and I run all my code through a bunch of tests to make sure it works correctly and fails gracefully.

Then I kick it over to QA and let them tear it to shreds to find anything I missed, either from a logic perspective or a functionality perspective. Much better for them to find it than me - or worse, the customer!

The key is repeatability. If you can't run the code more than once and get the same response, whether it's a response you want or not, the code's junk.
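
A trivial sketch of that repeatability check (hypothetical; any honest, deterministic pipeline should pass it):

    def is_repeatable(run, inputs, trials=3):
        # Same inputs, several runs: if the answers differ, the code is junk.
        results = [run(inputs) for _ in range(trials)]
        return all(r == results[0] for r in results[1:])

    # e.g. is_repeatable(lambda xs: sum(xs) / len(xs), [1.0, 2.0, 3.0])  ->  True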

From what I've read, the CRU program is junk.

I think that's the best basis to undermine the entire Western economic system since the *&%! proles aren't rebelling!


jsid-1259629306-616969  juris imprudent at Tue, 01 Dec 2009 01:01:46 +0000

The version on my wall for many years was the blackboard filled with a flow chart with the "Miracle occurs here" in the lower right, where many, many arrows converged. The second guy says "I think I see where the problem is".


jsid-1259630014-616970  geekWithA.45 at Tue, 01 Dec 2009 01:13:34 +0000

I understand the bad ouch.

Code that is built solely to a given set of inputs and spun through dev/QA cycles is guaranteed to render the correct verdict for those inputs...and no others.

To put it another way,

The diligent, but inexperienced interactive developer tends to navigate exactly the same paths through the operation of his program, fixing every problem he finds along the way until he proudly presents what he feels is completed work.

Then, someone who knows nothing about the program immediately breaks it, because their very first action was not the same as the developer's first action. In fact, they are the very first person in the universe to *ever* take that particular action.

The slightly more experienced developer, having bloodied his nose many times, will generally check the common sets of paths, until his other nostril is bloodied when the traitorous QA guy who used to be a coder starts feeding exotic data and checking the results of perverse combinations of operations.

The developer who passes through to the other side of this, we deem to be Useful (r)(tm)(c)


jsid-1259632952-616971  Matt at Tue, 01 Dec 2009 02:02:32 +0000

Geek,

Or as I like to say: Good programmers figure out how to make a program work. Great programmers figure out how to make a program fail.


jsid-1259644789-616981  Mike. at Tue, 01 Dec 2009 05:19:49 +0000

When I was a grad student, my thesis advisor warned me to save all of my data and all versions of my codes, because someday I might be called on to show how I got my results. Silly me, I assumed all scientists operated this way.

Subversion is a wonderful thing. I save every version of everything.


jsid-1259649463-616985  Regolith at Tue, 01 Dec 2009 06:37:43 +0000


The slightly more experienced developer, having bloodied his nose many times, will generally check the common sets of paths, until his other nostril is bloodied when the traitorous QA guy who used to be a coder starts feeding exotic data and checking the results of perverse combinations of operations.



Wait... does this not get regularly taught during programming courses? I can see where the self-taught might not know to do this, but it seems that someone who studies computer science in college should have been taught it. I know our CS department drills this into students, but from this it sounds like we're an exception to the rule.


jsid-1259656214-616987  ben at Tue, 01 Dec 2009 08:30:14 +0000

Yes! Subversion is awesome!

On a side note... Kevin, have you noticed that Charles at LGF has gone wonky? I suppose I agree with at least 50% of his beef with "the right" but seriously, he's thrown out the baby with the bathwater. By giving up on the right, has he gone to the left? Do you care? I guess I don't really care, but it is curious. You don't see a lot of right to left conversions.


jsid-1259671836-616988  Bram at Tue, 01 Dec 2009 12:50:36 +0000

I am a financial analyst and often work with big data sets. Whether I am using Excel, Access, or more sophisticated databases, there are rules that have to be followed.

1. Keep copies of all raw data and an audit trail of data entries.
2. Document all assumptions.
3. Admit when you have crap and start over.

They clearly had crap but were unwilling to start over.


jsid-1259671894-616989  Kevin Baker at Tue, 01 Dec 2009 12:51:34 +0000

I haven't been following the LGF saga closely, but from what I have seen, it appears more like Charles has been slowly losing his marbles. YMMV, though.


jsid-1259674851-616991  GrumpyOldFart at Tue, 01 Dec 2009 13:40:51 +0000

When I was a grad student, my thesis advisor warned me to save all of my data and all versions of my codes, because someday I might be called on to show how I got my results. Silly me, I assumed all scientists operated this way.

I'm not a scientist or a programmer, nor could anything I've ever done be stretched into "works in the field" for either one. I've just seen what was so obviously a common thread from people in such diverse fields that it never occurred to me that anyone who was halfway educated wouldn't know this.

From my chemistry teacher, freshman year in high school, 1973:
"If you can't measure it and document it, it isn't science."

From an instructor at a Sharp company-run school to fix copiers, printers and faxes:
"If you don't have the paperwork to back up what you did, all you have is a line of bullshit."

Painted in foot-high letters on the back wall of the personnel office, USS Saratoga (CV 60):
"Non Scriptum, non Est.
If it isn't in writing, it doesn't exist."

That suggests to me that the "scientists" at East Anglia are trying to do... what? Anyone?


jsid-1259674854-616992  Rich at Tue, 01 Dec 2009 13:40:54 +0000

Lots of good software engineering speak here and lots of good practice mentioned. Okay - I was a software engineer for 28 years, in many industries including banking, before I went back and got a PhD in information systems. Now I teach some programming, SA&D, and a variety of other courses.

Whether a school teaches good programming and QA practices depends on the school and its orientation. A number of departments (I can not say whether it is a large number or not) are not interested in commercial programming - by that I mean preparing people to go out and work for companies doing the day-to-day stuff. They are interested in elegant algorithms; i.e., many go and get PhDs in data algorithms and that is what they want to teach. Honestly many of them would not survive past the first week in a real programming department. We are lucky in that we have a few extremely well qualified software engineers teaching our software engineering courses as adjuncts; that is not the usual thing.

There is a major difference between writing code that has to work day in, day out - do the right thing or fail gracefully - and the stuff that often gets written by academics. Not all, but often.


jsid-1259678691-616993  Adam at Tue, 01 Dec 2009 14:44:51 +0000

"Honestly many of them would not survive past the first week in a real programming department."

Indeed, but this applies to really any group involved in academia. Unless you genuinely enjoy teaching (and all of the nonsense associated with it) or doing research, I can't think of many good reasons for taking what is usually a cut in pay to teach.

Of course, this ignores a great deal of other factors, but it's based on my own experiences with university. Anyone I had as an instructor was either interested in research, retired, or flat out couldn't make it in the "real world" (the same "real world" every public teacher goes on and on about, despite running away from it and ending up back in education).

I'm not sure I've ever had a sane or competent instructor who just "liked to teach" since I was about seven. I can't imagine working alongside the likes of Markadelphia and not wanting to take a hammer to my own head every night.

The research-oriented professors were always interesting, but without fail (anecdotes here - I'm not making *too* many generalizations) they had extremely odd ideologies and beliefs.

The retired people were great.


jsid-1259679657-616994  Bilgeman at Tue, 01 Dec 2009 15:00:57 +0000

U-J:
from your linked article...
"Eco expert: Fred Pearce is an environmental consultant to New Scientist magazine"

I don't know what kind of qualifications are required to style oneself an "environmental consultant" in the UK, but obviously a grasp of combustion science as required by the US Coast Guard for anyone seeking an endorsement as a Marine Fireman is not among them.

To wit:
"We've all noticed it. The filthy black smoke kicked out by funnels on cross-Channel ferries, cruise liners, container ships, oil tankers and even tugboats"

Let me give you the quick and dirty on this.

If you're on a motor vessel,(which means a diesel), and you see continuous black smoke from your stacks, you likely have a dirty injector or broken rings in one or more pistons.

If you're on a steamship,(which means boilers), your burner tips are dirty, your forced draft air flow isn't high enough or your Bunker C is too cold and isn't atomizing adequately.

Or you are on a coal-passer, and your clinker grate is moving too fast or your draft fans, again, aren't high enough and you aren't getting a complete burn.

What you WANT is a light brown haze from the funnel. Black smoke is wasted fuel...and companies don't LIKE that!

Some smoke is going to be unavoidable, especially when you're changing loads rapidly, (as happens on a harbor tug during docking maneuvers).

I've sailed in the UK Merchant Navy under the Red Ensign, and I can testify that the Limeys are perfectly competent to troubleshoot and replace a bad injector or set of rings.
(Although changing the rings on a Sulzer or MAN B&W slow-speed two-stroke diesel isn't the most enjoyable job in the world).

Mr. "Environmental Consultant" should himself consult with a UK Motorman or Fitter rating before posting such rubbish.


jsid-1259680003-616996  Bilgeman at Tue, 01 Dec 2009 15:06:43 +0000

Oh...and a stencil I've seen,(and may have even left), in one or more ships' engine rooms:

"GOOD Engineering is Fucking BORING"


jsid-1259687472-617003  Unix-Jedi at Tue, 01 Dec 2009 17:11:12 +0000

Bilgeman:

I'm sure he's employing a bit of poetic license there. But OTOH, I've seen many a diesel engine be kept running rather than overhauled due to "minor" issues.

Black smoke is wasted fuel...and companies don't LIKE that!

But they also don't like the repair bills, and many haven't done the cost/benefit analysis... or can't afford to take the ship out of service for servicing.

But that aside, he wasn't really talking about the diesels, but the bunker fuel oil use - and that might well be much darker than normal or well-tuned diesel exhaust.

Not that I'm trying to impugn your hands-on knowledge, but when you're burning that "grade" of oil, which would otherwise at best be used for asphalt, aren't there a lot of solids released?

As to the rubbish, well, my personal experience has seen many a black cloud coming out of diesel engines in vehicles that didn't seem to have the utmost in upkeep... I think I'll give him a pass on that.


jsid-1259690076-617006  Sarah at Tue, 01 Dec 2009 17:54:36 +0000

Code that is built solely to a given set of inputs and spun through dev/QA cycles is guaranteed to render the correct verdict for those inputs...and no others.

This is frustrating. I know far too little about software development to understand this in relation to what I do. I'm curious, if anyone wants to take the time to explain.

I have a program that fits Gaussian profiles to emission lines and then algebraically calculates the parameters of the Gaussian (height, width, etc.). I test this on a series of spectra with known parameters. (BTW, I do not run the same test series every time the code is changed; I make a new test series each time.) The program returns the correct values. I inspect the visual output, where I can see that the program's Gaussian model matches the emission line profile very well. I then run the program on a random subset of my data. Again, the program's Gaussian models match the emission line profiles very well. I send the same subset to colleagues who use different code, and their programs return similar values. I'm now confident that my program works satisfactorily.
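
For the curious, a minimal sketch of that kind of known-parameter test (hypothetical numbers, and Python/scipy rather than the Fortran 77 actually used):

    import numpy as np
    from scipy.optimize import curve_fit

    def gaussian(x, height, center, width):
        return height * np.exp(-0.5 * ((x - center) / width) ** 2)

    # Mock emission line with known parameters, plus a little noise.
    true_params = (5.0, 6563.0, 2.0)      # height, center, width -- hypothetical values
    x = np.linspace(6540.0, 6590.0, 500)
    y = gaussian(x, *true_params) + np.random.default_rng(1).normal(0.0, 0.05, x.size)

    # Fit it "blind", then check the recovered parameters against the known ones.
    fit_params, _ = curve_fit(gaussian, x, y, p0=(1.0, 6560.0, 5.0))
    assert np.allclose(fit_params, true_params, rtol=0.05), (fit_params, true_params)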

If anyone feels like it, I'm interested to know what is flawed about this approach, and how it should be improved.


jsid-1259690877-617013  He Himself at Tue, 01 Dec 2009 18:07:57 +0000

I only have one comment: WHERE CAN I DOWNLOAD THAT CARTOON? IT'S GREAT! GOTTA HAVE ME A COPY TO USE AS MY LAPTOP'S DESKTOP!


jsid-1259690883-617014  Adam at Tue, 01 Dec 2009 18:08:03 +0000

Sarah, the key is this part of that statement:

"Code that is built solely to a given set of inputs"

Provided you do not code *for* those inputs, your approach is theoretically correct.

Ideally, you want to code generally for *any* kind of input (including erroneous), and test against every case you have available.

The use of random cases, too, is good.


jsid-1259692058-617019  Russell at Tue, 01 Dec 2009 18:27:38 +0000

http://4.bp.blogspot.com/_V4T-lZ6_JEc/Sw9MVcVxIbI/AAAAAAAAAR4/wV1yEVvulkI/s400/clippy.jpg


jsid-1259692131-617020  Sarah at Tue, 01 Dec 2009 18:28:51 +0000

Adam: Thank you for weighing in. The program was not coded for any particular inputs; it's generalized enough to be run on anything. I'll chillax now. :)

(We may use similar tools, but research scientists and professional developers appear to inhabit very different worlds!)


jsid-1259692430-617022  Adam at Tue, 01 Dec 2009 18:33:50 +0000

"...it's generalized enough to be run on anything"

Actually, this is a good statement for me to clarify the software development position upon.

Software development is about finding *general* solutions to *specific* problems (excepting some of the paradigms around design).

Suppose I'm given a file format where certain characters are used to separate fields. Now, best case, I'm given the actual specification for that format and the description, but probably not. So, supposing someone gives me a couple of samples, I have a decision:

I can write a program to read in / parse each of those samples,
OR
I can write a program to generally handle the format they both correspond to.

The latter is the (more) correct of the choices. Then there are the obvious things that should apply to any field - being very conscious of and documenting assumptions you make, for example.
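
A small illustration of the difference (hypothetical file layout; Python's csv module stands in for whatever the real format would be):

    import csv

    # Over-specific: bakes in exactly what the two sample files happened to look like.
    def parse_samples_only(path):
        rows = []
        for line in open(path):
            name, qty, price = line.rstrip("\n").split(",")   # breaks on quoting, extra fields, ...
            rows.append((name, int(qty), float(price)))
        return rows

    # More general: handles the delimited format both samples are instances of.
    def parse_delimited(path, delimiter=","):
        with open(path, newline="") as f:
            return list(csv.reader(f, delimiter=delimiter))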


jsid-1259692492-617023  Sarah at Tue, 01 Dec 2009 18:34:52 +0000

He himself:

Sidney Harris has dozens of hilarious science-y cartoons at his website here, including the "miracle" cartoon. Most (all?) of them are published in book form, and are available on Amazon.com.


jsid-1259692555-617024  Sarah at Tue, 01 Dec 2009 18:35:55 +0000

Thanks, again, Adam. Good to know!


jsid-1259692687-617026  Adam at Tue, 01 Dec 2009 18:38:07 +0000

Hm... thinking on it now, that part of software design is similar to science. Sure, you could try and base all models in the field of chemistry off of a couple of molecules, but that's a specific model rather than a general one.

(Though I do sometimes wonder if chemistry has done precisely that, looking at how many systems and models have arisen just to handle a particular exception / unhandled case in its parent system)


jsid-1259694444-617030  Stephen R at Tue, 01 Dec 2009 19:07:24 +0000

Kevin -- The visual analogy is incorrect. Although ugly, it appears that the second vehicle is capable of going from Point A to Point B. The CRU code is clearly not.

Replace the wheels with cinderblocks, hitch it up to a mule, a tree, and a bucket of chum (all pointing in different directions), and suddenly it's a lot closer to accurate.


(comments on this post:
http://smallestminority.blogspot.com/2009/11/my-take-on-warmergate.html )


jsid-1259705263-617044  Matt at Tue, 01 Dec 2009 22:07:43 +0000

Code that is built solely to a given set of inputs and spun through dev/QA cycles is guaranteed to render the correct verdict for those inputs...and no others.

This is frustrating. I know far too little about software development to understand this in relation to what I do. I'm curious, if anyone wants to take the time to explain.


Sarah, I guess it depends on desired outcomes. If you're modeling a system as part of a research program, I would argue a model that merely spits out expected results on known inputs is a questionable model at best. See the recent ClimateGate CRU scandal e-mails for examples of the worst case.

Research/academic development and day-to-day private software development, in my opinion, occupy different worlds. You'll tolerate things in the research world that would get a corporate developer like me reprimanded, demoted or fired.

For example, I work with developers who code to the requirements they receive. And only those requirements. Their unit testing conforms to the scenarios laid out with known, good test data. Guess what happens the moment that code goes out for testing? It explodes spectacularly and the defect counts reflect that.

Software is brittle and ugly. One of my requests in analyzing any potential development task is to have the customer show me the worst-case scenario. Show me the ugliest, most byzantine process you have that I am to automate. Because that worst-case will reveal more about the true nature of the problem than the simple ideal cases often presented.

If my code can handle, corral, and manage the worst-case, it will likely handle the others just fine. I operate on a philosophy that is very simple when building a software system: "How can I break this?" I then work backwards in answering that question.
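
A toy example of working backwards from that question (entirely hypothetical account and amount names, not anyone's real system):

    def post_payment(balances, account, amount):
        # Refuse garbage before it ever reaches the ledger.
        if not isinstance(amount, (int, float)) or amount != amount or amount <= 0:
            raise ValueError(f"bad amount: {amount!r}")
        if account not in balances:
            raise KeyError(f"unknown account: {account!r}")
        balances[account] += amount
        return balances[account]

    # "How can I break this?" -- the worst cases must fail loudly, not corrupt the books.
    ledger = {"ACME": 100.0}
    for bad in (0, -5, float("nan")):
        try:
            post_payment(ledger, "ACME", bad)
        except ValueError:
            pass  # good: rejected cleanly, balance untouched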

The end result is, generally, quite robust software that can handle cases that weren't tested for but were assumed, data points outside the elements we fed it in test but valid for the system, and errors that never occurred in the idealized test cases but happen all over the place when the code is put into the real world with real data.

Like most software engineers who've got 20+ years of experience in this industry, the scars of the past help me avoid future ones. Great developers always avoid mistakes of the past and try to minimize future ones.

QA is not my enemy. They are the final check on my assumptions and are there to back me up when I forget something. But ultimately the responsibility is mine, since I am expected to know the systems I build better than they do and to know how they'll fail. I do, and my low to non-existent defect counts during test and production lifecycles are testament to that.

As you can tell from this thread, I'm not the only one who knows how to play the game and do it well.


jsid-1259706700-617045  Rick R. at Tue, 01 Dec 2009 22:31:40 +0000

Sarah,

Another way to help your code along is to do some definite "out of the envelope" testing after you've tested "known" cases. ("Case testing.")

Enter data that just plain doesn't make sense in your field -- stars that "emit cold", or are fusing at room temperature, etc.

If you get results that look "reasonably normal" from utter GIGO data -- you've got bad code.
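
A toy version of that kind of GIGO test (everything here is hypothetical; the point is only that impossible inputs must not come back looking like a normal result):

    import math

    def estimate_trend(temps_kelvin):
        # Hypothetical stand-in for a real model; the point is the sanity gate up front.
        if any(math.isnan(t) or t <= 0 for t in temps_kelvin):
            raise ValueError("non-physical temperature in input")
        return (temps_kelvin[-1] - temps_kelvin[0]) / len(temps_kelvin)

    # Feed it utter garbage; a sane model refuses to answer rather than producing a "result".
    try:
        estimate_trend([288.0, -40.0, float("nan")])
        print("BAD: the code happily digested garbage")
    except ValueError:
        print("GOOD: garbage in was rejected")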

That's one of the problems with some of the AGW models -- no matter how funky the data you give them, they keep coming back with the same basic result: "People making planet warmer, oceans will rise -- BAD". Even if you feed them utterly ridiculous data.

This has been known AND PUBLISHED for well over a decade now. But it has been ridiculed as "Global Warming Denial", when in fact it simply shows the friggin' models are garbage. (SCIENTIFIC answer: "Make better models that can replicate known results and DON'T have a bias to always return the same basic results.")


jsid-1259732913-617052  GrumpyOldFart at Wed, 02 Dec 2009 05:48:33 +0000

But it has been ridiculed as "Global Warming Denial", when in fact it simply shows the friggin' models are garbage.

And the very fact of the ridicule, the opacity and the coverup strongly suggests that those relying on the models for their conclusions knew they were garbage, and refused to take out the trash and try again because it gave them the results they wanted.

In the end, it boiled down to a choice between getting their pet theory accepted or keeping their integrity. Personally, I'm not very impressed with their judgment.


jsid-1259758318-617055  Sendarius at Wed, 02 Dec 2009 12:51:58 +0000

And the very fact of the ridicule, the opacity and the coverup strongly suggests that those relying on the models for their conclusions knew they were garbage, and refused to take out the trash and try again because it meant that they kept getting .gov grants.

There you go, fixed it for you.


jsid-1259762527-617056  geekWithA.45 at Wed, 02 Dec 2009 14:02:07 +0000

Sarah, your approach is sane and sensible. The enemy is code that implements overly specific algorithms that account for only a narrow range of conditions.

Essentially, code is written to specifications. Specifications are derived from requirements. Requirements are extracted from people who are (hopefully!) content experts, but don't know dick all about the best practices of robust software engineering...and it's your software architect's job to render that mess into something sane and actionable.

Typically, the process begins with the content expert (or worse: some sales guy blindly relaying the CE's wishes) stating something like "I want a program that XYZs", where XYZ is usually the simplest, mainstream case he knows of. XYZ may not even represent the *normal* case, and a lot of the time, it doesn't even represent a coherent statement of the problem space to be solved. It represents *his* solution to *his* guaranteed-to-be-incomplete, and possibly even wildly faulty, understanding of the problem space.

Running off and building XYZ is rarely the right thing to do.

Probing the reason he wants XYZ, to discover what his problem really is, is always the right thing to do.


jsid-1259765800-617057  Rick R. at Wed, 02 Dec 2009 14:56:40 +0000

GrumpyOldFart:

I wouldn't even go so far as to label Anthropogenic Global Warming as a "theory".

They haven't even come within Hubble Telescope range of meeting the requirements for a "theory".

What they have is a "hypothesis" with little or no data to support it, and a Hell of a lot of data that refutes the data that they DO have. For example, historically, CO2 increases FOLLOW warming periods. We know this from several wildly different approaches which all yield the same basic result -- "Planet gets warmer, THEN CO2 increases."

If A FOLLOWS B, then A CANNOT be the CAUSE of B -- at best, it's an EFFECT of B. (Or, it could even be a correlation with zero causative relationship.) Unless the AGW crowd can provide a decent explanation that incorporates a time machine, this indicates that CO2 IS NOT a major cause of global warming.

So they have a BUSTED hypothesis, that they have been passing off NOT as a hypothesis, not even as a theory -- but have been treating it as an established "Natural Law". While forgetting that even "Natural Laws" are subject to being revisited if new data contradicts them in any respect. (If I am not mistaken, not a SINGLE "Natural Law" of 500 years ago is accepted as such today. Many were revised or discarded in the last 100 years alone.)


jsid-1259766645-617059  Rick R. at Wed, 02 Dec 2009 15:10:45 +0000

Slight edit:

". . . and a Hell of a lot of data that refutes the _explanations_of_the_ data they DO have."


jsid-1259767233-617060  Ed "What the" Heckman at Wed, 02 Dec 2009 15:20:33 +0000

PJTV went to Copenhagen to try to find out what the greenies think about Climategate. Their answers were fascinating.

As for software development, I've been spending the last several months working on reconciling records between two relational databases. The previous developer had apparently not thought about simple techniques like… Testing! Even worse (in the relational database world), the relations between records in the old system were based on values that could change (the owner of a truck or trailer), and the relations in the new system are based on a single field which is not supposed to be unique. Furthermore, when they tried to update the records in the old system (using the SQL UPDATE statement) when one of the key values did change (when it shouldn't have, BTW), the UPDATE would not work. The developer obviously didn't do anything more than cursory testing. Now I'm stuck cleaning up the mess.

Unlike poor Harry, I am close to finishing a fix for most of this mess, though it's late and way over budget.

When I see code that's a disaster from guys who are supposed to be "professional programmers", it's no surprise that the CRU code produces meaningless answers and doesn't have any logical structure.


jsid-1259768428-617065  Bilgeman at Wed, 02 Dec 2009 15:40:28 +0000

U-J:
"I'm sure he's employing a bit of poetic license there. But OTOH, I've seen many a diesel engine be kept running rather than overhauled due to "minor" issues."

Without doubt, just FMI, were these marine diesels or in trucks and earth-moving equipment?

Deferring maintenance is not a cost-effective measure, especially with a diesel, since when you lose enough rings, you lose cylinder compression, and when you lose cylinder compression, a diesel doesn't "diesel" and is simply a very large and heavy and expensive piece of sculpture.

Heck Maersk and even Sea-Land,("Cheap-Land" as it was known to those of us who sailed it), would spring for new injectors and rings and even Sulzer cylinder liners...a job of work even LESS fun than changing rings.

"But that aside, he wasn't really talking about the diesels, but the bunker fuel oil use - and that might well be much darker than normal or well-tuned diesel exhaust."

Nope, not really. I sailed the Sea-Land Voyager and the Sea-Land Developer, both D-9's (turbocharged two-stroke slow-speed 9-cylinder Sulzers), which burned HFO (heavy fuel oil), and they emitted light brown haze.

"Not that I'm trying to impute your hands-on knowledge, but when you're burning that "grade" of oil, which would otherwise at best be used for asphalt, aren't there a lot of solids released?"

We have centrifugal purifiers for the fuel oil which remove the water and the sand and most other particulate and liquid impurities before we pump it to the day-tanks for the fuel pump to take a suck on and feed it to the injectors. Injectors do not LIKE particulates, since they erode the orifice and the needle valve seats, and if your injectors are "dribbling" rather than "misting" the fuel charge, you again have a very large piece of sculpture.
Get a job as a 3rd Engineer on a motorship and you will essentially "marry" the purifiers.
The 2nd A/E usually gets to dick around with swapping injectors,(RHIP!).

The reason we use HFO is that it is cheap. This is a very essential consideration when your consumption rates are measured in 40-gallon barrels per hour...tons per day.

We could run on lighter stuff, but only the Navy and Coast Guard can afford more highly refined MDO,(Marine Diesel Oil).

What the Enviro-Luddites are REALLY buggin' about on the water is nitrogen oxide (NOx) and sulfur emissions.

Look, most of this is little more than a stalking horse to actually wage economic warfare against each other...much like the Montreal Protocol on refrigerants was designed to make you buy a new air-conditioner or heat-pump. That's all THAT was really about. A tight A/C or reefer system is a tight system. When it's no longer "tight", you don't A/C or refrigerate very well for very long.
You can let it suck extra R-12 or R-22 all you like, but at a head pressure of 250 psi or so, that gas isn't going to stay in the system long at all.


jsid-1259776772-617073  Sarah at Wed, 02 Dec 2009 17:59:32 +0000

Adam, Matt, Rick, geek: thanks very much for the detailed explanations -- they helped!


jsid-1259785475-617090  DJ at Wed, 02 Dec 2009 20:24:35 +0000

"The reason we use HFO is that it is is cheap."

And those big diesels are among the most efficient engines around in terms of shaft horsepower-hours of energy produced per pound of fuel consumed. The combination makes for cheap shipping.


jsid-1259790578-617101  Adam at Wed, 02 Dec 2009 21:49:38 +0000

What Sarah wrote:

"Thanks very much for the detailed explanations -- they helped!"

What Sarah is probably thinking:

"...they helped confirm that you people are nuts."


jsid-1259791616-617102  Ed "What the" Heckman at Wed, 02 Dec 2009 22:06:56 +0000

Adam, she already knew that. ;)

Is anyone else amazed that we now have a 60+ comment thread without Marxy?


jsid-1259792278-617103  Rick R. at Wed, 02 Dec 2009 22:17:58 +0000

Adam:

I'm not nuts. The guys running the CIA mind control satellites are nuts. So when I do or say nutty things -- it's their fault.


jsid-1259801161-617113  Kevin Baker at Thu, 03 Dec 2009 00:46:01 +0000

Is anyone else amazed that we now have a 60+ comment thread without Marxy?

Now that you mention it . . .


jsid-1259851006-617149  Rick R. at Thu, 03 Dec 2009 14:36:46 +0000

It's kind of hard to blame academic misconduct combined with a massive conspiracy to commit fraud by these guys on Bush.

Accordingly, no Marxy.


jsid-1259852635-617154  Adam at Thu, 03 Dec 2009 15:03:55 +0000

"It's kind of hard to blame academic misconduct combined with a massive consipracy to commit fraud by these guys on Bush."

He already *did*. He's excused their conduct basically with the "right-wingers made them do it" defense.


jsid-1259852792-617155  Adam at Thu, 03 Dec 2009 15:06:32 +0000

For the link, it's here:
http://www.haloscan.com/comments/khbaker/6993202387669259908/#616309


jsid-1259855504-617158  Rick R. at Thu, 03 Dec 2009 15:51:44 +0000

Hmmph. Missed that.

Got it. The destruction of the relevant raw data (and the concealment of that fact) in the 1980's was justified by the 1997 opposition of "conservatives" (funny how 95 out of 100 senators are considered "conservatives", when only 55 Senators were Republicans) to the Kyoto Protocol.

That huge conservative, Bill Clinton, didn't even bother submitting the Kyoto Protocol to the Senate for ratification, so that also justified the destruction of the data (and the concealment of that fact), almost 15 years earlier.

Yeah, because AGW proponents conduct regular Ouija Board seances with Nostradamus, so they knew in advance that "conservatives" (like Bill Clinton and 40 Democratic senators) would reject their science to the point where the AGW crowd would have to obfuscate, exaggerate, deny, obstruct, and destroy data -- only so they could get a fair hearing, that is.


jsid-1259856399-617160  Ed "What the" Heckman at Thu, 03 Dec 2009 16:06:39 +0000

"funny how 95 out of 100 senators are considered "conservatives", when only 55 Senators were Republicans"

According to Marxy, there are no liberals, only conservatives (which is anyone to the right of him) and centrists (who are just like him). It's all in the definitions.

"`When I use a word,' Humpty Dumpty said, in rather a scornful tone, `it means just what I choose it to mean -- neither more nor less.'

`The question is,' said Alice, `whether you can make words mean so many different things.'

`The question is,' said Humpty Dumpty, `which is to be master -- that's all.'"

—from Through The Looking Glass by Lewis Carroll


jsid-1259877092-617199  GrumpyOldFart at Thu, 03 Dec 2009 21:51:32 +0000

The problem began (as usual) with the right's complete inability to admit fault or responsibility for...well...anything. Not surprising and very typical.

Then it became their mission to point out ANY data that even remotely contradicted the climate change theory as being proof positive that the entire theory was wrong and always will be, Amen.


Or in short, "The Right Wingers forced them to fudge the data and the methodology by refusing to support their position solely because they said so. By demanding solid proof, the Evil Right(tm) forced them to fabricate some. See? It's all their fault!"


 Note: All avatars and any images or other media embedded in comments were hosted on the JS-Kit website and have been lost; references to haloscan comments have been partially automatically remapped, but accuracy is not guaranteed and corrections are solicited.
 If you notice any problems with this page or wish to have your home page link updated, please contact John Hardin <jhardin@impsec.org>