Discussion Forum—A Way with Words, a fun radio show and podcast about language

Discussion Forum (Archived)

Worse is Better
EmmettRedd
1
2015/10/07 - 11:12am

A Wikipedia article describing "Worse is Better" has a couple of suggestions for improving the entry. One asked for a reorganization to comply with Wikipedia's layout guidelines. However, the Worse-is-Better paradigm has these two jewels:

Simplicity is the most important consideration in a design.
The design must not be overly inconsistent. Consistency can be sacrificed for simplicity in some cases...

The request for reorganization is from 2012. I guess the authors want a simple presentation and have sacrificed consistency for a few years now.

Guest
2
2015/10/08 - 9:40am

I see the irony. Funny. Do you suppose the author of that entry was purposely trying to make it self-referential in an ironic sense? Curiously, if you look at Wikipedia's Manual of Style (which is linked to in that request for editing), it says:

"This guideline is a part of the English Wikipedia's Manual of Style. Use common sense in applying it; it will have occasional exceptions."  [emphasis mine]

So maybe that's why the entry has stood unedited for close to 4 years. On the other hand, if the irony had been seen by the editor who posted that comment, he (or any other editor) could have removed it. More than likely, that entry has just been "lost" in the ever-increasing corpus of Wikipedia. I don't expect there are a lot of searches for "worse is better."

EmmettRedd
3
2015/10/09 - 12:51pm

I did not look for it either. I looked up "perfect is the enemy of the good," and it was a link in that Wikipedia entry.

Guest
4
2015/10/09 - 6:15pm

Not so surprising it took you there then. Google's search algorithms are really improving at matching "meaning" ... pretty amazing tech (whatever your search engine). So I understand the concept of "perfect is the enemy of good" and I'm sure it's been around awhile. But truthfully, I hadn't heard it till about 5 years ago when I needed lens surgery on my eyes. Had to ask the ophthalmologist what he meant. Made perfect sense in regard to additional (optional) procedures.

Weird how one can go so long and never run into what seems to be a common phrase. FYI, grew up in the Midwest, been in Arizona since 1979.

EmmettRedd
5
2015/10/09 - 7:12pm

Google did not find it; IT was a link in the Wikipedia article that Google found for 'perfect is the enemy of the good'. Similar to you, I probably heard the 'perfect' saying in the last 5 to 10 years.

Guest
6
2015/10/11 - 1:43pm

I think you'll find this interesting. Check out the Ngram for "perfect is the enemy of". Clear spike in the 90s, but if you follow the links to earlier times (when the graph appears flat) you'll see examples of usage as far back as 1800. Where has that phrase been hiding all these years?  🙂

EmmettRedd
7
2015/10/11 - 5:04pm

I looked at the Ngram and the "early" references. When I limited the search to the 19th century, there were only two books, and they both appeared spurious (which is a problem with the automatic OCR that Ngram uses). Looking later, the earliest book I found is from 1941.

Guest
8
2015/10/12 - 10:05am

You are correct sir. Never noticed that. Always just trusted the date groupings and labels. Will look more closely next time. For example, selecting the 1800-1961 corpus, the third entry seems to show the phrase used in the Congressional Record for 1873. Following that link shows it's actually the Congressional Record for 2001. Lesson learned, thanks.

Guest
9
2015/10/14 - 7:26am

Did I misread, or do I somehow get the sense that y'all consider the Wiki quoted at top ('Worse is Better') to be a prank? No, it is serious, not a prank.

You will find expressions in the same vein going back to antiquity. This mid-modern one is attributed to Voltaire: The best is the enemy of the good.

Another version of the same idea, by none other than Shakespeare:

Were it not sinful then, striving to mend,
To mar the subject that before was well?

Actually that's an eerily apt remonstration to today's hordes of computer programmers and designers, who won't quit putting out new versions with only marginal improvements, only to aggravate users who're already used to the older versions and now have to waste time adapting to the new. To translate Shakespeare's above: If it ain't broke, don't fix it!

EmmettRedd
10
2015/10/14 - 9:06am

RobertB said

Did I misread, or do I somehow get the sense that y'all consider the Wiki quoted at top ('Worse is Better') to be a prank? No, it is serious, not a prank.

No, I consider it serious.

I was only commenting on the circumstance that a Wiki stating that everything (including consistency) can be sacrificed for simplicity has, for several years now, been ignoring the editors' call for consistency.

deaconB
11
2015/10/14 - 10:09am

RobertB said

Actually that's an eerily apt remonstration to today's hordes of computer programmers and designers, who won't quit putting out new versions with only marginal improvements, only to aggravate users who're already used to the older versions and now have to waste time adapting to the new. To translate Shakespeare's above: If it ain't broke, don't fix it!

Someone did a study about 25 years ago, with the intent of showing that certain languages were a lot less expensive to program in, and they found something else entirely.

If you work in a lower-level language, you may have to write 20 lines of code to accomplish what can be done in one line of a higher-level language.  The study was intended to let managers demand everything be coded in Java or a 4GL instead of assembly language or COBOL.
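
A toy Java sketch (my own made-up example, nothing from that study) of the same line-count gap: the stream version is a single expression, while the hand-rolled loop takes several statements to say the same thing.

    import java.util.List;

    public class LineCountDemo {
        public static void main(String[] args) {
            List<Integer> values = List.of(3, 1, 4, 1, 5, 9);

            // "High-level": one expression using the streams API.
            int sumHigh = values.stream().mapToInt(Integer::intValue).sum();

            // "Lower-level": the same result spelled out step by step.
            int sumLow = 0;
            for (int i = 0; i < values.size(); i++) {
                sumLow = sumLow + values.get(i);
            }

            System.out.println(sumHigh + " == " + sumLow);   // 23 == 23
        }
    }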

It turns out that the best programmers were ten times as productive as some others, and when they use their favorite language, they are exceptionally productive.  Half of all programmers, when they maintain code, end up bebugging it rather than debugging it.  And although some languages seem better than others for a given task, that's not really true, either.  The leading COBOL compiler for the PC, fast, tight, small, and bug-free, is written in COBOL, which is just about the last choice anyone would make for a "systems" language.  But their exceptionally small team of developers knew the language inside and out, and it worked out well for them.

If I were hiring programmers, I'd ask them whether they believe "If it ain't broke, don't fix it" or "The Cardinal Virtues of a programmer are hubris, laziness, and impatience."  I wouldn't even consider the "ain't broke" crowd.  If they are impatient with a bug, they have the arrogance to fix it without being told, and they are lazy enough to do it right the first time.  If I don't look like I'm busy, it's because I did it right the first time.

Guest
12
2015/10/14 - 6:38pm

deaconB said: If you work in a lower-level language, you may have to write 20 lines of code to accomplish what can be done in one line of a higher-level language.  The study was intended to let managers demand everything be coded in Java or a 4GL instead of assembly language or COBOL.

Indeed. My first program was written on punch cards and processed by a Burroughs B5500 mainframe. These days I do mostly Java. Check out this page from my blog for an example.

But to get back on topic, I totally agree with RobertB that software development fails the "perfect is the enemy of good" rule. I try to avoid upgrades as long as possible, only recently migrated from XP to W7, and need to be dragged kicking and screaming into any new OS.

Guest
13
2015/10/16 - 12:51am

Your clocks are very cool. Looking at your clocks, I am reminded of a question I never knew the answer to: Why is an atomic clock considered better than any other kind? Obviously there are scientific processes that work better if keyed to an atomic clock. But philosophically, if a sand clock is off by so much from an atomic clock, the atomic clock is also off by that much from the sand clock. What other standards out there provide the rationale to favor the one over the other? Perhaps consistency of agreement among specimens within a kind is one standard. But between any 2 specimens, it seems to me neither one is superior to the other.

deaconB
14
2015/10/16 - 5:40am

That's why we have established standards.

The POSIX standard says that we express time as the number of seconds elapsed since midnight UTC at the start of January 1, 1970, figuring 60 seconds per minute, 60 minutes per hour, 24 hours per day, 365 days per year, with an extra day added each year if the year is divisible by four, unless it's divisible by a hundred, except that there's an extra day if the year is divisible by 400.

But that's wrong, because we've added extra seconds to account for the fact that the Earth is grinding to a halt.  We've added 26 leap seconds since then.
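
A minimal Java sketch of that epoch-seconds bookkeeping (an illustration only; java.time, like POSIX time, pretends every day is exactly 86,400 seconds long, so leap seconds never show up in the count):

    import java.time.Instant;
    import java.time.ZoneOffset;
    import java.time.ZonedDateTime;

    public class EpochSeconds {
        public static void main(String[] args) {
            // Midnight UTC at the start of 2015-10-16, built from calendar fields.
            ZonedDateTime moment = ZonedDateTime.of(2015, 10, 16, 0, 0, 0, 0, ZoneOffset.UTC);

            // Seconds since 1970-01-01T00:00:00Z, counted with 86,400-second days;
            // the 26 leap seconds inserted since 1972 are ignored, which is why this
            // count drifts from what an atomic (TAI) clock would report.
            long posixStyleSeconds = moment.toEpochSecond();
            System.out.println(posixStyleSeconds);                         // 1444953600

            // Going the other way: rebuild the calendar instant from the raw count.
            System.out.println(Instant.ofEpochSecond(posixStyleSeconds));  // 2015-10-16T00:00:00Z
        }
    }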

Of course, we have two ways of defining a second. It's the duration of 9,192,631,770 periods of the radiation corresponding to the transition between the two hyperfine levels of the ground state of the caesium 133 atom.  It's also the time required for light to travel 299,792,458 meters in a vacuum. 

And the speed of light in a vacuum is only a constant because we define it as a constant. Physics currently states that "simultaneous" doesn't really exist at a distance.  But everything is defined in terms of other things, and if we don't decide that something is to be defined as a constant, the cgs/MKS system starts to eat itself alive, pretty much like Lucy trying to even up the legs on the kitchen chair so it won't wobble.

People who defend global warming on the basis that it's a scientific fact get upset with me when I point out that scientific facts are virtually guaranteed to be falsehoods.  Nobody has been willing to label something a scientific law for over a century, and every scientific law is known to be incorrect.  Even cogito ergo sum is indefensible.  We issue birth certificates not to individuals, but to something that is about 20 trillion organisms, most of which have a lifetime of less than an hour.

But is one measure better than another?  When I bake bread, I use the blue cup to measure flour, the clear glass one to measure water, and I eyeball a pile of salt in the palm of my left hand, and I don't make any claims to accuracy.  But when I go to make something new to me, I want to use a standard set of measures until I know what it's supposed to end up as.  And in that context, a measure is better if it plugs into the cgs or MKS system without sawing off all the other legs of the chair.

Guest
15
2015/10/16 - 12:17pm

Heimhenge said

Indeed. My first program was written on punch cards and processed by a Burroughs B5500 mainframe. These days I do mostly Java. Check out this page from my blog for an example.

I love the skeuomorphic numerals.

EmmettRedd
16
2015/10/16 - 12:31pm

Glenn said

Heimhenge said

Indeed. My first program was written on punch cards and processed by a Burroughs B5500 mainframe. These days I do mostly Java. Check out this page from my blog for an example.

I love the skeuomorphic numerals.

After looking up skeuomorphic and realizing Heimhenge's display was 7-segment, I wondered about skeuomorphic nixie tubes. Google quickly found these.

deaconB
17
2015/10/17 - 3:20am

When using military time, is 30 minutes past midnight 24:30, or is it 00:30?  Is midnight itself 24:00 or 00:00?

When doing calculations of elapsed time, using 00:00 would be more convenient, but if one writes 00:10, there would be a tendency to drop the leading 00: and turn 10 past midnight into 10 AM, which could be more than slightly unfortunate for coordinating military movements.

For many years, Indiana did not observe DST, and if you lived near the state line, you had to ask "Is that fast time or slow time?" when being told of an event across the state line.  It would have been especially tough, I imagine, if you lived in Union City, which is bisected by the state line.  I wonder how much it costs the networks to say "at 8, 7 central", and airlines incur costs when passengers arrive at the wrong time.  Losing sleep causes traffic accidents and health problems.  We oughta just put the entire country on UTC/Zulu with no DST.

Guest
18
2015/10/17 - 9:38am

deaconB asked: When using military time, is 30 minutes past midnight 24:30, or is it 00:30?  Is midnight itself 24:00 or 00:00?

Midnight is definitely 0000 and 30 minutes later it's 0030. I don't think in most cases they bother with the colon separating hours from minutes.
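
That's also exactly what a 24-hour, no-colon format pattern produces in Java, for what it's worth (just a toy check of the convention, not any military spec):

    import java.time.LocalTime;
    import java.time.format.DateTimeFormatter;

    public class MilitaryTime {
        public static void main(String[] args) {
            // "HH" is the zero-padded 24-hour hour-of-day field, so midnight prints as 0000.
            DateTimeFormatter hhmm = DateTimeFormatter.ofPattern("HHmm");

            System.out.println(LocalTime.MIDNIGHT.format(hhmm));   // 0000
            System.out.println(LocalTime.of(0, 30).format(hhmm));  // 0030

            // There is no 2430 here: LocalTime only runs from 00:00 up to 23:59:59.999999999,
            // so a "24:xx" reading has to be expressed as 00:xx of the next day.
        }
    }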

Arizona doesn't use DST either, but a couple of the reservations in the state have opted in ... up to them as "sovereign nations" within our borders. When I have to schedule with a client in another time zone, I always check the World Time Server (which I keep in my browser's favorite links in the top menu). I skipped that check once and ended up calling for the interview an hour late. Very embarrassing. Vowed it would never happen again.
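
The same cross-check can be scripted, too; a rough Java sketch (the zones and the 2 pm meeting time are just made-up examples):

    import java.time.LocalDateTime;
    import java.time.ZoneId;
    import java.time.ZonedDateTime;

    public class InterviewTime {
        public static void main(String[] args) {
            // A call scheduled for 2 pm in the client's zone (Eastern, in this example).
            ZonedDateTime clientTime = LocalDateTime.of(2015, 10, 19, 14, 0)
                    .atZone(ZoneId.of("America/New_York"));

            // The same instant in Arizona, which stays on MST year-round (no DST).
            ZonedDateTime arizonaTime = clientTime
                    .withZoneSameInstant(ZoneId.of("America/Phoenix"));

            System.out.println(clientTime);   // 2015-10-19T14:00-04:00[America/New_York]
            System.out.println(arizonaTime);  // 2015-10-19T11:00-07:00[America/Phoenix]
        }
    }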

I wouldn't mind putting the whole planet on UT. Would make my job a helluva lot easier, and that "Local to UT" page on my blog obsolete. But that's not going to happen. UT is perfect for coordinating global military operations, or announcing when an astronomical event like an eclipse or ISS flyover will occur. But the average person would balk at having to re-associate those times with normal daily events like rush hour, lunch break, sunset, etc. Hell, we couldn't even get them to buy into the metric system (at least here in the US).

Glenn said: I love the skeuomorphic numerals.

I had to look up that word also. I knew the look I was going for. Coding the clock to use a standard font (like the Windows system clock) would have been way easier, but I wanted something that looked more "immediate" and "techy," so I went with those 7-segment numerals. Curiously, to this day, it actually looks like those red LEDs are recessed behind the front surface of my monitor. I just can't see it as "flat" however hard I try. I know that's an illusion happening in my brain, but I can't make it go away. I asked my sister, a graphic artist, how it looked to her and she said she sees no depth at all. Looks flat to her. But then, she's never built an electronic LED display.
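
For anyone curious how that kind of display is usually driven, here's a bare-bones Java sketch of the standard segment-table idea (not the code behind my clock): each digit maps to a subset of the seven segments a through g, and the renderer lights only those.

    public class SevenSegment {
        // Segments in the usual order a,b,c,d,e,f,g; true = lit.
        // For example, 8 lights every segment, while 1 lights only b and c.
        private static final boolean[][] DIGITS = {
            {true,  true,  true,  true,  true,  true,  false}, // 0
            {false, true,  true,  false, false, false, false}, // 1
            {true,  true,  false, true,  true,  false, true }, // 2
            {true,  true,  true,  true,  false, false, true }, // 3
            {false, true,  true,  false, false, true,  true }, // 4
            {true,  false, true,  true,  false, true,  true }, // 5
            {true,  false, true,  true,  true,  true,  true }, // 6
            {true,  true,  true,  false, false, false, false}, // 7
            {true,  true,  true,  true,  true,  true,  true }, // 8
            {true,  true,  true,  true,  false, true,  true }, // 9
        };

        // Render one digit as three rows of ASCII "segments".
        static String[] render(int digit) {
            boolean[] s = DIGITS[digit];
            return new String[] {
                " " + (s[0] ? "_" : " ") + " ",                                // a
                (s[5] ? "|" : " ") + (s[6] ? "_" : " ") + (s[1] ? "|" : " "),  // f g b
                (s[4] ? "|" : " ") + (s[3] ? "_" : " ") + (s[2] ? "|" : " "),  // e d c
            };
        }

        public static void main(String[] args) {
            // Shape the digits 1 2 3 4 the way a segmented clock face would.
            int[] digits = {1, 2, 3, 4};
            StringBuilder[] rows = {new StringBuilder(), new StringBuilder(), new StringBuilder()};
            for (int d : digits) {
                String[] glyph = render(d);
                for (int r = 0; r < 3; r++) rows[r].append(glyph[r]).append(' ');
            }
            for (StringBuilder row : rows) System.out.println(row);
        }
    }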

deaconB
19
2015/10/17 - 11:30am

About 1970, I found myself using a calculator that was a little larger than a Selectric typewriter and had a Nixie tube display.  It cost something like $5,000, I was told, in an era when the minimum wage was $2,000/year.

I used it about an hour a week for a year, and it drove me batty that different digits were located at different depths. 

I'm not sure what the point would be to have a seven-segment Nixie display.  The whole point of a Nixie tube was that you had digits that were properly shaped; that is, the bottom half of a three was an incomplete circle, and the top was a straight line, with another straight line running diagonally from the right end of the straight line to the upper left end of the circular arc.  I remember finding Nixie tubes in the Allied catalog, and thinking they were horribly expensive, but I have no idea if that was $20 per tube or $200 per tube, just that they were outrageously expensive.  Not overpriced, considering how difficult they would have been to manufacture, but merely unaffordable to mere mortals.

Not sure exactly when the Allied catalog was discontinued.  It was an incredibly educational manual for Gyro Gearloose wannabes.  I bought from them regularly, and I've never found Radio Shack a satisfactory substitute.  And now, Radio Shack is likely to disappear, from what I read.

Guest
20
2015/10/19 - 12:42pm

Heimhenge said

deaconB asked: When using military time, is 30 minutes past midnight 24:30, or is it 00:30?  Is midnight itself 24:00 or 00:00?

Midnight is definitely 0000 and 30 minutes later it's 0030. I don't think in most cases they bother with the colon separating hours from minutes.

A while back, I did some extensive internet searches and came up with real military examples of midnight as 2400 and 0000. It seems it only appeared as 2400 when it was the END point of a time range, but my memory might be flawed. I don't think I found anything other than 0001 - 0059 for the minutes AFTER midnight.

post on 24-hour time

[edit: added the following]
True to form, my memory was flawed. In rereading my own post, I see clear examples of 2400 as the start of a time range.
