

No, I literally drive a school bus. I like the gig, but as a manager he is making something like eight times what I make (and probably a lot more than that).


Lol we were all laid off. He’s now a manager at Comcast and I drive a school bus.


I’m currently reviving a personal iOS project that I last worked on almost 10 years ago. At the time, I was working under a (much younger) tech lead who was a firm advocate of the “all comments are bad” philosophy and reported me to management as being technically incompetent because I commented my code. Thank god I’m technically incompetent because there’s no fucking way I could be making any sense of my 10-year-old code without those comments.
Somebody here is probably going to reply that nobody literally thinks all comments are bad, but I assure you that such people do exist in this profession.


Lol I haven’t coded on paper first since I started programming … in the '70s on my friend’s Commodore-20.

During the last two years of Clinton’s presidency, we had an actual fucking budget surplus. We could have been debt-free as a nation now, instead of sitting on nearly $40 trillion owed.


I was fine with mentoring junior developers until my manager decided pair programming was the way to go. I’m happy to help and teach, but like fuck am I going to sit at the same goddamn computer with some maroon all day. Can’t even power-nap properly.


I wrote mobile apps for Blackberry back in the day. As part of their security fixation, all library modules you incorporated had to be signed as your app was compiling, even if you were just testing out a single line change. This could make your app take upwards of a whole hour to sign, if the signing servers were even up and running at all; they were often down completely, which meant I could go home and get high instead of working. Which is why I never badmouthed Blackberry to my bosses.
The absurdity of having every module signed meant that I had to think long and hard about whether I wanted to use built-in library functionality or just roll my own code. For one UI I needed to use trigonometry functions. These were located (logically or not) in one of the encryption modules, which were especially prone to taking a long time to sign, so I ended up writing my own sin() function (in Java) just to save myself ten minutes of compilation time.


From the river to the C


My favorite:
for (int i = myArray.Length; i --> 0; )
{
    // do something
}
Perfectly valid in C-style languages, even if it does look a bit puzzling at first.


I started coding professionally using Visual Basic (3!). Everybody made fun of VB’s On Error Resume Next “solution” to error handling, which basically said if something goes wrong just move on to the next line of code. But apparently nobody knew about On Error Resume, which basically said if something goes wrong just execute the offending line again. This would of course manifest itself as a locked app and usually a rapidly-expanding memory footprint until the computer crashed. Basically the automated version of this meme.
BTW just to defend VB a little bit, you didn’t actually have to use On Error Resume Next, you could do On Error Goto errorHandler and then put the errorHandler label at the bottom of your routine (after an Exit Sub) and do actual structured error handling. Not that anybody in the VB world ever actually did this.


Mediocre pop star.


Visual Basic isn’t dead … it’s just resting!


I forewent (?) skin-washing instead. Now I take a shower like once every two weeks. I ask people periodically if I stink and nobody says I do, so I dunno. TBF I also forego most of my tasks and most of my unconsciousness as well.


OK, which one of the things I mentioned do you think is vastly and objectively superior to all others? Genuinely curious here.


I’ve worked professionally on Windows and Mac; using Visual Basic, C#, Java, Objective-C and Qt Creator (which is C++ and Javascript); for web apps, desktop applications, and mobile apps (iOS, Blackberry and Android). I have my personal preferences but they’re all viable platforms/languages/frameworks/devices and anything that needs doing can be done on them one way or another. The idea that one of these is vastly and objectively superior to all others is just pseudo-religious nonsense.


Windows Phone was great. I’d done Windows Mobile since 2005 and it was nice to be able to continue developing with C#/.NET and Visual Studio (back when it was still good) in a more modern OS. One thing that really spoiled me permanently was being able to compile, build and deploy the app I was working on to my test device effectively instantaneously – like, by the time I’d moved my hand over to the device, the app was already up and running. Then I switched to iOS, where the same process could take minutes, and then Blackberry, where it might take half an hour or never happen at all.
Funny thing: RIM was going around circa 2010/2011 offering companies cash bounties of $10K to $20K to develop apps for Blackberry, since they were dying a rapid death but were still flush with cash. Nobody that I know of took them up on the offers. I tried to get my company to make a Windows Phone version of our software but I was laughed at (and deservedly so).


so you can focus on what really matters…
~~meetings!~~ collecting unemployment!


That would be incredibly ironic given that they completely fucking gave up on mobile devices when the iPhone came out.


It’s kind of funny how eagerly we programmers decry “premature optimization”, when often the optimization isn’t premature at all but genuinely necessary. A related problem is that programmers often have top-of-the-line gear, so code that runs acceptably well on their equipment is hideously slow on normal people’s machines. When I was managing my team, I encouraged people to develop on out-of-date devices (or at least to test their code on them once in a while).


Visual Sauce Safe, for us oldheads.