Slashdot post asks (in a nutshell): Whose responsibility is it for training, anyway? Yours? Or your company’s?
To run the risk of sounding like some cliche: It depends now, don’t it?
If the technology is something proven -- and by "proven" I mean it's something the company already knows that it wants you to know -- then I say that it's their responsibility. There are certainly some caveats to this. If you were expected to know it coming into the job, tough shit. Proprietary technologies and obscure technologies? Fuck it! It's on the corporate account. Extensions of knowledge you already have? There's a fun grey area. Minor additions to your knowledge you should be able to pick up on the job (they can spare an hour or two for you to read an article) or on your own time. Major new techniques? They should spring for it.
If the technology is just something you think is neat or fun? Well, that’s all you, baby. The whole world may be talking about Ruby On Rails but what’s that got to do with you? Of course, if they asked YOU to experiment with that Ajax shit and you’ve got positively no interest in it, then maybe they should make it worth your while. (At least slip the man a couple $20s for the lit.)
Of course none of the above hypotheticals really addresses the question posed in the piece. W/r/t having been left (legacy-wise) with a system that's (probably) mission-critical and suddenly unsupported. So in that case... that manager better have the budget left to train, or they're all screwed.
currently playing: Skylab2000 “Auburn”