Highlights of the Link Discussion List


Tony Barry's Link list is a thoughtful, unmoderated, Australian mailing list for discussing the Internet, information management, censorship, privacy and the like.  Link began in January 1995 and I have been an active member since December 1995.   There have been many fascinating discussions.

The Link List home page is:  mailman.anu.edu.au/mailman/listinfo/link    

The archives can be searched and browsed at:  www.anu.edu.au/mail-archives/link/ .

On 15 June 2002 I started a discussion about the failings of education and paper qualifications in technical fields such as computing.  Several people said I should make my initial piece more widely available, but I think it is too long for a newspaper article.   Below are my contributions to the debate, which quote from some messages by other people.  To see the full debate, please read the Link archives.

I will be happy to maintain this page as a pointer to discussions on Link which anyone thinks deserve greater prominence than being buried in a mailing list archive.

Robin Whittle   rw@firstpr.com.au 16 June 2002
Back to the main First Principles page for all sorts of techo, systems-administration, show-and-tell and fun things, such as the world's longest Sliiiiiiiinky.




Link discussions of special interest





Academia, technical skills etc.

This discussion was prompted by a thread started by Rachel Polanskis - "MCSE tale of woe" - pointing to a Sydney Morning Herald story about how a Microsoft Certified Systems Engineer (I think this is what it stands for) charged a small fortune for totally messing up an unstable computer.  The URL of the story is http://www.smh.com.au/articles/2002/06/02/1022569848471.html .  The story, by long-time computer journo Charles Wright, is here for posterity: MCSE_Tale_of_Woe.html .  Skeeve Stevens responded with an account of an IT training company that could not keep its servers running: http://www.anu.edu.au/mail-archives/link/link0206/0294.html .  This led me to write up a long-standing rant, which I have occasionally expressed, about bad patterns in the teaching of electronics and the like.

My first message 15 June http://www.anu.edu.au/mail-archives/link/link0206/0299.html :

Skeeve's tale of woe (Re: [LINK] MCSE tale of woe) illustrates a pattern
I think has long existed in academia where practical, rigorous,
engineering subjects are taught. I wrote about this a few years ago on
Link, I think.

The people who run the institutions have to hire teaching staff, but
they have no idea how to tell whether the potential teacher has genuine
expertise about the subject they will teach - except by referring to
paper qualifications.

Some people who are vitally interested in a rich, deep, important,
ever-growing and demanding field often couldn't be bothered with paper
qualifications. There are exceptions, such as in medicine, where you
rightly have to have the qualifications before being able to practise.

It is too often true that those who can't do, teach. Furthermore, those
who can't do, or who don't even really understand the field, are often
tempted to get paper qualifications as a job ticket.

Quite a few people have no innate capacity for understanding anything of
genuine engineering complexity. They may not even appreciate what it is
they don't know about engineering, so they may consider themselves
qualified and useful if they pass the various formal tests. They may
be good with word associations, playing the academic game etc. but those
skills are useless when confronted with a complex and unknown situation,
which must be resolved using high level debugging skills and with a
great deal of knowledge of the many technologies and human factors in
the situation at hand.

There seems to be a self-perpetuating cycle of people with paper
qualifications being less likely to possess the real knowledge of the
field for which they are supposedly qualified.

These people with paper qualifications are hired to run the courses, set
the exam questions and therefore control many factors which determine
who will be the next generation of paper-qualified people, what of value
they will know, and what they will think they know.

I once encountered a fully academically qualified electronics engineer
who did not know the resistor colour code. This is like a computer
engineer who can't instantly tell you that 0011 1110 is 3E in hex or
that 0D 0A is a newline for Windows machines.
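The kind of instant recall being described can be sketched in a few lines (a Python illustration added here as editor's gloss, not part of the original messages - the function and resistor values are invented examples):

```python
# Illustrative sketch of the "instant recall" facts mentioned above.

# The standard resistor colour code: each colour is a digit 0-9.
COLOUR_CODE = {
    "black": 0, "brown": 1, "red": 2, "orange": 3, "yellow": 4,
    "green": 5, "blue": 6, "violet": 7, "grey": 8, "white": 9,
}

def resistor_ohms(band1, band2, multiplier):
    """Three-band reading: two significant digits times a power of ten."""
    return (COLOUR_CODE[band1] * 10 + COLOUR_CODE[band2]) \
        * 10 ** COLOUR_CODE[multiplier]

print(resistor_ohms("yellow", "violet", "red"))  # 4700 (a 4k7 resistor)

# 0011 1110 binary is 3E hexadecimal ...
print(format(0b00111110, "X"))                   # 3E

# ... and 0D 0A is the CR LF pair that ends a line on Windows machines.
print("\r\n".encode("ascii").hex(" ").upper())   # 0D 0A
```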

Real skills with complex technologies are hard to teach, hard to provide
a genuine learning environment for, and hard to examine. It may be
easier with software, web design etc. But electronic hardware and real
live computer administration present many challenges to the educator.

I used to work at a TAFE college 20 years ago. There were a few really
keen teachers who were dedicated to making the best of things for
students, with as much hands-on experience, building and debugging
electronic systems as possible. Most of those teachers wouldn't cut it
as a technician - but they were training people to be electronic
technicians. None of them had any real-life work experience in
electronics. (The electrical trade section was entirely different - all
the teachers there were sparkies from way-back, and they did great work
training apprentices.)

One of the technician teachers asked me why I squirted water onto the
cleaning sponge of a soldering iron. Clearly, he had never used a
soldering iron properly. Another asked me to check the calibration of a
new oscilloscope. "What sort of errors does it have?" I asked, thinking
perhaps he was being fussy about one or two percent. He replied that
some channels seemed to be out by a factor of ten. He had never heard
of oscilloscope leads with x10 attenuators built into them - as
virtually all oscilloscopes have. These same teachers would be leading
a class of calculator-pressing young men (and sometimes an intrepid
young woman) through phase diagrams, AC and DC equivalent circuits, real
and imaginary currents, Norton and Thevenin equivalents etc. etc. (as
per the exam requirements) yet if I had walked in with a BC108
transistor (a common or garden transistor) virtually none of the
teachers would know it was an NPN, have any real idea of its maximum
voltage and range of beta (current gain) or be able to tell me which
leads were the emitter, collector and base. The same goes for most of the
students - with the exception of those students who had an at-home
interest in electronics.

None of this high-level mathematical stuff is any use at all unless you
have a hands-on physical-feel understanding of electronics. Design
without long experience repairing equipment is madness. Many design
issues are not related to things which can be expressed mathematically.
So much depends on "common sense", physical robustness, serviceability,
thermal design, resistance to dust and vibration etc.

It takes years of genuine work experience to develop the debugging and
general knowledge skills to be a decent technician. (I don't care for
the "technician/engineer" "officer/grunt" distinction at all.) I don't
see how those skills can be acquired in an academic course alone - and
most courses do emphasise the importance of work experience.

Since academics are often poorly paid compared to those who really know
their stuff, especially in IT (there was a phrase like "gold collar
workers" or similar), why would anyone with genuine expertise become a
teacher? Few do - but there are people who like the social contact of
teaching, the different working environment, and who have both a passion
and an aptitude for inspiring and helping people to learn. But even
then, they have to teach by someone else's syllabus - probably the work
of a paper-qualified committee. As teachers, their main task is to help
students pass the exams, not to help them really learn what they need to
be useful in the real world. The teacher's performance, for official
purposes, will be judged on their student retention and pass rates, not
on how well those students actually perform in the real world in the
years and decades to follow. Full time teaching typically means that
they do not get to extend their skills or remain up-to-date with
changing technology.

I think it is really difficult to reliably test people's real skills in
electronics, computing etc. So much of electronic and computing work
involves debugging really tricky situations that it is impractical to
set up such situations in an exam situation. Even for a skilled
person, the time to resolution may vary enormously due to perfectly
sensible decisions about what path to pursue first. Resolution may
take much more time than would be suitable for an exam. Also,
resolution involves access to web sites, discussions with colleagues,
discussions with the people who run the system etc. It involves
personal skills and a well developed capacity to look beyond the
purported problem space into other related matters. For instance,
the customer may report a problem in some specific way, but the real
problem may lie in other equipment, in their usage of this or other
equipment or in their faulty expectations and understanding etc.

Many problems are intermittent. I once had a DX7 musical synthesiser
which crashed and clobbered its battery backed up memory - but *only*
when first turned on after being not used for a week! Eventually, by
good fortune, the problem became more prevalent and I was able to use
heat to prove it was one custom LSI chip, which must have had a faulty
pin driver which was corrupting the data bus when first turned on.
That resolution took a month!

You can see the results of paper-qualified people being isolated from
the real world in the many security vulnerabilities found in Microsoft
software. When Outlook (Express) hands an HTML email to MSIE, and MSIE
finds an attachment with a mime type indicating it is for Media Player,
and Media Player looks at the file and decides it is not the right type,
and then hands it to some other thing which decides it is an executable,
and very helpfully *runs* it, then you know that this whole system was
put together ("designed" seems too strong a term) by people who had not
the first clue about security.
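The failure pattern just described - each component trusting the hand-off rather than enforcing any policy - can be caricatured in a few lines.  This is a toy Python sketch of the general design flaw (added as an editorial illustration; all function names and data are invented, and this is in no way Microsoft's actual code):

```python
# Toy model of a hand-off chain where each handler that can't process
# the data passes it along, and the final fallback "helpfully" runs it.

def fallback_handler(data):
    # The fatal design choice: unknown content is treated as a program.
    return "EXECUTED untrusted content!"

def media_player(data):
    # Rejects anything without its expected header and passes it on,
    # rather than simply refusing to handle it.
    if not data.startswith(b"RIFF"):
        return fallback_handler(data)
    return "played media"

def open_attachment(declared_type, data):
    # Dispatch on the *claimed* MIME type - no security policy anywhere.
    if declared_type == "audio/x-wav":
        return media_player(data)
    return fallback_handler(data)

# A worm labels its executable payload as audio; it ends up executed.
print(open_attachment("audio/x-wav", b"MZ\x90\x00..."))
```

The point of the sketch is that no single handler is obviously wrong in isolation; it is the chain as a whole that was never designed with security in mind.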

Another problem is the vastly expanding need for technical expertise.
In the early to mid 20th century people with a technical electronic bent
did "radio". Then it became electronics. Then computers and software
was developed. Now we have various aspects of electronics, programming,
system administration, computer building and repair, web site design,
database management etc. Each of these fields keeps growing more and
more complex, and being useful in one field often means having a working
knowledge of several others. Yet there is a constant supply of people
with minds which can *really* do this stuff well.

I think that the increasing complexity of everyday things acts as
a real impediment to anyone wanting to learn many aspects of
technology.

My first electronic devices were things such as a 1940 Mullard valve
radio. With a little theory, it was perfectly possible to understand
the whole thing without a schematic. Even without a databook, you could
see what the valves were (rectifier, beam power tetrodes, pentagrids,
pentodes etc.) and see their pin connections. Then you could draw the
circuit in an afternoon, and with a multimeter and perhaps an
oscilloscope you could fix faults and modify it.

Now, with LSI chips, surface mount etc. etc. most everyday items are not
at all amenable to understanding, debugging, repair or modification.

Also, I think the educational system has been run by humanities types
and an increasing proportion of women, who generally have no
understanding of or interest in engineering. There was even a fashion
against science and engineering at one stage - as if these were the
roots of war, power elites and environmental destruction and so should
be avoided.

I don't know any teenagers now who are anything like as advanced in
electronic understanding as I and my radio club colleagues were. (The
Camberwell Grammar Radio Club no longer exists - we used to do all sorts
of things there, including ballistics research and 40 metre amateur
radio.) Some young people in their late teenage years may be getting
into computer programming - which when I was 14 meant hand-punching
cards in a cut-down version of Fortran, to be transported to Monash
University and run on the Control Data 3200 and returned with a
print-out a week later!

But even in computer programming there are pervasive problems. There is
a generally low standard of documentation of the code inside the source
files - which I put down to it being done mainly by young men under the
influence of young warrior/hunter instincts, which have them strut their
stuff, showing what they can do, but not giving anyone any clues at all
about how they do it. This is a field where some more feminine, English
sentence, communicative aspects of humanity are sorely needed! Another
possible factor in the too-terse, badly expressed, badly laid out or
non-existent comments is that it takes up too much space in printed
books, which constitute a guide for many programmers.

There are a number of vicious circles at work here, including:

1 - Increasing complexity and pervasiveness of an increasing number
    of technologies while there is a finite number of people with the
    mental makeup, skills and passions to do it properly.

2 - Widely deployed systems written in lousy ways (Microsoft, for
    instance) firstly setting low standards by which others are
    judged and secondly requiring vast armies of people to mollycoddle
    the second-rate systems into working as desired.

3 - The vicious circle of academics being chosen only by paper
    qualifications, of there being many reasons why good technical
    people do not become teachers etc.

4 - Many people with good technical skills being hobbled by their own
    lack of human communication skills, including their inability or
    unwillingness to clearly document their work for the next poor
    sod who tries to understand it, which will probably be themselves.
    
    So, when debugging or extending the software, they spend
    excessive time trying to understand what was in the original
    worker's mind (probably themselves) rather than using that knowledge
    from good documentation and therefore going straight into the fray
    fully informed.

5 - Complexities of everyday electronic items being not at all conducive
    to teaching or understanding by beginners. Compare a PC or video
    game, to a crystal set or superhet radio.

6 - Pervasive and generalised factors militating against patience and
    long attention spans, including massive exposure to TV and
    video games, and also potentially problems with nutrition,
    mercury in vaccines (http://www.autism-mercury.com) etc., which make
    many people too scattered and impatient to spend years learning
    something difficult like electronics or computer programming.

I just got an email from a journalist friend. The company she works
for has ten or twenty staff. She regularly uses free web-mail and
personal email accounts, because of unreliability of the office email.
She just wrote to me that she has had no email on the office system for
two weeks, and virtually none for the two weeks before that. She also
wrote that some co-workers can't access their hard drives. I don't
have to mention which company's software is involved in this - and how
most of the trouble seems to have arisen due to the Klez worm which
relies on this company's software's well-known vulnerabilities.

Complexity, when badly done, turns into a terrible mess.

 - Robin



There were a number of responses, some of which I partially quoted in my second message.  Please see the archives http://www.anu.edu.au/mail-archives/link/link0206/ around 15 June to read the responses!


Bernard Robertson-Dunn wrote:

> There is a world of difference between a technician and an engineer.
> It is the same distinction as that between a nurse/doctor,
> architect/builder, analyst/programmer, DBA/designer etc. They are
> different skill sets and responsibilities and have different
> education/training needs.

I agree in principle.  But questions like these can be answered in a
number of ways.  There is the formal, ideal, definition of something,
and there is the day-to-day experience of it.   For instance, science -
depending on which philosophy of science you adhere to - is ideally a
rather different thing from what is often practiced.  Likewise religion
and many other fields.

My objection to the engineer/technician distinction is that I see no
pattern of qualified engineers being more capable at real-world things
than qualified technicians or unqualified people such as myself.  This
is only my experience, and I like to think it is not typical.   For
engineers to think they can design things without having spent years
fixing design faults and the effects of wear-and-tear on existing
equipment is madness - and only perpetuates the need for more people to
fix the things they design.

Likewise, I think it's nuts for people to become qualified doctors
without having worked extensively as nurses.   Similarly officers in the
armed forces.   In BDSM, most people say that one should never become a
dominant without first being a submissive.


> Having said that, I fully agree with Robin's assessment of the current
> state of academia and the way industry uses the products of our
> current training systems. In fact I would go as far as to say that it
> is even worse than he claims.

The fact that a publishing company can pay good money for what it (and
most other companies) think is the best available software, and have a
full-time staff-person (no doubt with Microsoft paper qualifications) to
keep it running smoothly, whilst also engaging outside consultants, and
still have no email for two weeks and a number of completely
non-functional PCs indicates that things are very bad indeed.  These
people are just trying to run an office - not to do nuclear physics.  I
think the productivity cost of the Klez worm is probably several percent of GDP of
affected countries - and this is primarily due to a stupid weakness in
Microsoft software which no-one in their right mind should ever have
foisted on customers.  It is also due to people in charge of
Internet-connected computers being clueless enough to click dodgy
attachments.


> Having done a couple of post graduate degree courses in Control
> Engineering (MEng and PhD) I decided to update my library and bought a
> recent book on the subject. It covered most of the topics that I
> remember doing thirty odd years ago, but there was one major
> difference. It assumed that you had MatLab, a simulation package.
>
> In my day, you either worked out the maths by hand - and gained a
> thorough understanding of the problem, or you built devices and
> learned from reality. Now, and this seems common in other areas of
> electronics, students use computer simulation.

This does not seem unusual to people raised on video games.

I once lent a two valve (2 x 12AX7 I think) hand-wired computer module
from a 1950s computer to a friend who was in her final year of
engineering or its technician equivalent (it shouldn't matter at all
which, for the purposes of my argument).  I had never bothered to write
out the circuit, but I figured it was probably a dual flip-flop, since
it had a few metal film resistors, mica capacitors and compact selenium
diodes.  I suggested she draw out the circuit and try to figure out what
it did.   She was keen and bright and a few weeks later came back with a
CAD drawn circuit which may have been technically correct, but which
bore no resemblance to common-sense ways of arranging components between
ground and the positive supply.  Her approach to trying to figure out
what it did was to turn it into a SPICE model and simulate it on the
computer!!!   This is wholly deficient and useless.  She did not have
any viable theory as to what it was.   Anyone with genuinely useful
electronic skills should be able to draw the circuit by hand and by
looking at it and then by thinking alone, figure out what it is meant to
do.

Computer simulation is used for all sorts of things now, and I
understand that there are ways of integrating almost everything into the
simulation / design process.  You can have the innards of CPUs, all the
chips, and programmable logic devices simulated along with all analogue
and RF components, complete with the capacitance and transmission-line
characteristics of the circuit boards all simulated and linked with the
various PCB, PLD etc. design programs.  I think they can even simulate
electromagnetic emissions and susceptibility to interference!  

By the way, I am always on the lookout for things which used to be
impossible first becoming concrete, weird, expensive artefacts, and then
becoming cheap, standardised commodities, and ultimately being reduced
to a pattern of bits in a computer.  Now this has happened to CPUs!  You
can get a logic design for a complete 100 MIPS CPU, with RAM and all
I/O, to put in one part of a programmable logic device.  You might even
program two such CPUs into the one chip, together with all the other
logic guff to interface them.  Thus, a complete CPU is reduced to a
pattern of bits in an Electrically Erasable (Flash) PROM to be loaded
into a RAM-based programmable logic chip at power-up.  Then simulation
would involve a PC running a program to simulate a logic chip which was
implementing a CPU - and then simulating how that CPU executed the
software which was loaded into its program memory.  


While there is good cause to use this stuff to try to eliminate problems
before committing to a ten layer surface mount PCB design, often with
the pinouts of PLD chips set by the particular logic design and the
chip's inability to route any signal to any pin, the whole thing sounds
like a nightmare.  I am always impressed that anyone can do this and
bring mobile phones, video cards, hard disk drives and motherboards into
reliable existence before they become obsolete.

Firstly, you would have to know that all the models of all the
components were correct - but how could you know?  Secondly, you would
have to be sure the simulation software was free of bugs.  Thirdly, the
simulation would have to run fast enough not to bog down the design
process with overnight and weekend runs.  Finally, the real test for the
device being designed is the real world, and I can't see how one can
reliably simulate a GSM phone, without having also a simulation of a
base-station and all the vagaries of radio-wave propagation.  

Companies sell this super elaborate software - but I can imagine a team
of engineers taking months and months to learn it, to debug their
component definitions etc. etc. - all of which detracts from actually
designing the thing.
 

> Computer simulation is a powerful technique, but it is no replacement
> for real-world experience.

Exactly.  One remembers the smell of a 1 watt resistor which dissipated
ten watts - and the burnt finger!   What about the way a tantalum
capacitor connected the wrong way round will work quite OK until, at some
stage in the next seconds, minutes, days or years, it suddenly, in less
than a second, bursts into flame, emits a puff of smoke (purple smoke
perhaps - I have seen it a few times) and then resembles a piece of
burnt corn with two leads attached.

There are so many interactions in devices which cannot be simulated.
What about the metal fatigue on solder joints of inductors in the
horizontal drive circuits of video monitors, or likewise in switching
power supplies?  Only experience as a service technician would make you
learn that there are powerful forces on the wires in inductors and that
these are capable of fracturing solder joints.  Likewise expansion and
contraction due to temperature changes.


> I have blown more fuses, transistors, transformers etc than most
> people will ever see. Am I a better engineer for it? probably.

For sure!


> Am I appreciated for it in the work force? As it happens I work
> for a company that does value experience over youth and trendy
> knowledge, but not all do. Does that have anything to do with my age?
> Probably, but that's another subject altogether.

Assuming that you are working effectively in a field, then you should
get better at it.  But what of the MCSE who wasn't working at all - just
stuffing around.  In what way would that person be better at his work
after making a total mess of something because he lacked even the basic
clues required to resolve the problem?


> What can we do about it? Now there's the real question.  Has anyone
> ever been able to change society in a predictable way?  IMHO the
> answer is no.  It is easy to change society, bombs, guns and power
> hungry politicians do it all the time. Is the resulting change what
> was intended?  Not in my experience.

It may be what they intended.  I don't think things are so bleak, but it
is the failures which tend to stand out.  Paul Keating had the right
idea putting Arts in with Communications.


There appears to be a lack of proper skills and motivation amongst quite a
few political leaders and technical teachers.  Perhaps the solution is
something between conscription and jury duty.   If someone actually
wants to be prime minister, then they are insane and/or a megalomaniac
and so are unsuitable for the job.  There was a US movie about
dragooning some bright unfortunate into the job of president.   Even a
committee could do a better job this way than what unfettered democracy
and the TV generation did in the US.

Likewise, it might make some sense to have compulsory service and/or
high-level cash inducements to get practising, competent
engineers/technicians etc. into the teaching ranks for a year or two.
Any longer than that, unless they do it part time, and they would lose
touch with their real field of work.   The teaching of doctors and
nurses is, or at least used to be with nurses, tightly integrated into
real-world work situations with real practising professionals.   I know
institutions do try to get work experience for their students.


Glen Turner wrote:

> And just straight funding.  Your typical modern electronics
> company has at least $0.5m of software to do FPGA design,
> PCB layout, EMC analysis, and so on.

Ouch!   Then everyone has to try to make the packages talk to each
other, and read all the manuals, updates etc.  Then they have to learn
to use them all, find out their bugs and limitations and somehow
construct a giant video game which is worse than useless unless it truly
predicts the behaviour of the thing they are designing, but have not yet
built.  This is really daunting, but it is a fact that designing complex
equipment is an extremely challenging task.  This is especially true if
it is not just software and electronics, but involves servo systems, RF,
vibration, usability for non technical people, physical ruggedness etc.


> Also, electronics components prices are very sensitive to quantity.
> It's simply unaffordable to have a high-density PCBs made in
> quantities of one.  This in turn makes it hard to give undergrad
> students a 'real-world' project.

Indeed - it is very hard to provide a real-world experience in
electronics courses.  But before they are designing things, they should
be fixing and modifying things, I am adamant.

 
> This also happens in computing -- 2,500 lines of code is about
> all you can expect from a student in a semester project.  So they
> only start to glimpse the complexity of real-world software
> systems.

I like to think that this is 1500 to 2000 lines of comments, and the
rest actual code for the compiler to chew.   2,500 lines of actual code
with few comments is a very large project which virtually no-one can
understand, including the person who wrote it, just days or weeks later.

My own experience when I have occasionally written a few lines of code
without proper comments (the comments come first) is that I often can't
understand what I was trying to do, or be sure whether the code does
that or something else, ten minutes later.
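The comments-first habit described above can be shown with a short sketch (an editorial illustration in Python; the function and its figures are invented, not drawn from the discussion).  The English is written first and carries the intent; the code fills in beneath each comment:

```python
# A sketch of "the comments come first": narrate the intent in English,
# then write the code that satisfies each sentence.

def running_mean(samples, window):
    """Return the trailing moving average of `samples` over `window` points."""
    # Keep a running sum rather than re-adding the whole window each
    # step, so the cost is O(1) per sample instead of O(window).
    out = []
    total = 0.0
    for i, x in enumerate(samples):
        total += x
        # Once the window is full, drop the sample that just left it.
        if i >= window:
            total -= samples[i - window]
        # The divisor is the number of samples actually in the window,
        # which is smaller than `window` until it fills.
        out.append(total / min(i + 1, window))
    return out

print(running_mean([1, 2, 3, 4, 5], 2))  # [1.0, 1.5, 2.5, 3.5, 4.5]
```

Ten minutes, or ten years, later, the comments let the next reader - probably the original author - go straight into the fray fully informed.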


Craig Sanders wrote:

> 7.  The Cult of Mediocrity.   This insidious cult has infected
> educational systems since the 1970s - it became more than
> unfashionable to recognise that some individuals had talents
> greater than other individuals(*).  recognition of merit was
> forbidden.  streaming was abandoned, forcing the smart kids to
> learn at a slow pace that the average and dumb kids are just
> capable of keeping up with.


Yes.  This has its roots both in the incompetence of those who came to
control the education system (though they wouldn't recognise it as such)
and in the social-constructivist theories which have assumed the status
of undisputed truth in recent decades.  In this model, at its purest,
children are blank slates and everything which they become is determined
by experience.   There is no stupidity - only someone behaving and
thinking badly because of their low self-esteem.   The self-esteem
movement has a lot to do with the Cult of Mediocrity. 

Also, passing people who get just 50% of their often multiple-choice
questions right has a lot to do with the Cult, because this sets people's general
expectations and determines who is regarded as qualified to teach and
set future syllabi (?) and examinations.


> the cult of mediocrity goes well beyond education, too.  it has
> debased the definition of democracy, so most people now believe that
> mediocrity and democracy are one and the same thing - i.e. the truism
> that "everyone is entitled to an opinion" has mutated into the lie
> that "all opinions are of equal worth"....no recognition of expertise,
> or merit is allowed.

It's true.  We are all going to hell in a handbasket.   Like Little Black
Sambo's tiger chasing its tail until it turns into butter, MS software,
flaky over-complex web-design programs etc. are damaging companies' and
educational institutions' ability to operate - and this makes it even
harder for anyone to make any progress.  Meanwhile, in China, they are
bright-eyed and bushy-tailed, working hard and smart, becoming proficient
in AutoCad shortly after learning to walk and churning out increasingly
astounding products for a fraction of what could be done in the
currently developed nations.

 
> it's pretty easy to spot this on mainstream mailing lists and
> newsgroups where, for example people with no experience or knowledge
> in a given field argue vociferously with others who have, e.g., done
> their PhD thesis on that topic, and insist that their opinion is worth
> as much as the PhD holder's opinion.  


Yes, but some PhDs are extremely rigorously based and some are not.  I
know one person who was doing his PhD in Canada on human communications
- and the process was very extensive, rigorous and resembled at times an
Inquisition.  It involved oral responses to a panel, literally, of
inquisitors - old beaks with PhDs and professorships probing and
challenging him on his work.  I met someone here who was well into her
PhD on the interaction of the Earth's magnetosphere and the solar wind,
and she couldn't even give a consistent explanation of the
magnetosphere.  Maybe she didn't continue or pass. 


> even worse, many of the other subscribers agree because they also are
> not qualified to judge, and have an unfailing belief in the fairness
> of mediocrity.

This is the Internet as a bullshit amplifier.  The mass media and quite
a few really crappy computer books do this as well.

Craig, I agree with what you wrote, but I suggest capitalising the first
letter of sentences in any treatise concerning encroaching mediocrity.
Perhaps you subliminally influenced Jan to slip up in this regard with
one of her sentences!


From what little I know, apprenticeships make sense in stable
industries, if there is no exploitation, if the apprentice's co-workers
are competent and helpful, and if the education system can work well with the
actual needs of industry and the apprentice.  Jan and I have worked on
programs in this TAFE (VET or whatever it is called this year) flexible
delivery business and it is a complex and difficult thing to achieve all
these goals.  It assumes there are stable, full-time workforces with
career structures which remain stable for decades.  But web and database
design are new-fangled professions which really began in earnest perhaps
seven years ago - or maybe earlier for database design and management.
Companies are often trying to be flexible and engage people with
multiple skills on a casual or contract basis, and these workers may be
working from home.  The apprenticeship system seems to be based on a
traditional large workplace stable full-time workforce model which often
does not apply in higher-tech fields.

A big part of the answer with electronics and computer programming is to
do it at home.  Everyone has a PC, and basic electronic tools and
components cost well less than $1k.  You really need a dual trace
oscilloscope, which is another $600 or so.  But these are minor
investments compared to the investment of time for studying electronics
or computer programming.   There is a false expectation that the
institution should provide everything on a plate.   If someone is not
fascinated enough by computer programming or electronics to do it
themselves, intensively, for their own interest and probably for really
useful practical purposes, at home, then I doubt they will ever be much
use no matter how much formal education they do.


I will make a link to this discussion and probably put up my
contributions on a web page in the "Various technical things . . ."
department of my site http://www.firstpr.com.au .

 - Robin