A resume that lists all your failures? Don’t laugh – it worked!

While this blog stands resolutely against the more juvenile attributes of the struggling Millennial generation — as witness yesterday’s ‘Kim’ story, in which a 20-year-old blew through her $90,000 college fund and blamed her parents (who else?) for not teaching her budgeting skills — fair is fair, and we’re equally happy to recognize entrepreneurial “go-getter”-ism when it appears.

Here’s a young man who has figured out how to make his resume jump from the vast pile of lookalikes: highlight failure.

I’ll leave it to you to read the article to get all the “business-y” reasons why this may be good (and certainly, you can’t argue with the results). But I suggest another, less utilitarian reason why the ploy was successful: it shows a sense of humor, which in my experience usually means you’re dealing with an interesting and intelligent person.

A resume of failures stands out to employers – Business Insider.

But wait a sec, wait a sec — didn’t I dump all over the self-mocking “We suck and we’re sorry” video, and didn’t an array of Millennials in turn dump all over me, some of them hoping I would die soon? And isn’t this “resume of failures” just that same tactic?

Not at all. Here’s why:

1. He’s at least using his list of failures to accomplish something. He’s not just saying “poor messed-up me” and leaving it there.

2. He’s related his list to real-life aspects of the industry (relevant resume, remember?) and taken a shrewd shot at some of the b.s. that everyone acknowledges but is afraid to say (creative award shows and new business pitches, for example). In this, he demonstrates some wisdom beyond his limited on-paper experience.

3. He’s got guts, and he pursues follow-up publicity and recognition quite aggressively.

Very Boomer-like, if you ask me…

What if they gave a degree for proficiency rather than time spent studying? It may be starting.

Could universities ever award a degree based solely on proficiency, rather than on credit hours?

It makes a lot of sense — and it would certainly be the key to much-needed reform of higher education. According to an article on The American Interest website, the University of Michigan is apparently developing a competency-based Master’s of Health Professions Education.

The article cites an NPR report that notes, correctly, that the current system measures “not how much you’ve learned, but how long you’ve spent trying to learn it”:

The conventions of the credit hour, the semester and the academic year were formalized in the early 1900s. Time forms the template for designing college programs, accrediting them and — crucially — funding them using federal student aid.

But in 2013, for the first time, the Department of Education took steps to loosen the rules.

The new idea: Allow institutions to get student-aid funding by creating programs that directly measure learning, not time. Students can move at their own pace. The school certifies — measures — what they know and are able to do.

If this is really happening, then it’s a revolutionary — and absolutely necessary — development. It will drive costs down by reducing or eliminating unnecessary courses, and bring higher education into closer alignment with the realities of the job marketplace. Too late, unfortunately, to benefit the Millennials who have racked up enormous student loan debts while wasting untold credit hours on meaningless courses that equip them for very little in the real world. But that’s another story.

You can read the entire article here.

What if the Millennials are never leaving that basement? And what if that’s OK?

In my book Beyond Age Rage, I argued that some of the “Oh my God” reaction to Millennials still living at home with their parents was because it was new, not necessarily because it was bad. After all, if you can’t get a job because you’re not properly trained and/or the economy is still weak, it makes perfect sense to cut costs (maybe all the way down to zero) and room with Mom and Dad. So it’s rational. But can it even be good?

In a very thought-provoking piece in The New York Times Magazine, Adam Davidson tries to get comfortable with the idea that this is not just a temporary blip. In fact, he argues, it’s part of a very long-term evolution which has seen young people move from workers (even as children) to people who could not be expected to be in the workforce (until older, and older still, and older still):

Childhood is a fairly recent economic innovation. For most of recorded history, a vast majority of people began working by age 4, typically on a farm, and were full time by 10. According to James Marten, a historian at Marquette University and the editor of The Journal of the History of Childhood and Youth, it wasn’t until the 1830s, as the U.S. economy began to shift from subsistence agriculture to industry and markets, that life began to change slowly for little kids. Parents were getting richer, family sizes fell and, by the 1850s, school attendance started to become mandatory. By the end of the Civil War, much of American culture had accepted the notion that children under 13 should be protected from economic life, and child-labor laws started emerging around the turn of the century. As the country grew wealthier over the ensuing decades, childhood expanded along with it. Eventually, teenagers were no longer considered younger, less-competent adults but rather older children who should be nurtured and encouraged to explore.

Thus high school, college and then the workforce. The familiar pattern — until the Great Recession. Or so goes the narrative. But Davidson argues that things started becoming unstuck much earlier:

(The) latest recession was only part of the boomerang generation’s problem. In reality, it simply amplified a trend that had been growing stealthily for more than 30 years. Since 1980, the U.S. economy has been destabilized by a series of systemic changes — the growth of foreign trade, rapid advances in technology, changes to the tax code, among others — that have affected all workers but particularly those just embarking on their careers. In 1968, for instance, a vast majority of 20-somethings were living independent lives; more than half were married. But over the past 30 years, the onset of sustainable economic independence has been steadily receding. By 2007, before the recession even began, fewer than one in four young adults were married, and 34 percent relied on their parents for rent.

OK, if that’s the case, should we just relax about this? Is it really the “new normal”? Does it have any positive side effects?

Read the article – you might not feel better about what’s going on (especially if you’re the one paying the bills), but you’ll certainly have a more sympathetic perspective. (It also includes a terrific slide show detailing the stories of 14 Millennials who are living that life.)


Surprise! Over a third of Millennials didn’t need their BA for the first job they wound up getting.

Not that we need more proof, but a new study shows that a huge percentage of Millennials are overqualified for the jobs they eventually get. A new survey found that 35% of Millennials with a BA reported that their first job didn’t require a degree. This is in keeping with a study last year, which found that the number of university grads entering the workforce would be more than double the number of jobs available requiring at least a bachelor’s degree. Toss in student debt (already larger, and growing faster, than credit card debt) and it’s no surprise that more and more Millennials are settling for those lesser jobs, simply because they need the cash.

And where does that lead? Right back to the universities, which will find it harder and harder to justify sky-high tuition fees (fueled by bloated administrative costs and too many irrelevant programs).

Read all the details here.

Guess what percent of recent college grads are in low-wage jobs earning $25,000 or less

OK – we know that recent college grads are struggling in the job market. But just how bad is it, really? Surely not everyone is a barista? And how do today’s trends compare with the past? January data from the Federal Reserve Bank of New York, reported in The Atlantic, offer a sobering – but not necessarily calamitous – picture.

Here is a link to the full article, by Jordan Weissmann. The story is summed up in two graphs.

The first shows that in 2012, about 44% of working young college grads were “underemployed” – that is, working in jobs that did not require their degree. Not good. But actually, the same rate as in 1994.

The second graph makes a distinction between “good” non-college jobs (not requiring a degree, but paying $45,000 a year) and “low-wage” non-college jobs, paying $25,000 a year or less. Of the 44% of all grads in non-college jobs (Graph 1), about 20% are in “low-wage” jobs — meaning roughly 9% of all working young college grads are in “low-wage” jobs.
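For anyone double-checking the arithmetic behind that 9% figure, here’s a quick sketch using the two percentages reported above (the rounding is mine):

```python
# Checking the "low-wage" share: 20% of the 44% who are underemployed.
underemployed_share = 0.44  # working young grads in jobs not requiring a degree (2012)
low_wage_within = 0.20      # portion of those underemployed grads in low-wage jobs

low_wage_of_all_grads = underemployed_share * low_wage_within
print(round(low_wage_of_all_grads * 100))  # about 9 percent of all working young grads
```

So 0.44 × 0.20 = 0.088, which rounds to the 9% cited in the article.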

As Weissmann points out, “In a sense, this reflects a shift in the broader economy; for more than a decade now, middle-class jobs have given way to low-wage service work. And young college graduates haven’t been spared the change.”

Perfectly true. And in a sense, this means that universities can’t bear 100% of the responsibility for lousy job prospects for so many grads. But that still doesn’t help when you factor in those skyrocketing tuition fees. You can expect more pressure on universities to get into the real world. And that can only be a good thing.


Meet the “encore” entrepreneurs (you may be in for a surprise)

I’ve written before about the growing trend of Baby Boomers starting their own businesses. It’s an ideal way to handle the economic “triple threat” of today – real or potential job loss, under-funding for retirement, and lousy rates of return on the funds that have been set aside. Now comes evidence that women are outnumbering men as “encore entrepreneurs.”

This interesting report from BBC News cites data from a Kauffman Foundation study to show that the Boomer entrepreneurship trend is growing: in 2012, people aged 55-64 started 23.4% of all new businesses in the US, up from 14.3% in 1996.

But according to data from another source – Babson College – 10% of US women between 55 and 64 had taken steps to start their own business, compared to 7.5% of men.

The BBC story includes several interviews with women who have taken this step. The main reasons are what you’d expect – job loss, income reduction due to the recession, inadequate retirement funds. What’s different this time – compared to people of that same age in previous generations – is the perception that there is still time to turn things around, the willingness to start again, and the presence of a strong entrepreneurial mindset.

That – and a growing amount of support information and services. Books, seminars, consultancies – the trend to Boomer entrepreneurship has fueled an entire mini-industry of people (with real or self-proclaimed expertise) ready to help.

This is just the beginning. And it’s another nail in the coffin of “retirement at 65.”