I’ve come across several articles recently that have made me stop to ponder the issue of hiring really smart people in your organisation. Logically speaking, it seems natural to assume that you, as a business, want to hire the most talented people out there. The more brilliant the individual, so the theory goes, the more he or she can contribute in terms of ideas, expertise and ultimately growth and success.
But is there a sort of line beyond which intelligence actually becomes a curse?
According to some, there definitely is. Citing examples such as the Enron scandal, caused by a group of people considered to be "the smartest guys in the room", they argue that intelligence is often accompanied by a range of traits that make hiring such individuals a risky business. For starters, they argue, smart people tend to be keenly aware of their own intelligence, which frequently translates into arrogance and the automatic assumption that their opinions should be accepted without question. This means that they don’t tend to be particularly good at taking direction or working together with others – and they are often pretty lousy communicators, too.
A second and related issue is that being very gifted in terms of technical knowledge doesn’t necessarily make them skilled from a managerial point of view. But alas, the nature of career progression in most fields and companies means that talented individuals often end up being "rewarded" by being given management responsibilities. The result is that teams end up being led by individuals with no aptitude for – and in many cases no actual interest in – being responsible for others and working in areas such as team-building, motivation and performance management. Using the example of retired sports superstars who try their hand at coaching, often with little success, some argue that highly skilled individuals are unable to bring themselves down to the level of the “lesser mortals”. This makes them unable to transfer useful knowledge, advice and skills to the individuals they are coaching, which translates into little difference being made to performance.
And retaining smart people can be a real headache too. They know they are considered highly attractive employees by other companies, which makes them think (perhaps rightly so) that they can expect better pay than others, as well as more benefits and higher bonuses. Now, this makes a lot of sense if their contribution to the company justifies that kind of payback. Or at least as much sense as the fact that CEOs have seen their pay increase by 937% since 1978 while the typical worker has seen an increase of 10.2%. But if their superior brainpower translates into an uncommunicative, dictatorial approach that focuses purely on numbers and data, without being able to comprehend the more human aspects of consumer behaviour, is it really worth it? Some data would actually suggest that it is the less-confident among us who end up being more productive and successful.
On a slightly unrelated note, another article I recently came across has made me wonder whether a little laziness could actually help me reach my goals more effectively. What I mean is that I grew up being told that hard work pays off, which has made me a fairly committed and dedicated worker. But apparently that may not be a particularly good thing either, because it means that I’m not afraid of having to do a lot of work to get somewhere. A lazy person, on the other hand, tries to find the quickest way possible, which means they may end up reaching the goal sooner. A little someone named Bill Gates is famously quoted as saying: "I will always choose a lazy person to do a difficult job, because he will find an easy way to do it".
And if it works for Gates, who are we to disagree?
While there may not be enough evidence to say that spending long hours in front of a screen causes worsening eyesight, research has repeatedly shown that a high proportion of DSE (Display Screen Equipment) workers regularly complain of eye discomfort.
It is no surprise, then, that the Health & Safety Executive has had legislation in place since 1992 to protect workers’ health in this regard, primarily by ensuring that workplaces take appropriate precautions and measures to protect users’ eyesight against prolonged computer screen use. But a recent study conducted by Specsavers Corporate Eye Care has revealed that many companies are in fact failing to comply with these regulations. To be more precise, HR Grapevine reports that a worrying 63% of the 138 heads of companies questioned (all with Health & Safety or HR responsibilities) admitted not fully understanding the DSE regulations, in what appears to be continuing confusion regarding who is responsible for what.
To shed some light on the matter, the regulations state that you, as an employee, "are entitled to ask your employer to provide an eye test if you are an employee who habitually uses DSE as a significant part of your normal day to day work. This is a full eye and eyesight test by an optometrist (or a doctor)."
But in spite of this, 10% of the managers surveyed by Specsavers expected the employee to fund the full extent of their own DSE eye care. The question is: are we dealing with a lack of information or awareness, or are employers being disingenuous and simply trying to cut costs?
There is no doubt that, for companies with thousands or tens of thousands of employees, the costs will add up. But it is also true that they stand to lose a lot more by feigning ignorance and/or not meeting their obligations. Not just in terms of potential legal action on the part of employees (though this is definitely a possibility), but in the sense that no one stands to benefit from having unhealthy employees. If your employees suffer, your business will suffer too – so is it really worth the risk?
It’s an age-old question with no straightforward answer that I, for one, have often asked myself in the past. We know that skills and qualifications are good, but how will they actually help us get a better job or a better salary? How much of a competitive edge do they give us? How much do employers really care if you have a BA, MA, BTEC, PhD or NVQ?
Well, a lot of them probably do care, or at least would if they knew what BTEC actually stands for. And, as revealed by a recent study conducted by City & Guilds, a lot of them actually don’t. When questioned, 57% of the 1,000 employers surveyed admitted that they find many acronyms they see on CVs confusing, with almost two-thirds (64%) revealing that they had to look them up on the internet. Let’s look on the bright side: at least they try… which can’t really be said for the 40% or so who admitted throwing CVs away because they didn’t understand the qualifications the candidates held.
Worryingly, 44% of respondents – who came from a combination of small, medium and large businesses – didn’t know that BA stands for Bachelor of Arts, while a whopping 95% weren’t able to identify the most advanced qualification from a list including BTECs and NVQs. The problem seems to be that these acronyms generate suspicion rather than awe, suggesting to most employers (around two-thirds) that the candidate is using them to cover up a lack of skills or qualifications.
Clearly, this is a big problem for employers, but an even bigger one for those of us who are slaving away trying to certify our skills only to find that it makes us less likely to get a job.
As mentioned by Chrissie Maher OBE, founder and director of Plain English Campaign:
“… it’s not just potential employers who lose out: job seekers could be wasting years of hard work on qualifications that employers won’t recognise.”
But let’s not tear up our BAs just yet. City & Guilds is working in collaboration with the Plain English Campaign to create a jargon buster that will help both employers and learners better understand the terms and acronyms that exist in education. Hopefully this will create greater awareness and better possibilities for all parties involved. In the meantime, if you are hunting for a new job, you may just want to double check you are writing out your qualifications in full… just in case.
Last week we reported on a piece of research showing that UK professionals, while secure in their sense of financial stability, are not particularly happy with their working situation.
Bearing this in mind, you would think that they spend a reasonable amount of time researching potential employers, right?
According to a new study conducted by jobs and careers community Glassdoor, full-time UK employees spend a measly four hours on this kind of research every year, out of a total of approximately 1,680 hours spent on the job. That’s 0.2% of their time. On the other hand, they spend an average of 24 hours – 6 times as long – researching their annual holiday.
Fair enough. Holidays are important and we can probably all appreciate that when you don’t even know if a company will be reading your CV or giving you an interview, perhaps dedicating hours on end to research may be a complete waste of time (though it would almost certainly increase your chances of getting an interview to begin with).
But what of the 35% of UK employees who admitted not spending any time researching their new employer before accepting a job offer? It’s one thing to avoid researching new employers when you don’t know if they will pay you any attention or you are not really sure that you want to change jobs anyway, but it’s arguably quite another to skip that kind of research when you decide to actually accept a new position. What if the job isn’t what you thought it would be? What if the company is going bankrupt? What if…?
The numbers here are certainly more worrying. It seems that 55% of the 1,031 employees interviewed didn’t look at the employer’s website and 78% didn’t bother to check if the business was making a profit. Can you really complain, then, when 6 months in you realise the job isn’t what you thought it would be?
It’s certainly true that looking for a job can be a full-time job in and of itself. It’s not an easy thing to manage when you’re supposed to be, well, working for the person who is currently paying your wages, remember? A lot of people have been there, including myself. But I’ve also been the person who finally quit a job I had come to loathe, only to find that the new job I had taken was even worse than the one before! It’s really not worth it and can make your sense of purpose even shakier than before. As someone who finds purpose pretty important, I would strongly encourage anyone reading to give up at least some of their holiday research time to making sure any potential career change they embark on is a change for the better – or be prepared for a very unhappy 1,680 hours of the year ahead!
Happiness is a complex thing. In most cases, it takes more than a great job or a loving family – it’s the combination of these and endless other factors that contribute to how a person feels overall.
So when it comes to measuring individual satisfaction with these aspects, how do British employees compare to their European counterparts?
The latest edition of the Gallup and Healthways State of Global Well-Being Index provides an insight into this tricky question. This year, the team of researchers probed 133,000 individuals from all over the world about 5 crucial areas of their lives: purpose, social, financial, community and physical well-being.
Based on their responses, individuals were then categorised into three different groups for each metric: thriving, struggling or suffering. According to this scale, Britons were shown to have high levels of financial well-being, with 46% falling into the "thriving" category compared with a European average of 37% and a global average of 25%. This appears to indicate that, overall, Britons feel secure when it comes to their finances and don’t tend to live in fear of getting to the end of the month without enough money to pay their bills.
At the same time, 51% of British respondents were ranked as "struggling" in the purpose category, suggesting that a large proportion of them are unhappy in their professional roles. And they are not alone – purpose was the pillar that showed the lowest global averages of “thriving” individuals at 18%.
But if people feel financially secure, shouldn’t it follow that their sense of professional purpose is also quite strong? Well, not necessarily. According to research director Dan Witters:
As the country’s employment situation improved, it’s possible that many job seekers took the first available position they could get, without regard for whether the job was a good fit for their talents or long-term goals.
And that’s understandable: being stuck in a role that is not necessarily your cup of tea can’t be particularly good for your motivation levels. Even if you are committed to your job and do it well, enjoying what you do is a huge motivating factor that makes a big difference to how much you are able to achieve and contribute. What can be said is that, if Britons do in fact feel secure about their finances, it is never too late to invest in gaining the skills or know-how needed to go from doing a job that is okay to doing a job that you love. What is at stake is not (only) the productivity of businesses across the country, and the economy as a whole, but your personal sense of well-being when you wake up to go to work on a Monday.
We spend an awful lot of our time working, after all – why not get the most out of it?
The fact that women continue to be disadvantaged in the world of business with respect to their male counterparts is nothing new. Despite significant and consistent advances in terms of female representation on company boards and the percentage of women filling senior positions, the unfortunate truth continues to be that, broadly speaking, the opportunities available to men and women in workplaces are far from equal.
In this context, the role of training and skills is often the cause of much debate, particularly when it comes to the issue of merit and the extent to which women are less skilled. More specifically, is it that women are not as qualified as men to perform certain roles, or that they are not given the same opportunities even though they do, in fact, possess the same skillset and/or experience?
According to a report issued last week by the UK Commission for Employment and Skills (UKCES), women are already ahead when it comes to qualifications, with 38% holding degree-level certifications compared with 36% of men. What’s more, this advantage is set to pick up speed with 49% of women and only 44% of men predicted to hold degree-level qualifications in 2020. Importantly, the prediction is that this gap will result in two thirds of the new high-skill jobs created going to women in the next 6 years.
Commenting on the results, General Secretary of the TUC and UKCES Commissioner Frances O’Grady said:
The fact that skills levels are predicted to increase is welcome news. Skills matter – they increase a worker’s pay, their job satisfaction and boost the economy.
The increased disparity between men’s and women’s skill levels is concerning for both sexes. Men are finding it harder to get skilled jobs, while for many women their higher qualifications are not leading to better pay and jobs.
In other words, the situation is complex and affects both sexes, though perhaps to different extents. Indeed, with so many factors coming into play in recruitment and selection processes, both externally when taking new people on and internally when deciding who to promote or shift to a different position, it is perhaps idealistic and naïve to assume that higher qualifications will ever be directly correlated with better jobs or better pay.
What we can certainly hope is that more training and better skills will contribute to a more vibrant economy overall, which, in combination with a continued shift in attitudes when it comes to gender equality, will give skilled professionals (both men and women) greater opportunities for job satisfaction and career advancement than ever before.
At Findcourses.co.uk, we believe in the power and value of professional development. There wouldn’t be much point in doing what we do if we didn't.
But is there such a thing as too much training?
I ask the question after reading about the Italian doctor who made the headlines last week after an inquiry was launched into his rather impressive working record as a state employee: 15 days in 9 years.
Now, I know what you’re thinking. It’s Italy. Those guys never do much but ride around on their scooters and eat good food. But while there is some truth to that (I am Italian, which gives me the right to say these things), what interested me more were the reasons he had for staying out of the workplace for so long. Aside from the one year of paid family leave and the extended period of sick leave, what took up most of his time were two university training courses that appear to have taken him a total of 8 years to complete. The first lasted from when he started work in 2005 until November 2008 (around 3 years), while the second began upon his return from sick leave in July 2009 and lasted until June this year (around 5 years).
Eight years is a long time. Especially considering that a doctor presumably has to have a degree in Medicine to even get a job as a doctor in the first place. Then again, is it that unreasonable for a 50-year-old professional to feel that he needs to update his skills and knowledge, particularly in the rapidly evolving field of medicine?
Leaving aside the fact that this gentleman may have been using training as an excuse to not have to work, I think this story raises some interesting questions about how much training professionals actually need, as well as to what extent it is the employer’s (in this case the state’s) duty not just to allow for it, but to sponsor it in terms of both time and money.
According to the hospital director, all of the doctor’s actions fell within the rules. Is it just a case, then, of dealing with a poorly designed system? What we don’t know, as of now, is if and to what extent this training will pay off in his future work (presuming he is allowed to keep his job). If it were to make a significant difference to the number of lives this doctor is able to save, for example, would this be worth the investment made? And how would an organisation go about measuring that?
These are all tough questions that L&D professionals deal with on a regular basis. The key here is the issue of measuring results, and the challenge of finding ways to do this as effectively and objectively as possible. What we need are structured, transparent systems that highlight the benefits (or lack thereof) of training in a way that makes it easier to either justify or veto further investments. My suspicion is that, while all businesses will find that there are indeed tangible benefits to be had by training their employees, some will come to the conclusion that an 8-year absence is ludicrous, while others may actually find that it’s worth it.
After all, il mondo è bello perché è vario.
Virus outbreaks inevitably fuel debates about the pharmaceutical industry. And indeed, the current Ebola situation is no exception.
As was the case with bird flu, swine flu and indeed numerous other epidemics, many thorny issues arise and many buttons are pushed when it comes to the availability of potentially life-saving medicines, their cost and their distribution. This in turn stimulates reflection on big pharma in general.
Let’s face it: pharma hasn’t historically had a great reputation. Though pharmaceutical companies work essentially to help people overcome disease, often when they risk losing their lives, there are a number of reasons why the industry is often cast as the bad guy. Among these, profit is probably at the top of the list. Many – including the UK’s top public health doctor, speaking about Ebola – blame the industry for failing to make investments where returns are too small to justify the spending to begin with. The trouble is that an industry as necessary and as rich as pharma has historically always had the freedom to reason that way, where a lot of other traditionally "bad" industries have been forced to adapt and act in a way that is perhaps interpreted as more "moral".
Now, the point of this post is not to make a judgement on whether these criticisms are well-founded or not – that would be way beyond our remit and my own personal comfort zone. The point I want to make is that, right or wrong, good or bad, it seems that even pharma is reaching a crossroads and will have to make some tough decisions in order to remain competitive in the next 5 to 10 years. In fact, PwC’s Pharma 2020 Report warns that “The outlook [for the pharmaceutical industry] has never seemed more promising – or more ominous”.
More specifically, the industry faces big challenges in terms of increasing what have been very low levels of scientific productivity over the past 10 years. In addition, according to the report, pharmaceutical companies need to address the "cultural sclerosis" that is affecting the management culture, mental models and strategies on which the industry continues to rely. Basically, they need to wake up and realise that there are new ways of doing business, and start taking practical measures to break with the "business as usual" approach that they have been lucky enough to get away with until now.
And this reality is in turn inextricably linked to the skills question and the potential for economic growth. In the UK alone, the pharmaceutical sector has consistently generated a greater trade surplus (at £5 billion) than any other industrial sector. But skills are lacking, and the Association of the British Pharmaceutical Industry (ABPI) warns that better regulation and a more science-focused education system will be needed to meet demand and make the change that is so desperately needed possible.
It is in this context that here at Findcourses.co.uk we recently launched a dedicated Pharmaceutical category, in the hope that it will become one of our biggest and most diverse categories in the future – relevant to professionals at all levels and in all sub-disciplines of the industry. Whatever you may think of pharma, it is clear that it is a sector in dire need of talented professionals, and we reckon making training opportunities as accessible as possible will, in the end, work to everyone’s advantage.
Worried about the number of e-mails you will find in your inbox when you return from your holidays? Considering the pace of the modern working environment, combined with our now heavy reliance on e-mail as a form of communication, you’d be suffering from quite a severe case of denial if you weren’t. But there is equally no denying that this is a problematic state of affairs, marring the entire holiday experience with a sense of dread and making it even more difficult for professionals to really "switch off" and relax.
In response to this dilemma, automotive company Daimler has taken action and developed an interesting, if a little unconventional, new approach to dealing with e-mails received while an employee is on annual leave: deleting them.
Yes, any client or colleague trying to contact a member of staff who is on leave will receive a simple automated reply stating:
"I am on vacation. I cannot read your email. Your email is being deleted."
Simple as that. And far from an empty threat, this reply is generated by a piece of software (more or less ironically called ‘Mail on Holiday’) that actually deletes the email in question, making it impossible for the 100,000 or so employees taking part in the programme to ever know of its existence.
The move comes in response to the results of a government-funded research project on work-life balance that was conducted in collaboration with the University of Heidelberg in 2010-2011. Managers in the company are now trained specifically to promote a good work-life balance, encouraging employees to take time off in lieu of any extra hours worked, as well as to set aside periods during which meetings can’t be scheduled. And the auto-delete function is essentially an extension of this principle that workers deserve real time off, particularly as they don’t actually tend to perform any better by working extra hours. On the contrary, countries that discourage overtime are in many cases more productive, with Germany and France – which recently banned after-hours communication in a number of sectors – being prime examples.
Of course, the competition was quick to judge and claim that such a system is not workable in a client-focused industry. But Daimler spokesman Oliver Wihofszki was adamant that: "The response is basically 99% positive, because everybody says, ‘That's a real nice thing, I would love to have that too’".
To be quite fair to the masterminds at Daimler, the automated replies do include the contact information of other colleagues to contact for urgent matters, so it is not the case that clients are left entirely to their own devices during the holiday season. Though some may indeed find it a little irritating, or perhaps even cheeky, I don’t find it difficult to believe that the majority would actually be more envious than anything else! All I can say for the moment is that it seems like a great way to win favour among colleagues while simultaneously drawing attention to their brand. For that, at least, I think Daimler should be congratulated.
Yesterday’s release of A-Level results has created a huge amount of buzz and debate across the web regarding the real value of going to university vis-à-vis opting for a vocational route to employment.
News sites and social media, particularly Twitter, have been rapidly filled with a multitude of different articles and messages that draw attention to the ways in which attitudes are changing with respect to the infamous academic vs. vocational debate. What I mean by this is that, in among the good luck wishes and congratulatory statements, there seemed to be an equally significant number of calls for students – regardless of their results – to really consider their options when it comes to the next step. From job offers to apprenticeship vacancies and further study options, the #notgoingtouni camp seems to be in full gear.
A few striking examples can be found in the following tweets:
Department for Work and Pensions
Just left education? Thinking of working in the UK #tourism industry? Find apprenticeships here #getbritainworking
The CISI (Chartered Institute for Securities and Investment)
Looking for an alternative to uni? What about a financial services #apprenticeship? #alevelresults #notgoingtouni
Considering the Government’s great focus on vocational education, this is perhaps not overly surprising. With university becoming increasingly expensive, the economy recovering and study after study warning of growing skills gaps in key sectors, it arguably couldn’t be a better time to stress the need for hands-on training that leads to competence for a specific job role rather than the broader, knowledge-based study that often characterises a university degree. What will be interesting is seeing whether these vocational programmes are successful not only in providing more skills (rather than knowledge), but also the right kind of skills, which is ultimately what will determine if the UK’s skills gaps can be reduced.
As a final reflection, if this trend picks up speed, what will happen to the traditional university degree? Could it be that, as roles shift and degree programmes become more flexible and easier to integrate with a career, it will be seasoned professionals who start opting for the BAs and MAs while school-leavers take the vocational route? It may sound illogical, but this would perhaps mean getting into a career more quickly and with more practical preparation, and then having the time later on to complement existing experience and expertise with a university-style programme. Would this actually create more well-rounded professionals overall?
Discover 4 important truths about qualifying to perform controlled functions in an FCA-approved firm.