Royston Morgan

The Keyword Density of Non-Sense – by Dr E. Garcia

On March 24 a FINDALL search in Google for keywords density optimization returned 240,000 documents. Many of these documents belong to search engine marketing and optimization (SEM, SEO) specialists. Some of them promote keyword density (KD) analysis tools, while others talk about things like “right density weighting”, “excellent keyword density”, or KD as a “concentration” or “strength” ratio. Others take KD for the weight of term i in document j, and still others propose localized KD ranges for titles, descriptions, paragraphs, tables, links, URLs, etc. One can even find specialists chasing the latest KD “trick” and claiming that optimizing KD values within a certain range for a given search engine affects the way that engine scores relevancy and ranks documents.

Given that there are so many KD theories flying around, my good friend Mike Grehan approached me after Jupitermedia’s 2005 Search Engine Strategies Conference in New York and invited me to do something about it. I felt the “something” should be a balanced article mixing a bit of IR, semantics and math, but with no conclusion, so readers could draw their own. So, here we go.

Background.

In the search engine marketing literature, keyword density is defined as

KD_{i,j} = tf_{i,j} / l    (Equation 1)

where tf_{i,j} is the number of times term i appears in document j and l is the total number of terms in the document. Equation 1 is a legacy idea found intermingled in the old literature on readability theory, where word frequency ratios are calculated for passages and text windows – phrases, sentences, paragraphs or entire documents – and combined with other readability tests.

The notion of keyword density predates all commercial search engines and the Internet, and can hardly be considered an IR concept. Worse, KD plays no role in how commercial search engines process text, index documents or assign weights to terms. Why, then, do so many optimizers still believe in KD values? The answer is simple: misinformation.

If two documents, D1 and D2, consist of 1000 terms (l = 1000) and repeat a term 20 times (tf = 20), then for both documents KD = 20/1000 = 0.020 (or 2%) for that term. Identical values are obtained if tf = 10 and l = 500.
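Equation 1 is trivial to compute; a minimal sketch (the function name is mine, not from the article) shows why the two cases above are indistinguishable by KD alone:

```python
def keyword_density(tf, l):
    """Keyword density per Equation 1: term frequency over document length."""
    return tf / l

# Two documents of very different lengths yield the same KD for a term:
d1 = keyword_density(20, 1000)  # tf = 20, l = 1000
d2 = keyword_density(10, 500)   # tf = 10, l = 500
print(d1, d2)  # both 0.02, i.e. 2%
```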

Evidently, this overall ratio tells us nothing about:

1. the relative distance between keywords in documents (proximity)

2. where in a document the terms occur (distribution)

3. the co-citation frequency between terms (co-occurrence)

4. the main theme, topic, and sub-topics (on-topic issues) of the documents

Thus, KD is divorced from content quality, semantics and relevancy. Under these circumstances one can hardly talk about optimizing term weights for ranking purposes. Add to this copy style issues and you get a good idea of why this article’s title is The Keyword Density of Non-Sense.

The following five search engine implementations illustrate the point:

1. Linearization

2. Tokenization

3. Filtration

4. Stemming

5. Weighting

Linearization.

Linearization is the process of stripping markup tags from a web document so that its content is reinterpreted as a string of characters to be scored. This process is carried out tag by tag, as tags are declared and found in the source code. As illustrated in Figure 1, linearization affects the way search engines “see”, “read” and “judge” Web content, so to speak. Here the content of a website is rendered using two nested HTML tables, each consisting of one large cell at the top and the common 3-column cell format. We assume that no other text or HTML tags are present in the source code. The numbers at the top-right corner of the cells indicate the order in which a search engine finds and interprets the content of the cells.

The box at the bottom of Figure 1 illustrates how a search engine probably “sees”, “reads” and “interprets” the content of this document after linearization. Note the lack of coherence and theming. Two term sequences illustrate the point: “Find Information About Food on sale!” and “Clients Visit our Partners”. This state of the content is probably hidden from the untrained eyes of average users. Clearly, linearization has a detrimental effect on keyword positioning, proximity, distribution and on the effective content to be “judged” and scored. The effect worsens as more nested tables and html tags are used, to the point that after linearization content perceived as meritorious by a human can be interpreted as plain garbage by a search engine. Thus, computing localized KD values is a futile exercise.
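A minimal sketch of the idea, assuming a hypothetical table layout loosely modeled on the quoted sequences (the markup and strings are mine, not Figure 1 itself): a parser that ignores tags sees only the cell text, in source order rather than rendered order.

```python
from html.parser import HTMLParser

class Linearizer(HTMLParser):
    """Strips markup and collects visible text in source-code order."""
    def __init__(self):
        super().__init__()
        self.tokens = []

    def handle_data(self, data):
        text = data.strip()
        if text:
            self.tokens.append(text)

# A hypothetical table-based layout: one wide cell, then two columns.
html = """
<table>
  <tr><td colspan="2">Find Information About Food</td></tr>
  <tr>
    <td>on sale!</td>
    <td>Clients Visit our Partners</td>
  </tr>
</table>
"""
p = Linearizer()
p.feed(html)
print(" ".join(p.tokens))
# Cell order in the source, not the rendered page, determines the sequence.
```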

Burning the Trees and Keyword Weight Fights.

In the best-case scenario, linearization shows whether words, phrases and passages end up competing for relevancy in a distorted lexicographical tree. I call this phenomenon “burning the trees”. It is one of the most overlooked web design and optimization problems.

Constructing a lexicographical tree out of linearized content reveals the actual state and relationship between nouns, adjectives, verbs, and phrases as they are actually embedded in documents. It shows the effective data structure that is being used. In many cases, linearization identifies local document concepts (noun groups) and hidden grammatical patterns. Mandelbrot has used the patterned nature of languages observed in lexicographical trees to propose a measure he calls the “temperature of discourse”. He writes: “The ‘hotter’ the discourse, the higher the probability of use of rare words.” (1). However, from the semantics standpoint, word rarity is a context-dependent state. Thus, in my view “burning the trees” is a natural consequence of misplacing terms.

In Fractals and Sentence Production, Chapter 9 of From Complexity to Creativity (2, 3), Ben Goertzel uses an L-System model to explain that early childhood grammar begins with the two-word sentence, in which the iterative pattern involving nouns (N) and verbs (V) is driven by a rule in which V is replaced by V N (V >> V N). This can be illustrated with the following two iteration stages:

0 N V (as in Stevie byebye)

1 N V N (as in Stevie byebye car)

Goertzel explains, “The reason N V is a more natural combination is because it occurs at an earlier step in the derivation process.” (3). It is now comprehensible why many Web documents do not deliver any appealing message to search engines: after linearization, one may realize that they are “speaking” like babies. [By the way, L-System algorithms, named after A. Lindenmayer, have been used for many years in the study of tree-like patterns (4).]

“Burning the trees” explains why repeating terms in a document, moving around on-page factors or invoking link strategies does not necessarily improve relevancy. In many instances one can get the opposite result. I recommend that SEOs start incorporating lexicographical/word pattern techniques, linearization strategies and local context analysis (LCA) into their optimization mix. (5)

In Figure 1, “burning the trees” was the result of improper positioning of text. However, in many cases the effect is a byproduct of sloppy Web design, poor usability, or improper use of the HTML DOM structure (another kind of tree). This underscores an important W3C recommendation: that HTML tables should be used for presenting tabular data, not for designing Web documents. In most cases, professional web designers can do better by replacing tables with cascading style sheets (CSS).

“Burning the trees” often leads to another phenomenon I call “keyword weight fights”. It is a recurrent problem encountered during topic identification (topic spotting), text segmentation (based on topic changes) and on-topic analysis (6). Considering that co-occurrence patterns of words and word classes provide important information about how a language is used, misplaced keywords and text without clear topic transitions complicate the work of text summarization editors (human or machine-based) that need to generate representative headings and outlines from documents.

Thus, the “fight” unnecessarily complicates topic disambiguation and the work of human abstractors who, during document classification, need to answer questions like “What is this document or passage about?”, “What is the theme or category of this document, section or paragraph?”, “How does this block of links relate to the content?”, etc.

While linearization renders localized KD values useless, document indexing makes a myth out of this metric. Let’s see why.

Tokenization, Filtration and Stemming

Document indexing is the process of transforming document text into a representation of text and consists of three steps: tokenization, filtration and stemming.

During tokenization, terms are lowercased and punctuation is removed. Rules must be in place so that digits, hyphens and other symbols can be parsed properly. Tokenization is followed by filtration, during which commonly used terms and terms that do not add any semantic meaning (stopwords) are removed. In most IR systems the surviving terms are further reduced to common stems or roots. This is known as stemming. Thus, the initial content of length l is reduced to a list of terms (stems and words) of length l’, where l’ < l. These processes are described in Figure 2. Evidently, if linearization shows that you have already “burned the trees”, a search engine will be indexing just that.
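The three indexing steps can be sketched as a toy pipeline (the stopword list is illustrative and the suffix-stripper is a crude stand-in for a real stemmer such as Porter’s):

```python
import re

STOPWORDS = {"the", "of", "a", "and", "to", "in", "is"}  # tiny illustrative list

def tokenize(text):
    """Lowercase the text and split it into alphanumeric tokens."""
    return re.findall(r"[a-z0-9]+", text.lower())

def filtrate(tokens):
    """Drop stopwords that add no semantic meaning."""
    return [t for t in tokens if t not in STOPWORDS]

def stem(token):
    """Toy suffix-stripper standing in for a real stemming algorithm."""
    for suffix in ("ing", "ed", "es", "s"):
        if token.endswith(suffix) and len(token) > len(suffix) + 2:
            return token[: -len(suffix)]
    return token

text = "The optimizers optimized the rankings of the documents."
terms = [stem(t) for t in filtrate(tokenize(text))]
print(terms)  # four terms survive from eight tokens, i.e. l' < l
```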

Similar lists can be extracted from individual documents and merged to form an index of terms. This index can be used for different purposes; for instance, to compute term weights and to represent documents and queries as term vectors in a term space.

Weighting.

The weight of a term in a document is the product of three components: a local weight, a global weight, and a normalization factor. The term weight is given by

w_{i,j} = L_{i,j} * G_i * N_j    (Equation 2)

where L_{i,j} is the local weight for term i in document j, G_i is the global weight for term i and N_j is the normalization factor for document j. Local weights are functions of how many times each term occurs in a document, global weights are functions of how many documents in the collection contain each term, and the normalization factor corrects for discrepancies in the lengths of the documents.

In the classic Term Vector Space model

L_{i,j} = tf_{i,j}    (Equation 3)

G_i = IDF_i = log(D/d_i)    (Equation 4)

N_j = 1    (Equation 5)

which reduces to the well-known tf*IDF weighting scheme

w_{i,j} = tf_{i,j} * log(D/d_i)    (Equation 6)

where log(D/di) is the Inverse Document Frequency (IDF), D is the number of documents in the collection (the database size) and di is the number of documents containing term i.

Equation 6 is just one of many term weighting schemes found in the term vector literature. Depending on how L, G and N are defined, different weighting schemes can be proposed for documents and queries.
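As a sketch of the tf*IDF scheme of Equation 6 (the collection figures below are illustrative, and base-10 logarithms are assumed):

```python
import math

def tf_idf(tf, D, d):
    """Equation 6: local weight tf times global weight log10(D / d)."""
    return tf * math.log10(D / d)

# A term appearing 20 times in a document, within a collection of
# 1,000,000 documents of which 1,000 contain the term:
w = tf_idf(20, 1_000_000, 1_000)
print(w)  # 20 * log10(1000) = 60.0
```

Note that, unlike KD, this weight depends on the whole collection through D and d_i, not just on one document’s length.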

KD values as estimators of term weights?

The only way that KD values could be taken for term weights

w_{i,j} = KD_{i,j} = tf_{i,j} / l_j    (Equation 7)

is if global weights are ignored (G_i = 1) and the normalization factor N_j is redefined in terms of document length

N_j = 1 / l_j    (Equation 8)

However, G_i = IDF = 1 constrains the collection size D to be ten times the number of documents containing the term (D = 10*d), and N_j = 1/l_j implies no stopword filtration. These conditions are not observed in commercial search systems.

Using a probabilistic term vector scheme in which IDF is defined as

IDF_i = log((D - d_i) / d_i)    (Equation 9)

does not help either, since the condition G_i = IDF = 1 then implies that D = 11*d. Additional unrealistic constraints can be derived for other weighting schemes when G_i = 1.
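Both constraints are easy to verify numerically (a sketch assuming base-10 logarithms, which the D = 10*d and D = 11*d conditions imply; the value of d is arbitrary):

```python
import math

d = 500  # documents containing the term (illustrative value)

# Classic IDF = log10(D/d) equals 1 only when D = 10*d:
D = 10 * d
assert math.isclose(math.log10(D / d), 1.0)

# Probabilistic IDF = log10((D - d)/d) equals 1 only when D = 11*d:
D = 11 * d
assert math.isclose(math.log10((D - d) / d), 1.0)

print("IDF = 1 holds only at D = 10d and D = 11d respectively")
```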

To sum up, the assumption that KD values could be taken for estimates of term weights or that these values could be used for optimization purposes amounts to the Keyword Density of Non-Sense.

References

1. Benoit B. Mandelbrot, The Fractal Geometry of Nature, Chapter 38, W. H. Freeman, 1983.

2. Ben Goertzel, From Complexity to Creativity: Computational Models of Evolutionary, Autopoietic and Cognitive Dynamics, Plenum Press, 1997.

3. Ben Goertzel, Fractals and Sentence Production, Chapter 9 of Ref. 2, Plenum Press, 1997.

4. P. Prusinkiewicz and A. Lindenmayer, The Algorithmic Beauty of Plants, Springer-Verlag, New York, 1990.

5. Jinxi Xu and W. Bruce Croft, Improving the Effectiveness of Information Retrieval with Local Context Analysis.

6. Hang Li and Kenji Yamanishi, Topic Analysis Using a Finite Mixture Model.


© Dr. E. Garcia. 2005

http://www.e-marketing-news.co.uk

How to do a SWOT Action Analysis

SWOT (Strengths, Weaknesses, Opportunities and Threats) Analysis is a simple but surprisingly effective technique for assessing an organisation’s positioning and beginning the process of turning general ideas for market growth into actionable activities. This brief guide shows how to extend the simple SWOT concept into a tool for defining the actions needed to deal with external threats and internal weaknesses in the organisation’s capabilities.

The process is best done as a workshop, so organise a team meeting of around 7 to 10 interested parties who are experts or knowledgeable in the domain to be considered.

The process:

Step 1 – First agree the area to be considered and the core assumptions. For example, ‘we will consider the Softhouse organisation and the opportunities to grow the market in the States’.

Step 2 – Use a brainstorming technique such as nominal group technique and ask the team first to think about the chosen area and what the issues in delivering this approach are. Working on their own, they write down what could be the barriers or carriers to entering the new market in the States onto post-it notes, or simply make a list on paper.

Step 3 – They (or the facilitator) place the post-it notes in turn onto the grid as shown in the diagram below – barriers to Threats and carriers to Opportunities.

Step 4 – Brainstorm as in step 2, this time considering the organisation (Softhouse): what are its unique strengths or capabilities, and what are its weaknesses? As before, the team members write their ideas onto post-it notes on their own.

Step 5 – They (or the facilitator) place the post-it notes in turn onto the grid as shown in the diagram – strengths to Strengths and weaknesses to Weaknesses.

Step 6 – The team then consider the crossing points of the SWOT – for example between Threats and Strengths (the top-left box in the diagram) – and think of specific actions to use strengths to counter any threats. These are written onto post-it notes as before and placed in turn onto the grid.

Step 7 – The facilitator tidies up the board, removing duplicates and clarifying actions that have been written down. The board actions are then agreed, prioritized and transferred to a standard action plan template.
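For teams that like to capture the output electronically, the crossing-point logic of steps 3–6 can be sketched as a simple data structure (all factor names and actions below are hypothetical examples, not from the Softhouse case):

```python
# External factors (from step 3) and internal factors (from step 5):
threats = ["New competitor entering the States"]
opportunities = ["Growing demand for Softhouse products"]
strengths = ["Experienced delivery team"]
weaknesses = ["No local sales presence"]

# Step 6: pair each internal factor with each external one and
# record actions at the crossing points of the grid.
grid = {}
for internal in strengths + weaknesses:
    for external in threats + opportunities:
        grid[(internal, external)] = []  # actions to be brainstormed

# e.g. use a strength to counter a threat (the top-left box):
grid[("Experienced delivery team", "New competitor entering the States")].append(
    "Fast-track delivery of a flagship project as a reference win"
)
print(len(grid), "crossing points")  # 2 internal x 2 external = 4
```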

Table One SWOT action analysis

Example Swot Action Analysis

Here is an example taken from an early draft of a business plan to illustrate the completed board. From here the actions can be taken across to a standard action plan template, and owners and timescales applied. Thus, from an initial consideration of the external and internal environment, we can quite quickly move to a position where we can see practical actions to take to move the agenda forward.

Royston

Give an effective conference speech that holds attention

Giving a conference address

I have sat through – and given – a few presentations in my time, so based on that experience I have put together a few tips:

Preparing For The Event

  • Read the proposed conference flyer and match your points to the theme. I sat through an interesting presentation the other day that left me and the people at my table mystified as to how it fitted in with the theme of the conference (it was good though).
  • The flyers can help with the direction of the content – it is always a good idea to discuss the content further with the Conference Producer before you prepare ‘it’.
  • Over two thousand years ago Cicero said that a good speaker learns fast, is knowledgeable and is an expert in the subject – know your subject in depth and provide evidence during your speech that you know what you are talking about.

Content

If you are speaking at a conference attracting senior-level decision-makers from across your sector ask yourself:

  • What do they want to hear?
  • What do you want to say?
  • Where does the crossover lie?

Watch out! – Presentations from speakers who dwell too long on their basic company information are always seen as crude sales pitches – and people switch off. (No more than who you are and what you do, please.)

Be aware of the format of your session

If you are doing a presentation and are using PowerPoint:

  • Use a minimum font size of 18 – better 24+
  • Allow around three minutes per slide (remember no death by PowerPoint!).
  • The Rule of Five – ideally PowerPoint presentations should contain no more than 5 words per sentence and 5 lines per slide (actually, no words is better – just a few pictures).
  • Visuals are often a great way of illustrating your presentation but ‘Keep It Simple’ – too many charts overwhelm a presentation and cannot be read at the back of the conference room.
  • Likewise, avoid over-use of PowerPoint special effects such as zooming – they distract from the presentation.

If you are taking part in a panel discussion prepare:

  • The Chair should contact you approximately 2 weeks in advance of the panel to set the agenda – schedule time to talk to her!
  • You are likely to be asked to spend five minutes setting out your thoughts on the proposed topic.
  • Prepare and memorise this five minute piece and think carefully about what you are going to say (Cicero also recommended memorising your speech).

Practice makes perfect

Rehearse your speech several times preferably in front of an audience who will not fall asleep and who are honest.

And on the day…

…start strong

It is often helpful to memorise the first minute or two of your speech to ease you into it – once you’ve started you’ll find it easier to keep going. Never apologise or spend too much time on inane pleasantries – get down to business. The first minute or two is about establishing rapport with the audience and setting the degree to which they give you authority to speak.

Think about your body language

  • Style and tone of voice account for 90 per cent of communication so adopt a relaxed, confident pose.
  • Maintain eye contact with the audience – select one or two people from the audience to maintain contact but do not stare!
  • If there are lapel mics available, use them – no Al Jolson impressions or shouting into them!

Timings

Watch your timing: never overrun, and finish a few minutes early to allow for questions.

Closing note: on the question of the number of words on a slide – keep this to a minimum, if possible none at all other than an intro slide saying who you are. I was at a resilience conference a few months back where we had an address by a very senior woman from the States who used no slides (or notes) at all, and she held the audience riveted by her authority on the topic. There was a hand-out at the end for notes, but for the duration of the address there were no distractions and we were able to follow the logic closely.

Prince Charles treats us to more nonsense

The Deathly GM Crops and The Half-Wit Prince (Book 8)

Most of the time I regard Prince Charles as an amiable, affable buffoon who talks a peculiar brand of new age sentiment and claptrap and dresses in a quaint Scottish (kilt commando style) way so beloved by our American friends across the water, or who swans down the racecourse in top hat waving to the assembled masses on the rails. This erstwhile Edwardian, I think, at heart harks back to those times when obedient yokels tilled the fields from dawn to dusk and tipped a respectful forelock in his ‘ighness’ direction as he swept by in his carriage to the big house (god bless yer guv), when people knew their place and the beautiful class structure of the realm stood in all its glory whilst he sat at the top of the pile as king (eh not yet the Queen is still very much hanging on ed.) with his subjects arrayed about him.

Now on the subject of GM crops (and about time too!!) HH has actually managed to hit a few (very few) good points but what surprised me about this whole issue was that a national newspaper gave his non-scientific bar room opinions front page coverage. I was actually about to buy a copy of the Telegraph to peruse on the train when I saw he was the lead for the day – this forced me to buy the broadsheet version of the Socialist Tribune (the Guardian) as a substitute so dear readers you can guess this was a serious setback.

As always I am interested in the purpose of these things and not in the content per se, for if I want to hear some claptrap I can always talk to my pocket memo for five minutes then play it back. The point, it seemed to me, was to position Charles as the next ruler and restate the inevitability of a continuation of the stultifying class structure we have in this country with the Windsors at its head: demonstrating that he has thoughtful and erudite opinions (ok that didn’t work ed.) and getting the public to accept and parade his views in an unquestioning way. The writer also sprinkled the article with justifications of why this was an important piece, due to the role HH would play as future monarch etc etc – never questioning the reasoning behind this rationalisation at all.

Often it is refreshing for the basis behind some scientific advances to be critically reviewed as to their consequences and costs – the debate about cloning being an example where there is not much understanding and so very little control. GM crops are a potential benefit to society as a whole, at least in the third world where they don’t have the luxury of choosing ‘organic’ or otherwise as we do in comfortable wet UK – and drought-resistant strains of wheat may indeed be a breakthrough for them – and of course there are always the agribusiness monopolies wanting to maximise their profits, which should be monitored. So there is a basis for debate, which is underway, but these more thoughtful insights do not get airtime, or the grounds of critical debate are undercut by poorly informed, half-understood issues expounded for purposes of publicity and the positioning of a future king.

Royston

Snippet from the Web

Lord Robert Winston, Imperial’s famously moustachioed professor of fertility studies, seems to have got himself into trouble over his comments relating to critics of GM technology.
In a speech at Whittington Hospital (somewhere in North London, apparently) a while back, the celebrity ICSM Prof spoke out against those who criticise any kind of genetic manipulation, saying that many protests were “ill-advised”. He was particularly forthright on Prince Charles, whom he called one of “the most genetically modified people around”.

Government behaving badly on outsourcing contracts

Many of the problems in government outsourcing result from bad behaviour

The boss of outsourcing giant Serco has accused the Government of “behaving badly” by passing off unreasonable contracts to suppliers, ignoring its own guidelines and shrouding its decisions in secrecy. In a Commons hearing on lessons learned from the collapse of Carillion, chief executive Rupert Soames told MPs that a raft of “well run and well respected” outsourcers have lost vast amounts of money in recent years working on government contracts with “unmanageable amounts of risks”.

Mr Soames – a grandson of Sir Winston Churchill – claimed the Government has previously tried to pass off controversial and “unreasonable” contracts to outsourcing firms, while also routinely expecting suppliers to shoulder the risk of major law and policy change.

The recent woes in the outsourcing sector, which led to the collapse of Carillion and forced a number of its rivals to raise emergency capital to bolster their finances, were “astonishing”, he said. “It’s been a massive, massive disruption in the supplier sector, the likes of which I’ve never seen – £8 billion written off of the supplier sector and billions of pounds being raised to recapitalise.” He added: “A lot of this is management’s fault, but … the Government as a monopoly buyer cannot stand idly by and say ‘nothing to do with me, Gov’.”

 

Mitie chief executive Phil Bentley, who was also giving evidence in the hearing, told MPs on the Public Administration and Constitutional Affairs Committee that he believed inaccurate data was also to blame for some failed outsourced contracts and called for greater data sharing and transparency. He gave the example of the asylum seeker contract handled by Serco, which he said saw the numbers of asylum seekers “massively underestimated” and led to hefty losses on the work.

Both bosses said the bidding process was also flawed, with the Government under pressure to choose the cheapest supplier rather than focusing on quality and expertise. Mr Soames added there are “no benefits for good behaviour, and no penalties for bad behaviour” in the process.

The company chiefs said the Government had tried to pass the extra cost of the national living wage on to suppliers in some contracts, while also expecting suppliers to take the hit from any future policy changes arising from Brexit.

 

Article source: https://www.eveningexpress.co.uk/news/business/government-behaving-badly-on-outsourcing-contracts-says-serco-boss/

Cielo is a Leader in recruitment process outsourcing

“As we continue to explore new frontiers in technology, extend our reach to new places around the world and break new ground in the candidate and client experience, we remain committed to maintaining the high quality of service our clients expect from us,” said Sue Marks, Cielo’s Founder and CEO. “Once again being recognized as a Leader by Everest Group and their peers in the analyst community shows sustained excellence even as we focus on growth and plan for future success in a fast-changing market.”

Everest Group’s 2018 Recruitment Process Outsourcing Service Provider Landscape with PEAK Matrix Assessment evaluated 21 established RPO service providers based on the absolute as well as relative year-on-year movement for specific criteria, including market success, scale, scope, technology capability, delivery footprint and buyer satisfaction. The providers were then categorized into three categories: Leaders, Major Contenders and Aspirants. Leaders, like Cielo, were placed in the top quadrant for both market success and delivery capability.

Cielo was highlighted specifically for the launch of Cielo TalentCloud, a suite of three technologies that includes: SkyRecruit, an exclusive CRM platform that provides the most advanced and recruiter-friendly tools for targeting, nurturing and engaging top talent; SkyAnalytics, a platform that provides prescriptive and actionable insights from market and internal data sources; and SkyLabs, an innovation engine whereby Cielo tests and pilots new and emerging technologies, tools and processes to understand how they could (or would not) help clients reach their goals.

“Cielo’s focus on enhancing its technology and developing new and innovative solutions for its customers has enabled it to stay ahead of the competition, which is reflected in Cielo being consistently featured in the Leader’s quadrant in Everest Group’s RPO PEAK Matrix,” said Arkadev Basak, Vice President, Everest Group.

About Cielo

Cielo is the world’s leading strategic Recruitment Process Outsourcing (RPO) partner. Under its WE BECOME YOU™ philosophy, Cielo’s dedicated recruitment teams primarily serve clients in the financial and business services, consumer brands, technology and media, engineering, life sciences and healthcare industries. Cielo’s global presence includes 2,000 employees, serving 154 clients across 92 countries in 36 languages. The industry has verified Cielo’s reputation for executing innovative solutions that provide business impact through numerous awards and recognitions, including its #1 position on the HRO Today RPO Baker’s Dozen listing, PEAK Matrix Leader placement by Everest Group and Industry Leader designation by NelsonHall. Cielo knows talent is rising – and with it, an organization’s opportunity to rise above. For more information, visit cielotalent.com.

Cielo Contact:
Matt Quandt
matt.quandt@cielotalent.com
+1
262-439-1673

 

SOURCE Cielo

 

Article source: http://www.prnewswire.co.uk/news-releases/cielo-recognized-as-a-leader-in-recruitment-process-outsourcing-for-sixth-consecutive-year-on-everest-682146311.html

Government contracts still driven by price

Price still main driver in outsourcing selection

Outsourcing sector bosses have told MPs that the Government’s procurement proposition had gone “too far” in a quest to keep costs down and that the system needs overhauling, in the wake of Carillion’s collapse.

Speaking to a parliamentary select committee on Tuesday morning, Rupert Soames of Serco said that in his four and a half years leading the company, the only contract he could remember winning on any factor other than price was to manage facilities for Barts Health NHS Trust. Mr Soames said this proved that Government outsourcing was still mainly based on cost, rather than the expertise that private companies could offer.

Phil Bentley, chief executive of Mitie who was also appearing before the committee, said: “There’s always this drive to the lowest price as the easier answer.” He added that he thought more conversations between the public and private sector prior to a contract being awarded would help. “Innovation is taken out of the bids because the OJEU rules [for tendering work] are about creating a level playing field,” he said.

The committee was meeting as part of a wider investigation into the way the Government uses the private sector for services such as running schools and prisons, following the collapse of outsourcing company Carillion in January.

Article source: https://www.telegraph.co.uk/business/2018/05/08/outsourcing-bosses-say-government-contracts-still-mostly-awarded/

A crisis in local government outsourcing

News of the latest outsourcing giant to hit choppy waters

Following the collapse of Carillion in January and the losses reported by Capita the announcement of a massive drop in Interserve’s share price comes like the arrival of the proverbial third bus.

And although each company is different they have certain similarities which raise important questions about the balance between the public and private spheres.

All three are – or were, in the case of Carillion – companies spanning the continents and offering services in a dazzling array of sectors.

Capita is very much a child of local government – started back in the 1980s when senior CIPFA staff saw an opportunity to set up on their own and provide outsourced services to councils – but quickly grew into a multinational business operating in Europe, Africa and Asia, with about half its business in the public sector and the other half in the private sector.

Most of Carillion’s business was in the United Kingdom, but it also operated in several other regions including Canada, the Middle East and the Caribbean.

Interserve, the latest to run into trouble, operates in more than 40 countries, providing services to a wide range of industries including oil and gas, civil engineering and construction and providing facilities management at UK embassies throughout Europe.

Business logic might suggest the wide range of skills and experience offered by this kind of international, inter-sectoral organisation can be a big plus. Local government and other parts of the public sector – the NHS, for example – can benefit from the entrepreneurialism and know-how of senior personnel in business. Oil and gas industry executives no doubt have much to offer town hall managers.

But such size and diversity can also be a weakness. Like the Roman Empire, when an organisation becomes too big and geographically spread, it can become difficult for its different wings to co-ordinate and follow the same overall objectives, potentially leading to confusion, duplication and waste. Nevertheless, giant outsourcing companies have become part of the local government landscape and many councils depend on them. Further crises would be bad news for all concerned, not least the employees whose jobs may be threatened.

Unlike Carillion, Capita and Interserve have time to turn their businesses around and look forward to better times. Capita points out that its reported losses were caused by a write-down of goodwill and that its underlying profits actually amounted to £400m.

But taken together the recent spate of crisis stories suggests a picture of local authorities and other parts of the public sector beholden to huge multinationals at the mercy of uncontrollable market forces. It seems to suggest that for all their advantages, massive multi-national conglomerates operating across a wide variety of sectors may not be the ideal partners for the more focused and stability-minded world of local government.


Article source: https://www.localgov.co.uk/A-crisis-in-local-government-outsourcing/45223

Aberdeen council leaders won’t rule out more outsourcing in bid to save £250m

Aberdeen council leaders have refused to rule out outsourcing services.

The authority is aiming to trim around £250 million from its budget in five years as part of a massive restructure and has already proposed cutting 370 jobs through voluntary and early redundancy.

Yesterday, members of the strategic commissioning committee clashed when the opposition SNP group proposed a ban on any council services being outsourced in the future.

Group leader Stephen Flynn said: “I believe we now have the option to draw a line in the sand and tell our staff there will be no more outsourcing of services.”

But Aberdeen Labour’s Sarah Duncan told members she was “not ideological” about the issue – pointing out that services such as road repairs and cleaning were already often carried out by contractors.

The SNP amendment was defeated by six votes to three.

Article source: https://www.eveningexpress.co.uk/fp/news/local/outsourcing-could-be-used-to-make-savings/

What the outsourcing sector can learn from the Carillion collapse

The death of Carillion triggered an outpouring of theories about the shortcomings of public sector outsourcing. But it’s worth remembering that until that fateful moment in January, the industry was basking in a long series of successes. 

At the beginning of the year, 200 small companies a week were signing up to the Contracts Portal, a scheme created by the government for small and medium-sized enterprises (SMEs) to compete with big corporations for outsourcing contracts. The supplier list passed 22,000 members, up 53 per cent in 12 months. Great news. 

The government announced it was on target to get one pound in three of outsourcing contracts awarded to SME bidders by 2022. That’s two years later than planned, but still good news. 

And the Crown Commercial Service (CCS), founded in 2014 to improve the negotiating firepower of the civil service, was growing into a mature, sophisticated troubleshooter. 

All seemed rosy. And then Carillion went bust and the entire industry went into panic mode. Leader of the Opposition, Labour’s Jeremy Corbyn, called it the beginning of the end for outsourcing. Confidence evaporated, and the outsourcing industry has been in crisis ever since. 

Front and centre is the problem of oversized bidders. Yes, the SME bidders were growing in numbers, but mid-sized firms were getting hammered. 

Denise O’Leary, boss of Purpol Marketing, a construction industry bid-writing specialist, witnessed the calamitous erosion of mid-sized bidders. “The government procurement method of placing value on the largest organisation had already driven the collapse of many regional and local contractors from the supply chain,” she says. 

“In the recession, the larger companies chose to win projects at under cost to keep their teams busy, and cash-strapped local authorities took as many savings as they could, even if they recognised that their selection may not be sustainable for the duration of the project.” 

Big bidders built a track record of handling big contracts, despite the wafer-thin or negative margins, leading to further bid wins. Ms O’Leary puts it this way: “Large companies had most evidence of past projects, which were given highest weighting in the assessment model. Many of these had to be in the last five years, so only organisations that had chosen to make a loss and taken jobs then had the case studies to prove their suitability for future ones. This then squeezed more and more contractors out of the supply chain.” It was a recipe for disaster. 

A second glitch in the matrix came from the evaluations used by councils. Helen Randall, partner at law firm Trowers Hamlins, helped negotiate contracts with Carillion and other major public sector contractors. She says the quality component was underrated. “Most PFI [private finance initiative] contracts were evaluated on a 60 per cent quality, 40 per cent price basis. However, experience has shown that if a public authority applies an evaluation ratio where price represents anything more than 30 per cent, then inevitably price will always trump quality,” says Ms Randall. 

Another problem is that falling budgets led to falling expertise. Paul Dossett, head of local government for Grant Thornton UK, saw this at first hand. “Headcount reduction due to prolonged austerity has led to many councils not having the necessary in-house expertise to effectively draw up contracts and procure suppliers, or the capacity and capability to undertake the correct due diligence and effectively monitor contracts once they have been let,” he says. 

His view is supported by Ms Randall. “As someone who has worked in both the public and private sectors, I can see both sides of the picture, but many civil servants haven’t had enough exposure to the business world so they don’t understand how a bid is put together,” she says. “I’m a strong advocate of seconding civil servants into the commercial sector so they can understand how profit is calculated and why you need contracts that will allow contractors to make a profit to stay afloat.” 

In theory, the CCS should swoop in to help beleaguered public entities. But it failed to notice the problems at Carillion. And it arguably failed in its primary duty to help state bodies negotiate viable deals with the company. Worse, claims James Bousher, manager at consultancy group Ayming, the CCS strategy for helping is flawed. 

“The CCS attempts to do this through maintaining their strategic supplier list, but it only contains around 30 suppliers and their impact doesn’t reach far beyond central government,” says Mr Bousher. “There’s no clear path, or necessarily even expectation, for a devolved contracting authority to raise concerns as a warning to the wider public sector. Even then, once bad contracts are identified, the real challenge is working out what to do with them.” 


Other helpful tools are underused. Open book costing, where margins are agreed with the buyer, is promising, but not routine. The British Standard on collaboration (BS 11000) has poor take-up. In theory, it should help contracts be shared between multiple parties, removing the “winner takes all” problem of today. 

In the final analysis, it is worth remembering that most public sector contracts work well. As Mr Dossett of Grant Thornton puts it: “While we tend to only hear about failing outsourcers and failing councils, in reality the majority of councils are well run and most outsourcing contracts deliver what was promised.” 

Carillion should not undermine the role of outsourcing in the public sector. An ecosystem of outsourcers can supply agility, technical expertise, focus and a motivated workforce. But the landscape must change. Mid-sized bidders must thrive. Contracts must include larger provision for quality. The public sector must foster talent experienced in deal-making, and the CCS is critical in that mission. 

Above all, the public sector must learn that outsource partners can’t live on below 3 per cent margins. Cheap does not equal good. If Carillion offers one lasting lesson, let it be that one. 

Article source: https://www.raconteur.net/business/outsourcing-sector-can-learn-carillion-collapse