
Natural Language Processing (NLP) in Recruitment

By James Page | September 9, 2021 | Recruitment, Technology

In my previous article I covered some of the interesting Artificial Intelligence trends I see currently in the recruitment market.

I’d like to dive a little deeper into one of these areas that’s come to the fore recently as we at CloudCall look to extend our offering and provide improved insights for our customers. In this article I’ll examine ways in which natural language processing (NLP) techniques can be used to reveal interesting information. This information may be embedded in your systems but isn’t immediately apparent or relevant. It’s all about understanding your customer’s voice and allowing you to better tailor your solution to meet their needs.

What is Natural Language Processing?

Natural Language Processing (NLP) is a general term for the technologies, processes and approaches used to extract meaning from unstructured text data. This data may be transcripts of phone calls, notes written up about a customer, or an archive of resumes. For text derived from a voice call, a data pipeline is typically used, with an Automatic Speech Recognition (ASR) component that converts the audio into text which the NLP components can then operate on. Regardless of the source format, the outcomes that recruiters are interested in include:

  • Searching for keywords (for example, certain skills or traits)
  • Picking out sentiment from conversations (whether a positive or negative outcome is predicted)
  • Applying domain-specific categories to data.
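As a toy illustration of the first outcome, a keyword search over a transcript can be as simple as matching a skill list against the text. The skill set and transcript below are made up for the example; a real system would use a much richer matcher:

```python
import re

# Hypothetical skill keywords a recruiter might search for.
SKILLS = {"python", "sql", "project management"}

def find_skills(transcript: str, skills: set) -> set:
    """Return the skills mentioned anywhere in the transcript."""
    text = transcript.lower()
    return {s for s in skills if re.search(r"\b" + re.escape(s) + r"\b", text)}

transcript = "I've spent three years writing Python and SQL for reporting teams."
print(find_skills(transcript, SKILLS))  # {'python', 'sql'} (order may vary)
```

Even this naive whole-word match already catches mentions an agent might not have noted down; production systems typically add stemming, synonyms and phrase variants on top.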

Language is a fundamental part of what makes us human. So it makes sense to apply natural language processing techniques to linguistic data in order to better understand our customers and how we interact with them.

Level up your CRM

Imagine for a moment that you’ve been using a CRM for years to keep track of contacts and candidates. However, you have a nagging feeling that valuable information from calls isn’t being captured in the record. For example, when agents converse with clients there may be words or phrases that come up frequently but just aren’t picked up in the call notes (because the agent doesn’t see them as important or doesn’t have time). The ability to capture and add these keywords is an obvious example of using automatic transcription to build links between records and capture important insights.

Keeping audio records of calls is great, but it is difficult for humans to trawl through recordings, especially when listening out for single keywords. Imagine the time you could save if you could get the machine to do this type of work.


CloudCall will soon support automatic transcription of voice calls, allowing transcripts to be associated with call recordings in the CRM. This makes recordings searchable in a way that was previously impossible. Further NLP analysis can then be layered on top of the basic transcript, and call compliance requirements can be met by keeping an accurate record of client conversations in secure storage.
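To sketch how transcript search might work once transcripts are attached to recordings, here is a minimal inverted index over hypothetical call transcripts (the call IDs and text are invented for illustration, and a real system would handle punctuation, stemming and ranking):

```python
from collections import defaultdict

# Hypothetical store: call_id -> transcript text attached to a recording.
transcripts = {
    "call-001": "the client asked about contract renewal next quarter",
    "call-002": "candidate confirmed notice period of one month",
}

# Build a simple inverted index: word -> set of call ids containing it.
index = defaultdict(set)
for call_id, text in transcripts.items():
    for word in text.lower().split():
        index[word].add(call_id)

def search(word: str) -> set:
    """Return the ids of recordings whose transcript mentions the word."""
    return index.get(word.lower(), set())

print(search("renewal"))  # {'call-001'}
```

The point is that once the audio has been turned into text, finding every call that mentions "renewal" becomes an instant lookup rather than hours of listening.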

Diarization explained

Diarization is the automatic labelling of speakers to show who said what, and when. Human conversations are turn-taking activities: we say something to someone, and this elicits a response from the interlocutor. Either the question or statement is understood, or further clarification is requested until the ‘conversational goal’ is achieved. This could be agreeing to meet up for a coffee with a friend, or establishing a date for contract renewal with a client. The steps along the way may not be important in themselves, but their manner and timeliness can be crucial for getting an overview of the tone of the interaction.
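A diarized transcript can be thought of as a list of timed, speaker-labelled turns. The sketch below uses made-up turn data to show one simple statistic such a structure enables, talk time per speaker, which already hints at how balanced a conversation was:

```python
# Hypothetical diarized output: (speaker, start_sec, end_sec, text) per turn.
turns = [
    ("agent",  0.0,  4.5, "Hi, thanks for calling, how can I help?"),
    ("client", 4.8, 12.0, "I'd like to set a date for our contract renewal."),
    ("agent", 12.3, 15.0, "Of course, let me check the account."),
]

def talk_time(turns):
    """Total speaking time per speaker, in seconds."""
    totals = {}
    for speaker, start, end, _ in turns:
        totals[speaker] = totals.get(speaker, 0.0) + (end - start)
    return totals

print(talk_time(turns))  # roughly {'agent': 7.2, 'client': 7.2}
```

The same turn structure supports richer measures too, such as response delays between turns or how often one party interrupts the other.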

You should be aware of customers who are getting frustrated because it’s taking too long to handle their needs; by spotting these situations you can put better training in place and improve future outcomes. Sentiment analysis is a technique that can use cues in the speech stream (tone of voice, periods of silence, over-talk and so on), as well as the transcript itself, to figure out whether the conversation is going well or not. This can be done in real time, giving supervisors the opportunity to intervene if needed, or after the event, building up a record of good versus sub-optimal interactions that is useful for staff training and evaluation.
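As a rough sketch of how such cues might be combined (this is a toy heuristic invented for illustration, not CloudCall's method, and far simpler than a real sentiment model), a call could be scored from a small word list plus silence and over-talk durations:

```python
# Tiny illustrative word lists; real models learn these from data.
NEGATIVE = {"frustrated", "cancel", "complaint", "waiting"}
POSITIVE = {"great", "thanks", "perfect", "agreed"}

def call_score(transcript: str, silence_sec: float, overtalk_sec: float) -> float:
    """Higher score = healthier call; dead air and over-talk pull it down."""
    words = transcript.lower().split()
    score = sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)
    score -= 0.1 * silence_sec   # long silences are a warning sign
    score -= 0.2 * overtalk_sec  # speakers talking over each other
    return float(score)

print(call_score("thanks that was perfect", silence_sec=2.0, overtalk_sec=0.0))
```

Run in real time, a score like this could trigger a supervisor alert; run in batch, it sorts calls into good and sub-optimal piles for coaching.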

Finally, by spotting keywords and phrases in contact calls and linking in sentiment analysis, an aggregated picture of the service can be built up across all channels (voice, chat, email, web, social media and mobile). Not only that, but a fuller picture of the customer journey emerges: a real-time heatmap of customer interactions across the organisation can be built, so that problems are flagged early.
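Aggregation like this can start very simply. Given interactions labelled with a channel and a sentiment (both hypothetical in this sketch, produced by the kinds of analysis above), counting negatives per channel already yields the raw material for a heatmap:

```python
from collections import Counter

# Hypothetical labelled interactions: (channel, sentiment) pairs from upstream analysis.
interactions = [
    ("voice", "negative"), ("voice", "positive"), ("chat", "positive"),
    ("email", "negative"), ("voice", "negative"), ("social", "positive"),
]

# Count negative interactions per channel so problem channels stand out.
negatives = Counter(ch for ch, s in interactions if s == "negative")
print(negatives.most_common(2))  # [('voice', 2), ('email', 1)]
```

Fed continuously, counts like these become the cells of a live heatmap, and a sudden spike in one cell is the early warning the paragraph above describes.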

The future is bright

It’s an exciting time to be applying these new technologies to these problems. By accessing the unstructured data that exists within a call and its associated notes, you gain the ability to find deeper insights about your customers and your touchpoints with them. It’s not possible to fully understand your customers until you can hear their natural voice. Forms and surveys are, by their nature, artificial ways of gathering customer feedback: as soon as you start asking questions you risk leading the user to the answer you’re expecting. By mining the words that customers speak as they interact with your service, you gain insights that are otherwise lost. In this way you can truly discover the so-called ‘voice of the customer’ and act upon it.

James Page

About James Page

James is a Product Manager here at CloudCall, based in the UK. His expertise lies in speech technology and NLP, machine learning, digital media, Martech and networking.