The New Arms Race

When it comes to safeguarding our democracy, much has already been said about the need for more transparency and accountability governing how data can and should be used on digital platforms. Can transparent technology evolve faster than tech used for ill-gotten gains?

With the knowledge that foreign agents used open platforms to influence the 2016 election – and that we must be vigilant to avoid further interference this year – effective methods to combat such actions are top of mind for business, industry, consumers and government officials alike.

As the specifics are better understood, one thing becomes increasingly evident: To achieve this, the technology of transparency and accountability needs to evolve faster than the technology used for corruption and manipulation.

Consider the source

In normal “offline” human discourse, we usually know the person with whom we are speaking and can adequately judge the credibility of the information exchanged. Similarly, when we read a newspaper or magazine, we know the identity of the author, publisher and even the advertisers that support the publication, all of which provide context for gauging the veracity and perspective of the content.

Likewise, to assess the integrity of information in the digital space, knowing the source is crucial. But ascertaining the source of online information can be difficult, and in many cases the capabilities of the technology have outpaced advancements in the laws regulating online activity.

In the case of the US elections, the Federal Elections Commission (FEC) has neither the tools nor the resources to step in to effectively counter bad political actors. Political speech is currently regulated based on attribution of the source of funds. The FEC can easily monitor and police the flow of money in traditional print advertising, radio and television spend.

However, in the digital realm it is much easier to obscure the source of the speech and spend, and it becomes difficult to know where this information originates geographically and organizationally.

Lessons learned

So, how can we forge a path forward? First, we can look to the past.

A couple of years ago, the marketing industry began to face the growing popularity of ad blocking and its threat to the traditional advertising model. In response, key industry associations, including the IAB and DMA, immediately developed codes of conduct and new advertising principles to improve advertising standards and promote consumer choice.

In our current situation, today’s platforms must work together to improve transparency into political advertising and funding and assume accountability for preventing harms. By building ethical guidelines into the design of their technological systems and processes they can not only more effectively vet the intent and origination of the users and advertisers that come to their platform, but they can equally empower consumers to make data-driven decisions about the source of the information they consume there.

Considering the global nature of today’s leading online platforms, embracing data ethics by design can help guide and ensure the continued success of these digital companies. By defining and adhering to core values that promote the ethical use of data, companies can self-regulate and gain the trust of the people they serve. With greater clarity and transparency to the sources of information, this goal is attainable.

Act or be acted upon

It was foreseeable that bad actors who wanted to influence our electoral process would be able to do so using today’s technology platforms, including news websites and other publishing platforms that provide a setting where groups of like-minded individuals can converge. Acknowledging what occurred after the fact, however, has limited value. It is equally foreseeable that the federal government will attempt to regulate data to prevent future corruption or technological misuse, and if so, we should support appropriate regulation.

The leaders of digital platforms have far more power and authority over their technology than governments do and should act aggressively to clearly define methods of bringing transparency and accountability into the platforms we use for everyday communication to prevent nefarious misuse.

This article first appeared in AdExchanger on Thursday July 24th, 2018

Collateral Insights: Turning Performics Into a Consumer Insights Engine

Post by Esteban Ribero, SVP, Planning & Insights

As search marketers, we have been driving an insights-generating machine for quite some time: a machine that produces insights about human behavior, and consumer behavior in particular. We just did not know it.

People turn to search engines with clear intentions. They are looking for guidance, they are looking to explore and get inspired, they are looking for specific details as they evaluate their options, and, of course, they are there to take action: buying, signing up for a service, finding a coupon, getting directions. We all know that, but can we capture that behavior and see it happening over time? Wouldn’t it be great to show our clients how consumers are approaching their category, when there is a rise in exploratory behavior, when there is deep evaluation, or when we are most likely to catch them with a buying mindset? We can now.

Performics is now joining the growing business trend of finding alternative uses for the data collected in the course of normal business. Every day Performics collects millions of data points indicating how many clicks and conversions we got from our bidding activities across tens of thousands of keywords in each of our accounts. We typically look at the data from a performance perspective to see how efficient we are at converting demand into revenue. But what if we decided to look at the same data with a different lens? A human lens? Wouldn’t certain keywords indicate that consumers are exploring, others that they are looking for guidance and understanding, and yet another set that they are evaluating options but not ready to buy yet? At Planning and Insights, we think so, and we have created a methodology and the infrastructure that allow us to read the massive amount of data we collect every day with a human lens and see actual consumer behavior happening before our eyes.

Our Intent Scoring Algorithm allows us to tag the keywords in our search accounts with the mindset that the searcher was in when typing those keywords. It is not science fiction; it is a simple principle: people pay attention to tangible features of reality as they get closer to accomplishing a goal and they use more concrete language when talking or thinking about their goals than when they are more distant from them. This is a phenomenon well established in psychology; we just turned that insight into a technology to score search terms based on their level of concreteness. Once we code the terms we can aggregate them into meaningful buckets according to the psychological distance from the goal. The different buckets represent different mindsets and intentions. This is the set of lenses we use to look at the aggregated activation data generated by our daily operation and collect the collateral insights about consumer behavior that are waiting to be uncovered.
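The algorithm itself is proprietary and the post doesn't publish it, but the underlying principle, scoring search terms by linguistic concreteness and bucketing them by psychological distance from the goal, can be sketched in a few lines. Everything below is illustrative: the lexicon values, thresholds and bucket names are invented for the example.

```python
# Illustrative sketch only: the real Performics algorithm is proprietary,
# and this lexicon, its ratings and the thresholds are all invented.
CONCRETENESS = {
    "best": 2.0, "ideas": 1.5, "inspiration": 1.8,
    "size": 4.0, "10": 5.0, "buy": 4.5, "mens": 4.2, "nike": 4.9,
}

def concreteness_score(keyword):
    """Average concreteness of the words in a search term; unknown
    words get a neutral 3.0."""
    words = keyword.lower().split()
    return sum(CONCRETENESS.get(w, 3.0) for w in words) / len(words)

def intent_bucket(keyword):
    """Map the score to a mindset bucket: abstract language suggests
    early exploration, concrete language suggests readiness to act."""
    score = concreteness_score(keyword)
    if score < 2.5:
        return "exploring"
    if score < 4.0:
        return "evaluating"
    return "ready to act"

print(intent_bucket("best gift ideas"))        # abstract -> exploring
print(intent_bucket("buy nike mens size 10"))  # concrete -> ready to act
```

A real implementation would swap the toy lexicon for a published concreteness norm list from psycholinguistics research and tune the thresholds against labelled search data.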


The Intent Scoring Algorithm sits at the core of Intent-Based Planning and our ability to bring to life the human being behind the clicks. It has been said that we are in the era of data-driven storytelling. I would add that, to do it properly, we also have to be story-led data crunchers, and that is exactly what we are doing.

To learn more, contact Performics today.

The post Collateral Insights: Turning Performics Into a Consumer Insights Engine appeared first on Performics.

Playing detective: how to identify bad backlinks

I completed a backlink audit recently, and this is the post I wish I’d had when starting the tedious task of identifying the nasty links. Not all dodgy links are obvious; heck, some are near-impossible to find, especially when you have a spreadsheet containing thousands of them.

This is not a post about how to do a backlink audit from A-Z – that’s already been written about loads of times. Instead, I’m going to take you through how to identify patterns in your backlink data to quickly and accurately uncover spammy links.

I’ve written this post for the greater good of all SEOs, and yes, you’re welcome.

Wait – do I even need to do a backlink audit?

There has been some confusion since the last Penguin update as to whether or not SEOs even need to carry out backlink audits anymore. After all, Google has said that now they only devalue spam links as opposed to penalising the site receiving them, right?

Well, the short answer is: yes, you probably should continue to carry out backlink audits and update your disavow file. You can read more about this here and here.

Why can’t I just use an automated tool to find the bad links?

I know it’s tempting to get automated backlink tools such as Kerboo to do all the heavy lifting for you. Unfortunately, though, this isn’t a great idea.

In the backlink audit I did recently, 93% of porn links were assigned a link risk of ‘neutral’ with a score of 500/1,000 (0 being the safest link and 1,000 being the riskiest). Links from the BBC also received a ‘neutral’ rating, with some getting a higher risk score than the porn links! Go figure.

Automated backlink tools can be super valuable; however, this is because of all the data they draw together into a single spreadsheet, as opposed to them being particularly accurate at rating the risk of links. To rely solely on their link risk metrics for your backlink audit is a quick ticket to trouble.

Is this guide relevant to my site?

This post is not a ‘one-size-fits-all’ strategy for a backlink audit, so please use your common sense. For example, below I note that root domains containing the word ‘loan’ are generally indicative of unscrupulous sites. However, if you’re doing a backlink audit for a financial services firm, then this generalisation is less likely to apply to you.

It’s up to you to think about the guidelines below in the context of the site you’re auditing and to adjust accordingly.

You will need

Before you start, you will need to have all your backlinks neatly assembled in a spreadsheet along with the following information:

  • URL (one example per linking root domain)
  • root domain
  • anchor text
  • citation flow (Majestic) or domain authority (Ahrefs or Moz)
  • trust flow (Majestic) or domain trust (Ahrefs or Moz)
  • backlinks
  • IP address
  • title
  • page language
  • link location
  • and anything else you can think of that could be useful

This article can bring you up to speed if you’re not sure how to assemble this data. Make sure to combine data from as many sources as possible, as different SEO tools will contain different information and you don’t want to miss anything! As I said earlier, I would also recommend Kerboo as one of your data sources, as it pulls a lot of the information you could want into one place.

How to spot the patterns

Fortunately for us, the bad guys almost always do their dirty work in bulk, which makes life easier for us good guys who inevitably have to clean up after them. It’s rare to find one dodgy directory submission or a single piece of spun content containing a paid link. This is a massive help – use it to your advantage!

Pivot Tables

I highly recommend creating a pivot table of your data so that you can see how many times an issue has occurred in your data set. This can help you to quickly spot patterns.

Above: spotting suspicious anchor text using a pivot table

For example, let’s say you’re doing a backlink audit for a clothing site. By pivoting for anchor text, you might be able to quickly spot that ‘buy cheap dresses’ appears several times. Given the commercial nature of this anchor text, it could well be spam. You could spot-check some of these URLs to make sure, and if they’re consistently dodgy, you can reasonably assume the rest of the links with this anchor text are too.
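If you prefer scripting to spreadsheets, the same pivot can be reproduced in a few lines of Python with `collections.Counter`; the anchor texts below are made up for illustration.

```python
from collections import Counter

# Hypothetical anchor-text column pulled from a backlink export.
anchors = [
    "buy cheap dresses", "Acme Clothing", "buy cheap dresses",
    "www.acme.example", "buy cheap dresses", "click here",
]

# The pivot-table equivalent: count occurrences of each anchor text.
anchor_counts = Counter(anchors)

# Surface the most frequent anchors for manual review.
for anchor, count in anchor_counts.most_common(3):
    print(f"{anchor!r} x{count}")
```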

Above: putting together a pivot table to spot anchor text frequencies (view large version of gif)

Word Clouds

Another thing I like to do is to dump my data into a word cloud generator. This is useful because it visualises the data (the bigger the word, the more times it appears in your dataset). It can help me to quickly catch something that looks like it shouldn’t be there.
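A word cloud is just a frequency count drawn prettily; the counting step looks like this (the titles are invented examples), and the resulting frequencies can be fed to any word cloud generator.

```python
import re
from collections import Counter

# Invented page titles standing in for a backlink export.
titles = [
    "Acme Clothing - Summer Sale",
    "Cheap Louis Vuitton Belts Outlet",
    "Cheap Nike Shoes Outlet Store",
]

# Tokenise and count: this frequency table is what a word cloud draws.
words = re.findall(r"[a-z]+", " ".join(titles).lower())
freq = Counter(words)
print(freq.most_common(3))
```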

Keeping on top of your data

Make sure you make a note as you work that explains why you’ve decided to disavow a set of links. It helps not just at the end when you’re reviewing your links, but will also be a big help when you come to spot patterns. It will also stop you from revisiting the same links multiple times and asking yourself ‘why did I decide these were bad links?’

Above: screenshot from my recent backlink audit with ‘action’ and ‘reason’ columns

Examples of common patterns to find bad backlinks

I’m now going to give you specific examples of bad links which you can use to find patterns in your data.

It’s not always a clear-cut answer as to whether a link is spam or not, however, the guidelines below should help guide you in the right direction.

When you’re unsure about a link, ask yourself: ‘if it wasn’t for SEO, would this link even exist?’

Words to look for in the root domain or URL

X-rated words in the URL

You’ll immediately want to disavow (unless of course, these are relevant to your site) any x-rated links. These usually contain one of the following terms in their URL:

  • porn
  • xxx
  • sex (also sexy can result in some shady sites)
  • escort
  • dirty
  • adult
  • and any more dodgy words you can think of that relate to orgies, orgasms and other obscenities

Be careful not to accidentally disavow URLs where ‘sex’ appears in the middle of an innocent word. This will require some manual spot checking.
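A rough way to automate this check, while keeping false positives like ‘sussex’ out, is to require a word boundary before the ambiguous term. The term list and domains below are illustrative, not exhaustive.

```python
import re

# Illustrative term list; extend it with your own flag words.
XRATED = ["porn", "xxx", "sex", "escort", "adult"]

def flag_xrated(domain):
    """Return the flagged term found in the domain, or None.
    'sex' is only matched at the start of a token so that innocent
    domains like sussex-news.example are not caught."""
    d = domain.lower()
    for term in XRATED:
        if term == "sex":
            if re.search(r"(^|[^a-z])sex", d):
                return term
        elif term in d:
            return term
    return None

print(flag_xrated("sussex-news.example"))   # None: 'sex' is mid-word
print(flag_xrated("sexy-dating.example"))   # 'sex'
```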

Root domain contains references to directories & listings

Next, you want to look for any URLs that indicate manipulative SEO link-building tactics. Directories are an obvious example of this, and while not all directories are bad (here is a good article on how to tell the difference), generally those created purely for link-building purposes contain the following words in the root domain:

  • ‘directory’ – especially ‘dir’ and ‘webdir’
  • ‘links’ – especially ‘weblinks’, ‘hotlinks’ or ‘toplinks’
  • ‘listings’

You might notice I’ve specifically said ‘root domain’ as opposed to ‘URL’ here. There is a reason for this: you might find lots of URLs in your dataset where ‘links’ is in the URL path. As a general rule, these are a lot less likely to be manipulative links. A directory-style word in the root domain itself is the red flag; the same word in the path of an otherwise normal site usually isn’t.
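This root-domain-versus-path distinction is easy to encode. The sketch below uses a deliberately crude ‘last two labels’ heuristic for the root domain (it mishandles compound TLDs like .co.uk), and the word list and domains are examples.

```python
from urllib.parse import urlparse

# Words that suggest a link-building directory (illustrative list).
DIRECTORY_WORDS = ["directory", "dir", "weblinks", "hotlinks",
                   "toplinks", "listings"]

def suspect_directory(url):
    """Flag only when a directory-style word sits in the root domain,
    not merely somewhere in the URL path. The 'last two labels' root
    heuristic is crude and mishandles TLDs like .co.uk."""
    host = urlparse(url).netloc.lower()
    root = ".".join(host.split(".")[-2:])
    return any(word in root for word in DIRECTORY_WORDS)

print(suspect_directory("http://freewebdir.example/add-url"))      # True
print(suspect_directory("https://news.example/links/resources"))   # False
```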

Root domain contains references to SEO

You’ll also find that if the root domain contains SEO or web-related terms, it’s likely it exists simply to serve the purpose of building links. Look out for the following words in the root domain:

  • ‘domain’
  • ‘search’
  • ‘seo’
  • ‘web’

Bear in mind that lots of sites have ‘search’ pages, so your best bet is to focus on the root domain for this to be an indication of anything suspect.

Content farms are another common feature of a poor backlink profile. Look for any domains that contain ‘article’.

Other dodgy root domains

The following keywords in the domain are usually indicative of dodgy link-building practices:

  • ‘cash’
  • ‘loan’
  • ‘com’ – appearing within the domain name itself (yes, really)
  • ‘world’
  • ‘ads’

Root domain contains consonant or number clusters

Another obvious sign is any root domain which simply does not make sense. You’ll likely have lots of domains linking to your site consisting of jumbles of consonants and numbers. Watch out for domains like these, as more often than not they are low quality.

You can easily find URLs like this by sorting your root domain column from A-Z. You will find that:

  • any domain starting with a number will appear at the top of your list.
  • scrolling to the bottom to letters x, y and z usually throws up lots of domains with consonant clusters that do not make sense.
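Both of those sorting tricks can be automated with a small heuristic: flag domains that start with a digit or contain a long consonant run. The thresholds below are arbitrary and worth tuning against your own dataset.

```python
import re

def looks_gibberish(domain):
    """Heuristic: flag domains that start with a digit or contain a run
    of five or more consonants. Thresholds are arbitrary; tune them."""
    name = domain.lower().split(".")[0]
    if name[:1].isdigit():
        return True
    return re.search(r"[bcdfghjklmnpqrstvwxz]{5,}", name) is not None

print(looks_gibberish("xkcdqwzv.example"))   # True: consonant cluster
print(looks_gibberish("clothing.example"))   # False
print(looks_gibberish("123loans.example"))   # True: leading digits
```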

The TLD is uncommon

Uncommon TLDs are usually indicative of dodgy sites. Any site worth its salt will try to obtain the .com, .net, .org, .edu or relevant country ccTLD for its domain name. The less common TLDs suggest a lower-quality site; those I found attached to spammy sites in my most recent backlink audit were:

  • .biz
  • .casino
  • .clothing
  • .ooo
  • .properties, etc
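A simple allowlist check can queue the uncommon TLDs for review; which TLDs count as ‘common’ is a judgment call, so the set below is only a starting point.

```python
# TLDs treated as trustworthy by default; everything else is queued
# for manual review. Adjust the set for your market.
COMMON_TLDS = {"com", "net", "org", "edu", "gov", "co.uk", "de", "fr"}

def uncommon_tld(domain):
    parts = domain.lower().split(".")
    tld = parts[-1]
    two_level = ".".join(parts[-2:])  # catches compound TLDs like co.uk
    return tld not in COMMON_TLDS and two_level not in COMMON_TLDS

print(uncommon_tld("example.biz"))    # True: queue for review
print(uncommon_tld("example.co.uk"))  # False
```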

Looking at titles for further clues

When the domain name or URL isn’t particularly insightful, the page title is the next place to look. Look out for the same keywords listed above, as well as the following phrases:

  • ‘most visited web pages’
  • ‘reciprocal links’
  • ‘link partner’
  • ‘link exchange’
  • ‘seo friendly’

Another clue is to find any site titles that are completely unrelated to the niche of your site. Titles that contain commercial terms are particularly suspect, such as:

  • ‘louis vuitton belts’
  • ‘nike shoes’

As I mentioned before, bad backlinks often operate in bulk, and there’s nothing like a load of duplicate titles to lead you hot on the heels of a group of spammy URLs.

What can anchor text tell us?

Is it keyword-heavy?

A popular SEO tactic in the pre-Penguin days was to link to your site with keyword-heavy or commercial anchor text, such as ‘cheap red dresses’.  Make sure to put together a pivot table of your anchor text so you can quickly scan for any recurring anchor text that looks suspiciously well-optimised and check out these links to see if they’re legit – they probably aren’t.

Does it make sense?

In addition, any anchor text that simply doesn’t make any sense or is completely unrelated to the site you’re auditing is highly likely to be low quality.

Is the language consistent with the rest of the page?

Finally, any anchor text that is in a different language to the rest of the content on the page is likely to be a paid link. You can use the ‘language’ column (provided by Ahrefs and Kerboo) to see what language the page is in, and you can compare this to the language of the anchor text of your links. Anywhere where there is a mismatch is likely to be suspicious.
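With the page-language column exported alongside a guessed anchor-text language, the mismatch check is a one-line filter; the rows below are invented for illustration.

```python
# Invented rows standing in for a backlink export with a page-language
# column (as in Kerboo/Ahrefs) plus a guessed anchor-text language.
rows = [
    {"url": "http://a.example", "page_lang": "en", "anchor_lang": "en"},
    {"url": "http://b.example", "page_lang": "ru", "anchor_lang": "en"},
]

# Any page whose anchor language differs from the page language is
# worth a manual look: it may well be a paid link.
mismatches = [r["url"] for r in rows if r["page_lang"] != r["anchor_lang"]]
print(mismatches)
```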

Duplicate root IP address

Pivot your data to see if several domains share the same IP address. If a block of URLs shares the same IP address and one of them is spammy, it’s likely that the rest are too.

Make sure to do a manual spot check of the sites to make sure you’re not disavowing anything harmless. For example, sites on large free hosting or blogging platforms are commonly hosted at the same IP address, and many of these will be harmless.
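Grouping by IP is another simple pivot; in Python it’s a dictionary keyed by IP. The domain/IP pairs below are made up (the IPs come from documentation ranges).

```python
from collections import defaultdict

# Invented (domain, IP) pairs; the IPs are documentation addresses.
links = [
    ("spammydir1.example", "203.0.113.7"),
    ("spammydir2.example", "203.0.113.7"),
    ("bbc.co.uk", "198.51.100.1"),
]

by_ip = defaultdict(list)
for domain, ip in links:
    by_ip[ip].append(domain)

# IPs hosting more than one linking domain deserve a manual spot check.
shared = {ip: doms for ip, doms in by_ip.items() if len(doms) > 1}
print(shared)
```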

Where on the page is the link located?

In many backlink reports, there’s a column which tells you where on the page the link is located. In Kerboo, this column is called ‘link section’, and it’s another nifty tool for us to use in our hunt for dodgy links. Filter this column for keywords contained in the footer and sidebar to see if there are any which look suspicious on opening the page.

Footer and sidebar links are prime locations for dodgy backlinks. Why? Because these are site-wide, they are often targeted for paid link placements as the recipient of the link can often benefit from the most link equity in this way.

In addition, if the link is providing no value to users on the site (for example, if it’s completely unrelated to the site content, which is likely if it’s a paid link) then the footer is a useful place to essentially ‘hide’ the link from users while still providing link equity to the recipient.

Where is the link pointing to?

In the ‘link to’ column, look out for links pointing to the ‘money pages’ on your site – these are any pages which are revenue-drivers or particularly important for other reasons, such as product pages or sign-up pages.

It’s natural in a backlink profile to have the majority of links pointing to your homepage; this is where most people will link to by default. It’s much harder to build links to pages deeper in a site, especially product pages, as it’s not particularly natural for people to link here.

By casting an eye over links which point to money pages, you may well spot a few suspicious links that were built specifically to boost the rankings of important pages on your site.
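A quick filter on the ‘link to’ column surfaces the links aimed at money pages for manual review; which URL patterns count as ‘money pages’ is site-specific, so the ones below are placeholders.

```python
# Site-specific URL patterns marking revenue pages; these are placeholders.
MONEY_PAGES = ["/product/", "/signup"]

# Invented 'link to' column from a backlink export.
links_to = [
    "https://site.example/",
    "https://site.example/product/red-dress",
    "https://site.example/blog/lookbook",
]

# Links pointing straight at revenue pages get a closer manual look.
to_review = [u for u in links_to if any(m in u for m in MONEY_PAGES)]
print(to_review)
```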

Taking things to the next level

All the tips I’ve shared with you so far have involved mining data that is easily accessible to you in your backlink spreadsheet – things such as root domain, URL, page title and anchor text.

To take your backlink audit up a level, it’s time to get savvy. This is where Screaming Frog comes in.

Using Custom Search to spot link directories

You know how earlier we mentioned that not all directories are bad? Well, an easy way to spot if a directory exists solely for link-building purposes is to see if the page contains phrases such as ‘submit link’, ‘link exchange’ or ‘add your site’.

These telltale phrases will not necessarily be in the URL or page title of your link, so this is why it’s necessary to take things up a step.

To find pages which contain these terms, you can run a crawl of your backlink URLs using the Screaming Frog Custom Search feature.

Above: using Screaming Frog ‘Custom Search’ to find web pages containing suspicious text

Once the crawl is finished, you can then download the URLs that contain the phrases above. These will most likely be some obvious link directories that you’ll want to disavow pretty sharpish.
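If you can’t run a Screaming Frog crawl, the same custom-search idea is easy to reproduce locally: fetch each backlink URL and scan the HTML for the giveaway phrases. The scanning step, with an illustrative phrase list, looks like this:

```python
# Giveaway phrases for link-building directories (illustrative list).
TELLTALE = ["submit link", "link exchange", "add your site"]

def telltale_phrases(html):
    """Return the giveaway phrases present in a page's HTML: a local
    stand-in for Screaming Frog's Custom Search, which crawls for you."""
    text = html.lower()
    return [p for p in TELLTALE if p in text]

print(telltale_phrases("<h1>Web Directory</h1><p>Add your site free!</p>"))
print(telltale_phrases("<p>About our company</p>"))
```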

Using Custom Search to spot spun content

The Screaming Frog custom search feature isn’t just useful for finding directory links. This is where you really need to put on your detective hat and to have a good think of any patterns you’ve noticed so far in your backlink audit.

When I did my audit recently, I noticed a recurring theme with some of the paid links. There were links to other sites with commercial anchor text that kept appearing alongside the link to the site I was auditing. This was a piece of spun content that had been copied and pasted across multiple sites and forums, and whoever had done the work was clearly being lazy, lumping a load of unrelated links together in one paragraph.

Apart from the fact the text made no sense whatsoever, the anchor text of these other links was extremely commercial: ‘cheap nike free run 2 for men’ and ‘chanel outlet UK’ were a recurring theme.

Above: example of spun content that appeared in my recent backlink audit

I’d tried to find a pattern in the URLs or titles of these pages, but it was a bit hit and miss. It was then that I realised I could do what I had done to find the directory links – Screaming Frog custom search.

I therefore carried out a Screaming Frog crawl that looked for recurring anchor text such as ‘cheap nike’ and ‘chanel outlet’ to identify any URLs that I hadn’t yet uncovered. It was extremely useful and allowed me to find some URLs that, up to that point, I had been unable to identify from the data in my spreadsheet alone.

To wrap up

If you’ve made it this far, congratulations! I appreciate this post was a lot of writing, but I hope it’s really helped you to dig out any dodgy links that were lurking under the surface.

If there’s one thing to take away, it’s to look for any patterns or consistencies in the dodgy links that you find, and to then use these to dig out the less obvious links.

Do you have certain criteria that you find helpful when identifying bad backlinks? Comment below!

Saatchi & Saatchi Synergize Team Rebrands as Performics South Africa

Performics welcomed the Saatchi & Saatchi Synergize team to its network on July 23, 2018, rebranded as Performics South Africa. This new Performics team is a digital marketing agency delivering performance marketing as well as creative and strategic services to a diverse range of clientele, both nationally and internationally. Globally, Performics offers display, search, performance content, social and programmatic buying campaigns to clients.

Headquartered in Cape Town, Performics South Africa plugs its expertise into the greater Publicis Groupe Africa network, resulting in a 360-degree, fully integrated capability that allows the agency to compete at the top tier of the advertising and marketing industries.

This award-winning team brings a diverse range of capabilities to the Performics brand, including several proprietary search methodologies. The team’s worldwide language localisation capability includes over 110 languages and dialects, alongside proven performance on both local and global multinational accounts.

Through the global Performics network, South African clients will benefit from the following advantages:

  • Strategy – Performance marketing is seen by Performics as an integral part of the media planning and buying process.
  • Specialists & tools – Performics has an experienced team of specialists who work with best-in-class tools at a global level.
  • Global thought leadership and proven case studies
  • Size – Performics has a global presence.


About Performics 

As the original performance marketing agency, Performics is the premier revenue growth driver for many of the world’s most admired brands. Across an expansive global network operating in 57 countries, Performics leverages data, technology and talent to create and convert consumer demand wherever it is expressed—search, social, display, commerce and offline channels. Performics is built for the relentless pursuit of results. Headquartered in Chicago, Performics is a Publicis Media company. To learn more, visit

For further information, please contact:

Jason Smit
Managing Director
The Harrington, 50 Harrington House, Cape Town, 8000
T : +27 21 413 7500  M : +27 82 308 1782

The post Saatchi & Saatchi Synergize Team Rebrands as Performics South Africa appeared first on Performics.

Aesop Storytelling Series; Season 2 Episode 2

A short history of stories

There wasn’t a lot to do in the Stone Age. Bit of hunting, bit of gathering. The usual. That’s why some bright hairy spark came up with storytelling: not only was it a great way of relaying info, it was fun too. They made ‘em catchy so people would remember them.

Some say part of the reason our Neanderthal cousins weren’t as successful as us was because they had no stories. They could communicate, but because they couldn’t tell stories they couldn’t easily pass on vital information to others, including the next generation. Li’l Neanderthal kids couldn’t improve on their parents’ techniques, they just had to start over again from scratch. Not only that, no stories meant no societies, no belief systems, no sense of shared identity. It was a big bad world and they couldn’t cut it. Homo sapiens on the other hand were a dab hand. And the rest, I guess, is history. As in it actually is history.

Stories, like your first sexual experience, began around the campfire. Stories were passed down through poetry or song, from one generation to the next. But as time went on, the tales got taller. You just couldn’t rely on them. So someone invented writing which was obviously great. The only problem was that not everyone could write. Writing, and therefore the flow of information itself, belonged to the elite. It’s what allowed feudal society to flourish for centuries: keep the masses stupid and they’re a lot easier to deal with.

Then you get ol’ William Caxton, and Gutenberg, and all those other white men who took movable type mainstream (they’d been doing it all over East Asia for centuries but it just hadn’t really taken off). That hip little religious document only you knew about? Now it was all over the place. Monks were spitting out their flat whites in disgust.

Soon just about anyone with some knowhow, money and chutzpah could start distributing their own information on a relatively massive scale. Books, which had once been so heavy and so expensive they had to be chained to the library, and came in any quantity you liked as long as it was one, could now be passed from person to person, or even sent across the ocean. People, who up until then had just been informed (if they were lucky), were now connected. Feudalism was out: it was time to wave hello to parliamentary democracy. (Most of the time. With a few caveats. And not everywhere, obviously).

People just couldn’t get enough of being connected. And about a million inventions, tweaks and minor electric shocks later, we had kit that could send light and sound thousands of miles in a matter of seconds. Remember our friend Disney from Episode 1? He was a man who understood another milestone in this tale: mass audio-visual communication. Radio. Cinema. TV. Mass A-V made it even easier for popular movements to spring up, everything from Martin Luther King’s speeches to Stalinist propaganda. Take-out: use it wisely.

And of course, this kind of mass media heralded the golden age of advertising—but sadly for us, it wasn’t to last. People got bored of just being connected. The next step was to become empowered. More overflowing wastepaper baskets, and we’d managed to invent the internet. Creating and disseminating your own information was now not only free, you could do it anywhere. It was about ten times easier than replacing the cartridge on a fountain pen, let alone setting up your own printing press or inventing an alphabet.

Now blockchain, subject of a thousand lines of LinkedIn Vogon poetry, is about to ratchet that up a gear. A distributed network will allow us to interact with information in a way that will make us, like the title of this series suggests, hyper-connected, and therefore hyper-empowered.

More than this, the crazy march of technology doesn’t replace one way of telling stories with another, it just creates more. The pen wasn’t replaced by the printing press, and the computer didn’t replace broadcast TV. And once we have blockchain, or whatever the hell turns up next and screws around with the order of things, we’ll still have all the rest of that shit (yes, you do still have to write your nan a thank you note is what I’m saying). The world just keeps getting more complex, and as adwomen and admen we have to deal with that world. People went from doing what they were told to being able to do something about it—and reaching them is, if not quite impossible, then a total ball-ache.

Now that the information flow was no longer a one-way street, brands had to go from interrupting consumers’ lives to trying to infiltrate this hyper-communicated network: creating stuff people would seek out and want to consume. Content marketing, ya-dee ya-da, heard it before—but data made it a bit more fun. Data meant that you could get the right message (bag of monster munch), to the right person (drunk man), in the right place (outside the takeaway), at the right time (when the takeaway has just closed). But without something to hold it all together, this message is about as useful as, well, a bag of monster munch. Yes, data can be the bridge between your ad and the consumer, but without a story—hell, a storyverse—to tie all the bits together, the consumer is never going to experience a cohesive message. They’re never going to connect up all the piecemeal dispatches you send them into a unified whole—a brand storyverse, if you will.

So what I guess I’m saying is don’t end up like the Neanderthals—without a storyverse, every piece of communication your consumer receives is just a shot in the dark, destined to be drowned out by the noise. Now all you have to do is make like the stone age: tell a story, one that’s catchy, emotional, relevant—or your brand, like them hapless Neanderthals, is in danger of dying out.


Download Episode 2 – A short history of stories

Download Episode 1 – Advertising in Tomorrowland

Or, if you missed it, catch up on the entire first season.


We Win Ad Age Agency of the Year…Again!

Mistress has been awarded AdAge Small Agency of the Year for the third time!

Each year, the AdAge Small Agency Awards uncover and honor the independent agencies that are producing groundbreaking work. Ad Age seeks the teams who are strategizing and executing ideas that directly compete with advertising’s oldest, largest, and most sought-after partners.

The 2018 winners were announced from the stage at the Small Agency Awards on July 18 in Los Angeles. Mistress won Silver overall, in the premier award category, earning its third Agency of the Year award in eight years.

Read the full press release to come.

The post We Win Ad Age Agency of the Year…Again! appeared first on Mistress.

Say Hi to Eric, Brittany, and Laura

We’re excited to onboard three new senior Mistresses: Laura Hoffman, Brittany Berman, and Eric Privott.

Laura joins the brand group as Brand Director and will lead a number of accounts including Campbell’s Fresh and Pearson. Laura comes to Mistress by way of 72andSunny LA, where she worked on Starbucks, Tillamook, and Dropbox. Laura loves food, is a burgeoning surfer, carries three pairs of glasses, and tried camping once. In other words, she lives the LA dream.

Eric also joins Mistress as Brand Director and will lead the charge on recently-won iFLY, as well as NOS/Full Throttle energy drinks. As an 11-year Martin Agency vet, he’s worked on brands including GEICO, Walmart, UPS, Purina, and Comcast. Eric is an overall sports enthusiast, particularly baseball, which he played at the collegiate level. As husband and father of two, and new Angeleno, he looks forward to supporting the Dodgers at Chavez Ravine with his family.

Brittany, formerly at Saatchi & Saatchi, is welcomed as a Senior Strategist. Having worked on multiple facets of the Toyota brand, she will apply her background in consumer research and analysis on behalf of Mistress’s Food & Beverage portfolio, as well as ibotta, Walmart, and Campbell’s Fresh. She is also the proud puppy parent of Ryder, the latest (and smallest) Australian Shepherd to join the Mistress dog pack.

“This has been a really big year for Mistress. Not only are we humbled by the caliber of talent we’re attracting, we’re excited by the momentum that continues to build,” says Amir Haque, partner at Mistress. “We see great things on the horizon.”

Check them out in AgencySpy.

The post Say Hi to Eric, Brittany, and Laura appeared first on Mistress.

PRM Academy Update Summer 2018

Having launched the PRM Academy last year we have seen significant uptake and interest in the programme from employees at all levels across our business.

The PRM Academy is a City & Guilds accredited programme that awards participants digital badges once they evidence the necessary knowledge, skills and behaviours. The badges cover all areas of our business, from data protection and client sectors to lead generation and client servicing.

Since November 2017 we have awarded a total of 232 badges, with a further 19 awaiting approval and 17 in progress. Our first graduation will take place in November 2018, where participants who have completed their year-long programme will be awarded the overall Golley Slater PRM Academy completion badge.

Client feedback on our academy has been excellent, with several clients looking to invest in their own academy programmes as well as make the most of the PRM Academy itself. Our plan over the coming year is to develop ‘PRM Academy Plus’, which will give our clients access to the Golley Slater workforce already working on their accounts and move them into field sales roles, having already proven themselves capable of engaging, developing relationships and closing sales with new prospects.

The post PRM Academy Update Summer 2018 appeared first on Golley Slater.