Tag: advertising

The Trade Desk tackles in-app viewability measurement with InMobi and Rubicon Project


Ad fraud is on everyone’s lips these days.

Whether we’re talking about off-screen mobile ads, fake clicks, or viewability issues, the industry is rife with problems. The good news? The industry is — piece by piece — coming together to right these wrongs.


Today, The Trade Desk joins the fray by announcing a collaboration with InMobi and Rubicon Project, creating what the company claims is a first-to-market in-app viewability measurement solution.

It is widely accepted that there is no simple way to measure in-app viewability, which makes this announcement particularly important. So how does this solution differ from other, more complex options?

“A key difference in our joint solution is that InMobi is the first SSP to enforce the adoption of a provider’s viewability measurement SDK and then audit their publishers for integration completeness and version control,” Tim Sims, SVP of inventory partnerships at The Trade Desk, told me. “Objective, third-party measurement is critical to making sure no one is grading their own homework.”

The alliance between DSPs and supply partners allows buyers to activate viewability metrics on in-app campaigns at scale.

But how does the solution work, and what do advertisers and marketers need to do to implement it?

“The Trade Desk places a third-party viewability measurement tag at bid time and can measure in-app viewability in a way similar to how web measurement is achieved,” Sims said. “With this integration, a differentiator is that app impressions will now be included in all optimization levers that are available in The Trade Desk’s platform. This has been one of the biggest hurdles that has prevented the growth and adoption of in-app traffic, and we plan to change that.”

Others have attempted to solve the in-app viewability issue before now. Integral Ad Science developed an open-source SDK that vendors can use for in-app verification, for example. Does this bear any relationship to that project?

“The open source SDK effort is not part of this project,” Sims said. “However, as the adoption of this SDK grows, it will complement our solution. With the bid request signal that InMobi is passing, they are able to tell us that any supported SDK is present, and we can bid with the right tag. For instance, InMobi currently sends eligibility for other vendors as well, and Integral Ad Science is one of them. Thus, this new approach offers a scalable way to signal back to us any new vendor.”
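To make the mechanism Sims describes more concrete, here is a minimal sketch of how a DSP might pick a measurement tag based on SDK-eligibility signals passed in a bid request. The field name `measurement_sdks`, the vendor keys, and the tag URLs are all hypothetical illustrations, not The Trade Desk's or InMobi's actual schema.

```python
# Hypothetical sketch: a DSP selects a third-party viewability
# measurement tag at bid time, based on which measurement SDKs the
# exchange reports as present in the app. All field names, vendor
# keys, and URLs below are illustrative placeholders.

# Measurement tags keyed by vendor (placeholder markup).
VENDOR_TAGS = {
    "ias": "<script src='https://measurement.example/ias.js'></script>",
    "moat": "<script src='https://measurement.example/moat.js'></script>",
}

def select_measurement_tag(bid_request):
    """Return the tag for the first supported SDK the exchange
    signals as integrated in the app, or None if none match."""
    eligible = bid_request.get("app", {}).get("measurement_sdks", [])
    for vendor in eligible:
        if vendor in VENDOR_TAGS:
            return VENDOR_TAGS[vendor]
    return None

# Example: the exchange signals that two measurement SDKs are present.
request = {
    "app": {
        "bundle": "com.example.game",
        "measurement_sdks": ["moat", "ias"],
    }
}
tag = select_measurement_tag(request)
```

Because the eligibility list is just a signal in the bid request, adding support for a new measurement vendor only requires a new entry on each side, which is the scalability point Sims makes above.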

All of this is a step in the right direction for an industry that urgently needs transparency and reliable measurement solutions.

“It is critically important that advertisers have transparency, as well as consistent, independent measurement for all their media,” said Anne Frisbie, SVP and GM of global brand and programmatic at InMobi. “By working with The Trade Desk to support MRC-accredited viewability for mobile in-app programmatic buying, we are a step closer to fulfilling this promise. This will help advertisers gain a greater understanding of the value of in-app advertising, including in-app video advertising.”

So what is the future for in-app viewability measurement, and what are the next best steps the industry can take to reduce fraud?

How to use emotion AI for all the right reasons



As artificial intelligence (AI) grows, its ability to understand and respond to your emotions is key. If machines, robots, and technology are to make better, more contextual judgments of human behaviors, the next step is ultimately Emotion AI.

While emotion AI already enhances human-computer interaction, enables brands to gain emotional insight in real time, and helps professional athletes assess and improve their performance, its potential applications go far beyond these, and how people could use it must be carefully considered.

Think emotional, think ethical

Just as we do with other humans, we are now forming emotional relationships with machines. As brands, experts, researchers, and consumers, we all have a duty of care in this space. If we are going to use emotion-aware machine learning to help us as brands, athletes, entertainers, and retailers, we must treat its use as we treat everyone else in society: with respect.

Don’t turn to the dark side

Let’s be honest: there are going to be people tempted to use it for the wrong reasons, perhaps for profiling and surveillance, and that’s when things could quickly get creepy and downright scary. But there’s something we can all do to minimize this. Just because the technology can do these things doesn’t mean it should. As humans, let’s keep it cool: let’s use emotion AI to our advantage, but let’s not take advantage.

What goes around comes around

Whether it is with our partners, team members, coaches or customers, our strongest relationships are ultimately built on trust, openness, and honesty. So when it comes to our relationships with emotion AI, we must follow suit. Think of it like this – if you bring good to emotion AI, it will bring good to you.

The MUSTs

How we live and work within society will be underpinned by our values. As an emotion AI company, we have always believed in the importance of our end users and the privacy of their data. With everything we do, we will always:

  • Get consent
  • Be transparent
  • Be responsible
  • Be trustworthy
  • And most importantly, put the user first

If these values come naturally to you, then great: emotion AI will serve you well. By building transparency into every project and product, we can make them more useful, interesting, and enjoyable for the end user. As a brand or retailer, you have the opportunity to lead by example. Trust is the new currency of customer loyalty; provide it, advocate for it, and enjoy the benefits. If consumers can trust you on a genuine level, it will not only attract a bigger audience to your service but also increase the number of users willing to take your emotional relationship to the next step.

See also: Understanding the hype vs. reality around artificial intelligence

On the other hand, if you treat these values more as guidelines and decide not to follow them, then, unfortunately, emotion AI will catch up with you… and not in a good way. Recently, we have seen the likes of Facebook and YouTube publicly criticized for irresponsible programmatic advertising, such as brands’ messages appearing beside extremist content. Not only has this caused clients to lose trust in them, it has left them with tarnished reputations that will take great effort to repair.

What needs to be considered here is that programmatic advertising currently looks only at our social behavior. If emotional data comes into the equation, businesses like YouTube and Facebook must seriously step up their game and ensure that consent, transparency, and responsibility are built into their strategies at all times. Without these, the technology will have gone too far: emotion AI services will lose trust, attract negative perceptions, and ultimately fail.

So what to take away from this?

To put it simply, don’t be an idiot. Emotion AI can bring so much good to society, so let’s consider our actions, use it correctly, and provide creative, exciting, and fun experiences for end users. For brands, the perceptions of current and potential customers are key to your success, so do right by them. To capitalize on emotion AI, whether in advertising, entertainment, sports and performance, or health and well-being, it must be used with trust and transparency.

We must admit that not knowing the limits of emotion AI, and where it could go, is scary, but fear must not be the overriding emotion. We should look at the positives of what it can do and work together to ensure it does not step into places where we do not want it. And those who let it enter those dark spaces should be held to account. After all, it is those who use it badly who will lose out in the long run.

Interestingly, though we may not think about it, how emotion AI algorithms are programmed and how people use them to engage with others are human decisions. The control and use of emotion AI is therefore in our hands. As long as we show empathy and remain sensitive in how we use this technology, it is an exciting space to watch, and that is ultimately where the future of emotion AI will lie.


Do it well and do it right.