So, what the heck is a tag/pixel anyway? A tag (often called a pixel) is a short snippet of JavaScript code that does something on your website. In the context of marketing/advertising, tags and pixels typically collect information about a visitor to a website and their behavior on the site. This information is then sent back to the respective marketing/advertising platform to be processed and reported.
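As a minimal sketch of the idea, a pixel often boils down to building a URL with the collected data as query parameters and requesting it as a 1x1 image. The endpoint and parameter names below are hypothetical, not any real vendor's API:

```javascript
// Illustrative only: how a tracking pixel typically works under the hood.
function buildPixelUrl(baseUrl, data) {
  // Encode each piece of visitor data as a query parameter.
  const params = new URLSearchParams(data);
  return `${baseUrl}?${params.toString()}`;
}

// In the browser, the snippet would then request the URL as a 1x1 image,
// which transmits the data to the vendor's collection server:
//   new Image().src = buildPixelUrl("https://collect.example.com/p.gif", {...});

const url = buildPixelUrl("https://collect.example.com/p.gif", {
  event: "page_view",
  page: "/products",
});
console.log(url);
```

The real vendor snippets you deploy are more elaborate (they load a library, batch events, set cookies), but this request-with-data-attached pattern is the core mechanism.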
Tags can be inserted manually or automatically. Almost all digital assets, campaigns, and pass-through tags/pixels connecting to other platforms need to be tracked in order to assess performance, attribution, metrics, ROI, spend, and ongoing maintenance (are the tags working as they should?). If you generate or rely on reports built from these metrics, you will want to stay on top of tagging: the methodology, proper implementation, tag redundancy (leftover tags), and of course rock-solid tagging governance, ideally residing with a Digital Center of Excellence. Tags accumulate over time and can slow website performance, adding drag and degrading the user experience. Thus tag governance and regularly cadenced tag audits are a must.
With the explosion of Tag Management Systems in recent years, the technical implementation and management of tags/pixels sit closer than ever to those working in analytics and making decisions from conversion and behavior data.
One of the most important parts of a Tag Governance Policy is the Tag Implementation Process. Below we will explore the ideal workflow, review process, request specifics, and QA practices to ensure not only that your tags stay compliant but also that all data is collected as expected from the moment new tags go live.
Implementation Workflow
The tag implementation process begins with a request to add a new tag. It is important that, regardless of who is requesting the tag be added to the site, several pieces of key information are provided. We’ll cover the necessary information for a request in the second section of this review.
Once the request is submitted with the required information, it is time to review the tag in question. This needs to include a review from the compliance, privacy, and performance perspectives. Once the go ahead is given from each of the required teams, the actual implementation phase can begin.
Tag implementation should follow the general process outlined by the development team for site releases. Oftentimes the tag implementation process can be more dynamic and is not bound to strict release schedules the way site changes are, but a testing and QA process should still be followed. The implementation of tags should have a defined owner, be that an individual in a small organization or a team in a larger one. This is important to ensure tags are not indiscriminately added to the site, leading to governance and performance issues.
For testing, ideally there is a staging/dev environment for each domain. This should contain a unique Tag Management Container instance (either a unique container or a unique environment) specifically for the staging domain. Here the tag can be implemented and tested to ensure that both data is being collected as expected and that there are no unexpected impacts on user experience.
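One common way to keep a unique container per environment is to select the container ID from the page hostname, so staging pages can never load production tags. A hedged sketch; the hostnames and container IDs are placeholders:

```javascript
// Illustrative only: map each environment's hostname to its own TMS container.
const CONTAINERS = {
  "www.example.com": "GTM-PROD123",      // production container
  "staging.example.com": "GTM-STAGE456", // staging/dev container
};

function containerFor(hostname) {
  // Fall back to the staging container so an unknown host never
  // loads production tags by accident.
  return CONTAINERS[hostname] || "GTM-STAGE456";
}

console.log(containerFor("staging.example.com"));
```

Some TMS vendors offer a built-in "environments" feature that achieves the same separation within a single container; either approach works as long as staging traffic is isolated from production data collection.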
Following implementation and testing in the Staging environment, there should be a final review and approval process from the site owner side. Once final approval is granted, the tag can be published to the production environment of the TMS and should be monitored to ensure it continues behaving as required until the tag lifecycle comes to an end.
Required Information for a Tag Request
The most important part of the tag implementation process is obtaining all of the required documentation to review tags and make it easy on those in the implementation role to quickly configure and deploy tags. The foundation of this is the Tag Request Documentation.
Here’s a generic example of the key information to start with:
- Business Information
- Technical Information
Business/Context Information in the Request
- Pixel Group Name: This should be the name of the tag provider. E.g. Facebook Pixel, Google Analytics, Adobe Analytics, Krux, etc.
- Number of Pixels: The number of unique tags included in the request. For many platforms there will be a general all-pages tag but then additional tags for tracking specific events on a site. This will be true for media tags, analytics tags, and many marketing tags. The number of tags included in the request allows the implementation team to better understand the scope and provide accurate estimates for time-to-deploy.
- Brand: Helpful in a multi-brand organization, this allows you to identify all of the sites/domains on which the tag needs to be implemented. Often one tag type will need to be deployed across several sites; capturing the brand lets you consolidate that into a single request.
- Owner: The business owner of the platform associated with the tag. This should be the individual that owns the relationship with the third-party and is responsible for any clarifications necessary when it comes to deployment requirements as well as future updates to the tag/platform.
- Agency: If the request is from a third-party agency (likely in the case of media tagging), the agency name and primary contact at the agency should be included. Similar to the business owner above, this person will be responsible for future questions about the tag implementation and for confirming whether the platform is still needed in the future.
- Business Description: A quick description of what the tag is used for and the business value of implementing the platform. This will help in the general Tag Governance Process to see if there are any opportunities to combine efforts across platforms and reduce the technical debt of tags on the site.
- Tag/Pixel Life Cycle: There must be a start date (the date the tag needs to be deployed) and an end date associated with each tag. Many media tags will only need to be implemented for specific campaigns or initiatives. If there is a known end date for these campaigns, then the tag should be removed at the conclusion. If there is not a defined end date then we recommend having the “end date” set to either 12 or 18 months from the time of implementation. This will force a review of the tag at the defined interval and goes a long way in reducing the risk of legacy tags remaining on the site when they are no longer needed.
- Additional Requirements to Consider:
- Legal/Compliance Approval: Depending upon the system being used to manage tag implementation requests, it might make sense to have a checkbox indicating legal/compliance review and approval of the tag in question. With the privacy environment seemingly becoming stricter every month, this review and approval is imperative for your organization.
- Performance Approval: All tags being implemented on your website should be reviewed for performance impact on the site. At the end of the day, data collection matters little if it hinders the user experience of your site. A checkbox indicating the review and approval of the tag being requested for performance considerations can be added to more easily manage this process.
- Security Approval: As has been seen regularly over the past several years, data breaches are a major issue in digital security. It is possible for malicious code to be introduced to a site via a JavaScript tag. Security should review and approve all tag requests for new platforms being introduced to the site to ensure no malicious activity will take place.
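The lifecycle rule above (a default 12- or 18-month end date when no campaign end date exists) can be computed automatically at request time. A small sketch; the function name and defaults are ours, following this document's recommendation:

```javascript
// Compute a default tag end/review date when none is supplied with the
// request: 12 or 18 months after the implementation (start) date.
function defaultEndDate(startDate, months = 12) {
  const end = new Date(startDate.getTime());
  end.setMonth(end.getMonth() + months); // Date handles year rollover
  return end;
}

const start = new Date(2024, 0, 15); // tag deployed 15 Jan 2024
console.log(defaultEndDate(start, 12).toDateString());
console.log(defaultEndDate(start, 18).toDateString());
```

Storing the computed date with the request record is what makes the periodic review enforceable rather than aspirational.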
Review Process for a New Tag
Before implementation work begins to deploy a new tag to the website, there must be a thorough review process from the legal/compliance, performance, and security teams. In most organizations this review will happen at the platform level for any new technology being onboarded. Once this review is complete, the platform along with the standard tag scripts can be added to the approved list within a Tag Governance Policy. The tags for the associated platforms from that point forward should still go through a light review prior to implementation, but the full review is typically not necessary to maintain efficiencies in the process. All tags in the Governance Policy should be reviewed on an annual basis, as scripts and configurations will often change for various platforms.
Some of the parts to consider in the review process for each team are as follows:
- Legal/Compliance: Here all platforms should be assessed for the data being collected. Is the data considered “Personal Data” under GDPR? If so, are the proper user protections in place? Review any Data Sharing Agreements for usage of data by the third party. Any other industry-specific laws and regulations should also be reviewed to ensure the platform being added is not going to put the organization in a position of risk.
- Performance: The biggest consideration here is if the tag is going to affect user experience in any way, to what extent, and if the performance cost is worth the business benefit realized by use of the platform in question. For this review, the team should assess the load behavior of the tag, if it will block any other requests while executing, the additional weight it adds to a page, etc.
- Security: It is imperative to understand what data is being collected and where it is being sent. From there, you must know how it will be used and what technical measures are in place to ensure the security of users' collected data. The last thing you want is a data breach caused by a marketing or media platform added to your site.
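For the performance review above, the Resource Timing data a browser exposes can be summarized per vendor domain to quantify a tag's cost. A sketch over mock entries; the field names mirror the browser's `PerformanceResourceTiming` entries, but the sample numbers are invented:

```javascript
// Summarize the load cost of one vendor's requests from
// resource-timing-style entries ({ name, duration, transferSize }).
function tagCost(entries, vendorHost) {
  return entries
    .filter((e) => new URL(e.name).hostname === vendorHost)
    .reduce(
      (acc, e) => ({
        requests: acc.requests + 1,
        bytes: acc.bytes + e.transferSize,
        ms: acc.ms + e.duration,
      }),
      { requests: 0, bytes: 0, ms: 0 }
    );
}

// Invented sample data standing in for performance.getEntriesByType("resource"):
const entries = [
  { name: "https://cdn.vendor.com/tag.js", duration: 120, transferSize: 48000 },
  { name: "https://cdn.vendor.com/collect", duration: 35, transferSize: 600 },
  { name: "https://www.example.com/app.js", duration: 80, transferSize: 150000 },
];
console.log(tagCost(entries, "cdn.vendor.com"));
```

Numbers like these give the performance team something concrete to weigh against the business value the tag's owner has described in the request.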
At the conclusion of the review process, each of the involved teams should offer their approval and at that point the platform can be added as an approved vendor in the Tag Governance Policy and future tag requests for the platform can be evaluated via a lighter process simply to ensure no changes to the tags would cause concern in the above teams’ view. This lighter review can be handled by the tag implementation team during QA.
QA Process for Implementation
Following the approval of a new tag to be added to the site, we move into the actual implementation phase. The QA process should follow the framework used by the site development team. This does not mean that tag deployments must be included in the regular site release schedule; one of the main benefits of utilizing a Tag Management System is the ability to be more dynamic than site updates and reduce the time to deployment. Tags should still be reviewed in the development/staging environments just as on-site changes are, and considered while regular site usability testing is happening in those environments.
If you are using a Tag Management System to deploy tags (and if you are not, you should be), we advise clients to have a unique container or environment dedicated to the development/staging environment for all sites. Initial tag configuration should be done here to reduce the risk of any implementation issues affecting user experience and/or data collection. Configure the tag based upon the requirements defined in the technical documentation, publish to the development/staging environment, and test to ensure the tag is firing, the required data is passed, no JavaScript errors are thrown, and user experience is not negatively affected. Using a Tag Auditing Platform such as Tag Inspector can help significantly in this process.
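Part of that staging QA can be automated by asserting that every event the tag is expected to fire actually appears in the data layer. A minimal sketch; the event names and the `dataLayer` array shape follow the common GTM convention, but your setup may differ:

```javascript
// Return the expected events that never appeared in the data layer
// captured during a staging QA session. An empty result means pass.
function missingEvents(dataLayer, expected) {
  const seen = new Set(dataLayer.map((entry) => entry.event));
  return expected.filter((event) => !seen.has(event));
}

// Hypothetical capture from a staging test session:
const dataLayer = [
  { event: "page_view", page: "/checkout" },
  { event: "add_to_cart", sku: "SKU-1" },
];

console.log(missingEvents(dataLayer, ["page_view", "add_to_cart", "purchase"]));
```

A check like this can run as part of the release checklist, so a tag that silently stops firing is caught before the final sign-off rather than weeks later in a report.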
Once the implementation team has published and completed initial QA in staging/development, it is ideal to have someone from the vendor's technical team do a round of testing and provide final approval. This can help clear up any issues and ensure data is collected as expected from the moment the tag goes live.
Following the final approval to deploy, publish the same configuration to the Production environment/container. A final phase that is often overlooked is to then do some final verification and testing in the live environment. Check to make sure the tag is behaving as expected and, finally, sign off on the implementation.
Once fully live, it is important to have a regular review and monitoring process in place so you are aware of any future site changes that affect your tag architecture. Here again, a tool such as Tag Inspector can automate and greatly simplify this process.
Documentation for Tag Governance
Following the above processes will get you a very good start to maintaining your Tag Governance Policy. It is also important to get and organize several pieces of documentation throughout the implementation process to stay on top of Tag Governance. These documents will include any contracts with the vendor associated with the tag, technical documentation about the tag, any Data Sharing Agreements in place, the Tag Implementation Request documentation, and any documentation from the review process of the tag/platform.
Keeping all of these documents organized in one central Tag Governance portal will make future review and audit processes much simpler. You must always have visibility into the platforms on the site, the data they collect, and where that data is sent.
Courtesy: S. Ernest Paul & TagInspector