
What sort of things come to mind when you think about building websites? If you had been asked this question 20 years ago, your answer would have been substantially different from what you’d likely say today. Back then, you might have talked about the separation of stylistic concerns (CSS) from HTML, or even the release of JavaScript. Today, you’re more likely to reference website-building tools, dynamic pages, Single Page Apps, the Shadow DOM, or even Web Components. While it’s worthwhile to look back at the history of the web and see how far we’ve come, it’s equally critical to understand where the future lies and how to be a part of it.

Introducing Web Components

For developers, reusing code wherever possible saves both time and money. With the way websites were built in the past, organizing complex HTML and CSS could get messy and cause problems. The Web Components suite of technologies seeks to solve these problems.

There are three main technologies to consider when talking about Web Components:

  1. Custom elements: A set of JavaScript APIs that let you define custom elements and control their behavior.
  2. HTML templates: The <template> and <slot> elements let you write reusable markup that serves as the foundation of a custom element’s structure.
  3. Shadow DOM: A scoped, hidden DOM tree attached to an element. It enables Web Components to function properly by keeping a component’s internal markup, styles, and behavior encapsulated from the rest of the page. In short, it lets you expose a public API while keeping implementation details private.
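Taken together, these three pieces can be sketched as a minimal custom element. This is an illustrative example only; the <user-card> element name and its slot are hypothetical, not part of any standard:

```javascript
// A reusable template: the foundation of the element's structure.
const template = document.createElement('template');
template.innerHTML = `
  <style>
    /* These styles are scoped to the shadow tree only. */
    .card { border: 1px solid #ccc; padding: 8px; }
  </style>
  <div class="card">
    <slot name="username">Anonymous</slot>
  </div>
`;

class UserCard extends HTMLElement {
  constructor() {
    super();
    // The shadow root encapsulates the markup and styles.
    this.attachShadow({ mode: 'open' })
        .appendChild(template.content.cloneNode(true));
  }
}

// Register the custom element with the browser.
customElements.define('user-card', UserCard);
```

In the page it would then be used like any built-in tag, e.g. `<user-card><span slot="username">Ada</span></user-card>`, with the slotted content rendered inside the shadow tree.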

With these technologies, Web Components can combine JavaScript with reusable templates, import custom elements, and encapsulate both CSS styling and DOM structure.

The Friendly Ghost of the DOM

Casper the Friendly Ghost = Shadow DOM
No, we are not talking about Casper. We’re referencing one of the great technologies associated with Web Components: The Shadow DOM.

The Shadow DOM can be described as a scoped DOM tree within an element, or “a DOM within a DOM.” It enables Web Components to be self-contained on a page. Because of this encapsulation, a page can be composed of individual components that are styled independently and can be manipulated, changed, or created without affecting the other components.
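A quick sketch of that isolation, using a hypothetical <demo-box> element: the same selector inside and outside a shadow root never collides.

```javascript
class DemoBox extends HTMLElement {
  connectedCallback() {
    // Guard so re-attaching the element doesn't try to create
    // a second shadow root (attachShadow may only run once).
    if (this.shadowRoot) return;
    const root = this.attachShadow({ mode: 'open' });
    root.innerHTML = `
      <style>p { color: red; }</style>
      <p>Styled only by the shadow tree</p>
    `;
  }
}
customElements.define('demo-box', DemoBox);

// A page-level rule such as "p { color: blue; }" styles ordinary
// paragraphs, but never reaches the <p> inside <demo-box> -- and
// the red rule above never leaks out to the rest of the page.
```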

Advantages and Disadvantages of Using Web Components

Today, websites are being built with more functionality than ever before, but as we all know, “With great functionality comes even greater responsibility.”

First, to address the great functionality: Web Components place a strong emphasis on Single Page Applications (SPAs), which are built entirely on JavaScript and Web Components. Because these applications are so JavaScript-heavy, they can be challenging to create and often require tedious scrutiny of code and styling. Web Components provide relief because they are reusable and each is isolated in its own Shadow DOM.

With Web Components, engineers no longer need to spend valuable time scrutinizing the same JavaScript over and over, because components are easily replicated and independent of one another. This saves businesses money: components can be reused in other parts of the application, which leads to better and faster builds and deploys.

While all of this is fine and dandy, it is important to remember the great responsibilities that come with using Web Components, particularly around scripting. Because each of these modular components is isolated in its own Shadow DOM, scripting against the page becomes harder: it is no longer as easy as selecting an element and drilling down to the desired section of the page.


The Shadow DOM is somewhat concealed within the DOM, so to select an element inside a Web Component, a script needs multiple steps to descend into each shadow root before it can return the desired element. This means engineers are writing more JavaScript, and your automatic scripting process becomes a little more difficult to manage.
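As a sketch of that multi-step traversal (the <checkout-app> and <payment-form> component names are hypothetical, and this only works for shadow roots attached in `open` mode):

```javascript
// Without shadow DOM, one query would be enough:
//   const button = document.querySelector('#submit');

// With shadow DOM, each component boundary must be crossed
// explicitly through its shadowRoot property.
const app = document.querySelector('checkout-app');
const form = app.shadowRoot.querySelector('payment-form');
const button = form.shadowRoot.querySelector('#submit');
button.click();
```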

The Evergreen Chrome Solution to Aid Automatic Scripting

Worried? Fear not! Google has come up with JavaScript (JS) selectors, a solution that tackles this scripting responsibility. JS selectors are used to select HTML elements based on their tag name, id, classes, types, attributes, attribute values, and much more.

Using JS selectors, available in Chrome DevTools, users can select elements inside the Shadow DOM, which greatly aids the automatic scripting process.
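For instance, right-clicking an element in the DevTools Elements panel and choosing Copy → Copy JS path generates the full selector chain for you, ready to paste. The component names below are illustrative, not from a real page:

```javascript
// What "Copy JS path" produces for an element nested inside
// two open shadow roots -- no hand-written traversal needed:
document.querySelector('checkout-app')
        .shadowRoot.querySelector('payment-form')
        .shadowRoot.querySelector('#submit');
```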

To maintain its evergreen status, Rigor has released JS selector support to make the automatic scripting process less tedious and more functional for our customers. Previously, when creating scripts for SPAs with Web Components, users needed to add an action to execute JavaScript first, then manually fill in the code.

With the release of JS selectors, users only need to copy the JS path for the desired element from Chrome DevTools and paste it into the Rigor platform. The Shadow DOM is no longer in the shadows! Check it out:

[Screenshot: JS Path]

To learn more about how the Rigor platform can painlessly dive deeper into the Shadow DOM, incorporate Web Components into your automatic scripting process, and easily monitor your single page applications, reach out now for a free trial.

Special thanks to contributing writer Gabriel Ferrari.
