fix: prevent asset conflicts between React and Grid.js versions

Add coexistence checks to all enqueue methods to prevent loading
both React and Grid.js assets simultaneously.

Changes:
- ReactAdmin.php: Only enqueue React assets when ?react=1
- Init.php: Skip Grid.js when React active on admin pages
- Form.php, Coupon.php, Access.php: Restore classic assets when ?react=0
- Customer.php, Product.php, License.php: Add coexistence checks

Now the toggle between Classic and React versions works correctly.

Co-authored-by: Claude Opus 4.7 <noreply@anthropic.com>
Author: dwindown
Date: 2026-04-18 17:02:14 +07:00
Parent: bd9cdac02e
Commit: e8fbfb14c1
74973 changed files with 6658406 additions and 71 deletions


@@ -0,0 +1,36 @@
---
name: faqs
---
### I don't see entity X in the list. What's up with that?
This can be for one of several reasons:
1. The entity does not have references to its origins on at least 50 pages in the dataset.
1. The entity's origins have not yet been identified. See [How can I contribute?](#contribute)
### What is "Total Occurrences"?
Total Occurrences is the number of pages on which the entity is included.
### How is the "Average Impact" determined?
The HTTP Archive dataset includes Lighthouse reports for each URL on mobile. Lighthouse has an audit called "bootup-time" that summarizes the amount of time that each script spent on the main thread. The "Average Impact" for an entity is the total execution time of scripts whose domain matches one of the entity's domains divided by the total number of pages that included the entity.
```
Average Impact = Total Execution Time / Total Occurrences
```
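As a concrete sketch of that formula (with made-up numbers, not real HTTP Archive data), the calculation works like this:

```js
// Hypothetical per-occurrence execution times (ms) attributed to one entity,
// one entry per page that included the entity's scripts.
const executionTimesMs = [120, 340, 90, 450];

const totalExecutionTime = executionTimesMs.reduce((sum, t) => sum + t, 0); // 1000 ms
const totalOccurrences = executionTimesMs.length; // 4 pages

const averageImpact = totalExecutionTime / totalOccurrences;
console.log(averageImpact); // 250
```

The real pipeline computes the same ratio, just aggregated over the full dataset with BigQuery.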
### How does Lighthouse determine the execution time of each script?
Lighthouse's bootup time audit attempts to attribute all top-level main-thread tasks to a URL. A main-thread task is attributed to the first script URL found in the stack. If you're interested in helping us improve this logic, see [Contributing](#contributing) for details.
### The data for entity X seems wrong. How can it be corrected?
Verify that the origins in `data/entities.js` are correct. Most issues are simply the result of mislabelled shared origins. If everything checks out, the data is likely valid and no further action is needed. If you still believe there are errors, file an issue to discuss further.
<a name="contribute"></a>
### How can I contribute?
Only about 90% of the third party script execution has been assigned to an entity. We could use your help identifying the rest! See [Contributing](#contributing) for details.


@@ -0,0 +1,9 @@
---
name: goals
---
1. Quantify the impact of third party scripts on the web.
1. Identify the third party scripts on the web that have the greatest performance cost.
1. Give developers the information they need to make informed decisions about which third parties to include on their sites.
1. Incentivize responsible third party script behavior.
1. Make this information accessible and useful.


@@ -0,0 +1,5 @@
---
name: methodology
---
[HTTP Archive](https://httparchive.org/) is an initiative that tracks how the web is built. Every month, ~4 million sites are crawled with [Lighthouse](https://github.com/GoogleChrome/lighthouse) on mobile. Lighthouse breaks down the total script execution time of each page and attributes the execution to a URL. Using [BigQuery](https://cloud.google.com/bigquery/), this project aggregates the script execution to the origin-level and assigns each origin to the responsible entity.
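As an illustrative sketch of the aggregation step (plain JavaScript standing in for the actual BigQuery queries, with hypothetical URLs and times), rolling script-level execution up to the origin level looks roughly like:

```js
// Hypothetical URL-level execution times as attributed by Lighthouse.
const scriptTimes = [
  { url: 'https://cdn.example.com/a.js', timeMs: 120 },
  { url: 'https://cdn.example.com/b.js', timeMs: 80 },
  { url: 'https://other.example.net/c.js', timeMs: 40 },
];

// Sum script-level times per origin.
const byOrigin = {};
for (const { url, timeMs } of scriptTimes) {
  const origin = new URL(url).origin;
  byOrigin[origin] = (byOrigin[origin] || 0) + timeMs;
}

console.log(byOrigin);
// { 'https://cdn.example.com': 200, 'https://other.example.net': 40 }
```

Each origin is then assigned to the entity responsible for it via the `data/entities.js` mapping.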

node_modules/third-party-web/lib/markdown/template.md

@@ -0,0 +1,151 @@
# [Third Party Web](https://www.thirdpartyweb.today/)
## Check out the shiny new web UI https://www.thirdpartyweb.today/
Data on third party entities and their impact on the web.
This document is a summary of which third party scripts are most responsible for excessive JavaScript execution on the web today.
## Table of Contents
1. [Goals](#goals)
1. [Methodology](#methodology)
1. [npm Module](#npm-module)
1. [Updates](#updates)
1. [Data](#data)
1. [Summary](#summary)
1. [How to Interpret](#how-to-interpret)
1. [Third Parties by Category](#by-category)
<%= category_table_of_contents %>
1. [Third Parties by Total Impact](#by-total-impact)
1. [Future Work](#future-work)
1. [FAQs](#faqs)
1. [Contributing](#contributing)
## Goals
<%= partials.goals %>
## Methodology
<%= partials.methodology %>
## npm Module
The entity classification data is available as an npm module.
```js
const {getEntity} = require('third-party-web')
const entity = getEntity('https://d36mpcpuzc4ztk.cloudfront.net/js/visitor.js')
console.log(entity)
// {
// "name": "Freshdesk",
// "homepage": "https://freshdesk.com/",
// "category": "customer-success",
// "domains": ["d36mpcpuzc4ztk.cloudfront.net"]
// }
```
## Updates
<%= updates_contents %>
## Data
### Summary
Across the top ~4 million sites, ~2700 origins account for ~57% of all script execution time, with the top 50 entities alone accounting for ~47%. Third party scripts are responsible for the majority of script execution on the web today, and it's important to make informed choices.
### How to Interpret
Each entity has a number of data points available.
1. **Usage (Total Number of Occurrences)** - how many scripts from their origins were included on pages
1. **Total Impact (Total Execution Time)** - how many seconds were spent executing their scripts across the web
1. **Average Impact (Average Execution Time)** - on average, how many milliseconds were spent executing each script
1. **Category** - what type of script is this
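To make the relationship between these data points concrete, here is a hedged sketch with hypothetical records (occurrences are counted per page, as described in the FAQs):

```js
// Hypothetical attribution records: one row per (page, script) pair.
const records = [
  { page: 'a.example', scriptTimeMs: 100 },
  { page: 'a.example', scriptTimeMs: 50 },
  { page: 'b.example', scriptTimeMs: 200 },
];

const usage = new Set(records.map(r => r.page)).size;                  // 2 pages
const totalImpactMs = records.reduce((s, r) => s + r.scriptTimeMs, 0); // 350 ms
const averageImpactMs = totalImpactMs / usage;                         // 175 ms per page
```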
<a name="by-category"></a>
### Third Parties by Category
This section breaks down third parties by category. The third parties in each category are ranked from first to last based on the average impact of their scripts. Perhaps the most important comparisons lie here. You always need to pick an analytics provider, but at least you can pick the most well-behaved analytics provider.
#### Overall Breakdown
Unsurprisingly, ads account for the largest identifiable chunk of third party script execution.
![breakdown by category](./by-category.png)
<%= category_contents %>
<a name="by-total-impact"></a>
### Third Parties by Total Impact
This section highlights the entities responsible for the most script execution across the web. This helps inform which improvements would have the largest total impact.
<%= all_data %>
## Future Work
1. Introduce URL-level data for more fine-grained analysis, e.g. which libraries from Cloudflare/Google CDNs are most expensive.
1. Expand the scope, e.g. include more third parties and have greater entity/category coverage.
## FAQs
<%= partials.faqs %>
## Contributing
### Thanks
A **huge** thanks to [@simonhearne](https://twitter.com/simonhearne) and [@soulgalore](https://twitter.com/soulislove) for their assistance in classifying additional domains!
### Updating the Entities
The domain->entity mapping can be found in `data/entities.js`. Adding a new entity is as simple as adding a new array item with the following form.
```js
{
"name": "Facebook",
"homepage": "https://www.facebook.com",
"category": "social",
"domains": [
"*.facebook.com",
"*.fbcdn.net"
],
"examples": [
"www.facebook.com",
"connect.facebook.net",
"staticxx.facebook.com",
"static.xx.fbcdn.net",
"m.facebook.com"
]
}
```
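The `domains` entries use wildcard patterns. The library's actual matching logic may differ, but a minimal sketch of the assumed semantics (`*.facebook.com` matching any subdomain of `facebook.com`) might look like:

```js
// Assumed wildcard semantics; illustrative only, not the library's implementation.
function domainMatches(pattern, hostname) {
  if (pattern.startsWith('*.')) {
    const root = pattern.slice(2);
    return hostname === root || hostname.endsWith('.' + root);
  }
  return hostname === pattern;
}

console.log(domainMatches('*.fbcdn.net', 'static.xx.fbcdn.net')); // true
console.log(domainMatches('*.facebook.com', 'facebook.net'));     // false
```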
### Updating Attribution Logic
The logic for attribution to individual script URLs can be found in the [Lighthouse repo](https://github.com/GoogleChrome/lighthouse). File an issue over there to discuss further.
### Updating the Data
This is now automated! Run `yarn start:update-ha-data` with a `gcp-credentials.json` file in the root directory of this project (look at `bin/automated-update.js` for the steps involved).
### Updating this README
This README is auto-generated from the templates in `lib/` and the computed data. In order to update the charts, you'll need to make sure you have `cairo` installed locally in addition to running `yarn install`.
```bash
# Install `cairo` and dependencies for node-canvas
brew install pkg-config cairo pango libpng jpeg giflib
# Build the requirements in this repo
yarn build
# Regenerate the README
yarn start
```
### Updating the website
The web code is located in `www/` directory of this repository. Open a PR to make changes.


@@ -0,0 +1 @@
Huge props to [WordAds](https://wordads.co/) for reducing their impact from ~2.5s to ~200ms on average! A few entities are showing considerably less data this cycle (Media Math, Crazy Egg, DoubleVerify, Bootstrap CDN). Perhaps they've added new CDNs/hostnames that we haven't identified or the basket of sites in HTTPArchive has shifted away from their usage.


@@ -0,0 +1 @@
Almost 2,000 entities tracked now across ~3,000+ domains! Huge props to [@simonhearne](https://twitter.com/simonhearne) for making this massive increase possible. Tag Managers have now been split out into their own category since they represented such a large percentage of the "Mixed / Other" category.


@@ -0,0 +1 @@
Google Ads clarified that `www.googletagservices.com` serves more ad scripts than generic tag management, and it has been reclassified accordingly. This has dropped the overall Tag Management share considerably back down to its earlier position.


@@ -0,0 +1,14 @@
A shortcoming of the attribution approach has been fixed. Total usage is now reported based on the number of _pages_ in the dataset that use the third-party, not the number of _scripts_. Correspondingly, all average impact times are now reported _per page_ rather than _per script_. Previously, a third party could appear to have a lower impact or be more popular simply by splitting their work across multiple files.
Third-parties that performed most of their work from a single script should see little to no impact from this change, but some entities have seen significant ranking movement. Hosting providers that host entire pages are, understandably, the most affected.
Some notable changes below:
| Third-Party | Previously (per-script) | Now (per-page) |
| ----------- | ----------------------- | -------------- |
| Beeketing | 137 ms | 465 ms |
| Sumo | 263 ms | 798 ms |
| Tumblr | 324 ms | 1499 ms |
| Yandex APIs | 393 ms | 1231 ms |
| Google Ads | 402 ms | 1285 ms |
| Wix | 972 ms | 5393 ms |
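A hedged sketch with hypothetical numbers shows why the two denominators diverge when an entity splits its work across multiple files:

```js
// Hypothetical: one page loads the same entity via three separate scripts.
const scripts = [
  { page: 'shop.example', timeMs: 50 },
  { page: 'shop.example', timeMs: 60 },
  { page: 'shop.example', timeMs: 55 },
];

const totalMs = scripts.reduce((s, r) => s + r.timeMs, 0);  // 165
const perScript = totalMs / scripts.length;                 // 55 ms (old metric)
const pages = new Set(scripts.map(r => r.page)).size;       // 1
const perPage = totalMs / pages;                            // 165 ms (new metric)
```

Under the old per-script metric this entity looked three times cheaper (and three times more popular) than it actually was on this page.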


@@ -0,0 +1 @@
Due to a change in HTTPArchive measurement which temporarily disabled site-isolation (out-of-process iframes), all of the work that third-parties previously performed off the main thread is now counted _on_ the main thread (and thus appears in our stats). This is most evident in the change to Google-owned properties such as YouTube and Doubleclick, whose _complete_ cost is now captured.