This kind of new-physics signature has long been known and searched for: constraints on anomalous couplings of the gauge bosons were set already at LEP. Today, thanks to modern Effective Field Theory (EFT) techniques developed in the past decade, these studies can be organized in a coherent, unified program. The key innovation is that ad-hoc parameterizations and assumptions are replaced by a common theoretical framework: the so-called Standard Model Effective Field Theory (SMEFT). This makes it possible to extend indirect searches to a wide range of processes, and to derive results that are better defined and apply to a vast class of beyond-SM (BSM) scenarios. The adoption of the SMEFT language will also facilitate the re-interpretation of LHC measurements in the future.
In essence, an EFT is a field theory that approximates another one in a specific dynamical regime. The concept is so intuitive that its roots were planted even before Quantum Field Theory was formulated. The earliest example dates back to 1933, when Enrico Fermi proposed a model of beta decays based on a local interaction among four spin-½ particles: a neutron, a proton, an electron and a neutrino, which at the time was just a hypothetical particle. Today we know that this interaction actually involves the nucleons’ fundamental constituents, the quarks, that it is mediated by the W boson, and much more. Fermi’s theory ignored all this, but still proved very successful in describing beta decays. The reason is that the typical energy E exchanged in these processes is always significantly smaller than the W mass: in this limit, the weak interactions effectively become point-like. Effects sensitive to the finer SM structure are suppressed by additional powers of (E/mW) << 1, and can thus be revealed only with high experimental resolution.
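The decoupling at low energies can be made explicit with a schematic tree-level sketch (a textbook illustration, not taken verbatim from the article): expanding the W propagator for momentum transfers q² << mW² collapses the exchange into a point-like four-fermion vertex, with the Fermi constant fixed by the W mass and the weak coupling g:

```latex
\frac{g^2}{m_W^2 - q^2}
\;=\;
\frac{g^2}{m_W^2}\left(1 + \frac{q^2}{m_W^2} + \dots\right),
\qquad
\frac{G_F}{\sqrt{2}} \;=\; \frac{g^2}{8\, m_W^2}.
```

The neglected terms in the expansion are exactly the corrections suppressed by powers of (E/mW) mentioned above.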
At the same time, the breadth of scope of the SMEFT comes at the price of a large number of parameters: the leading deviations from the SM are induced by operators of dimension six, and at this order there are 2499 independent real parameters. Their number can be reduced to around 80 if the flavor structure is maximally simplified. A significant subset of these parameters contributes to many processes involving all sorts of SM particles. Conversely, any given process typically receives corrections from 10-20 different operators, some of which lead to fully degenerate effects.
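Schematically (omitting the single lepton-number-violating operator of dimension five), the SMEFT organizes these corrections as an expansion of the Lagrangian in inverse powers of a heavy new-physics scale Λ:

```latex
\mathcal{L}_{\rm SMEFT}
\;=\;
\mathcal{L}_{\rm SM}
\;+\; \sum_{i} \frac{c_i}{\Lambda^2}\,\mathcal{O}_i^{(6)}
\;+\; \mathcal{O}\!\left(\frac{1}{\Lambda^4}\right),
```

where the Wilson coefficients c_i are the independent real parameters counted above, and the dimension-6 operators O_i are built only out of SM fields.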
This poses a challenge both for the modeling of BSM signals and for the interpretation of experimental measurements. No individual analysis can independently constrain all the EFT parameters contributing to a process of interest; on the other hand, an a priori reduction of the parameter space would inevitably introduce a bias. The preferred strategy is therefore to combine measurements of multiple processes in a global statistical analysis, constraining a large number of degrees of freedom simultaneously.
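As a toy illustration of such a combination (all numbers and the linear response matrix below are invented for the sketch; a real analysis would use simulated EFT predictions and full experimental likelihoods), a linearized global fit reduces to a weighted least-squares problem:

```python
import numpy as np

# Toy sketch, NOT an actual SMEFT fit: in the linearized EFT each
# observable depends on the Wilson coefficients c as
#   O_i = O_i^SM + sum_j A_ij * c_j,
# so a combination of measurements constrains c via a chi^2 fit.

rng = np.random.default_rng(0)

n_obs, n_coeff = 12, 4                  # e.g. 12 measurements, 4 operators
A = rng.normal(size=(n_obs, n_coeff))   # hypothetical linear response matrix
c_true = np.array([0.5, -0.2, 0.0, 1.0])
sigma = 0.1 * np.ones(n_obs)            # measurement uncertainties

# Pseudo-data: SM prediction set to 0 for simplicity, plus EFT shift and noise
data = A @ c_true + rng.normal(scale=sigma)

# Weighted least squares: minimize chi^2 = sum_i ((data - A c)_i / sigma_i)^2
W = A / sigma[:, None]
c_fit, *_ = np.linalg.lstsq(W, data / sigma, rcond=None)

# Covariance of the fitted coefficients: (W^T W)^{-1}
cov = np.linalg.inv(W.T @ W)
print("fitted coefficients:", np.round(c_fit, 2))
print("uncertainties:     ", np.round(np.sqrt(np.diag(cov)), 2))
```

With fewer observables than coefficients, or with degenerate operator effects (linearly dependent columns of A), the fit develops flat directions, which is precisely why single-process analyses cannot pin down all parameters on their own.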
Due to the complexity of the SMEFT, this combination should ideally include measurements from different sectors, for instance electroweak (EW), Higgs and top-quark processes. The more inclusive the global analysis, the more reliable and general its results. This is an unprecedented challenge for the LHC experiments, and it will require, for instance, a set of common definitions for the modeling of both signals and backgrounds, and a consistent treatment of theoretical uncertainties. Last summer, the LPCC established a dedicated working group, the LHC EFT WG, which will operate as a platform for these coordination activities.
The ATLAS and CMS experiments have already begun addressing some of these challenges: an increasing number of analyses of top-quark, Higgs and EW processes report an EFT interpretation of their results. First steps towards combinations within each sector have also been taken: three years ago, for instance, the Top WG released a document with recommendations for SMEFT analyses of top-quark processes.
At present, much effort is being devoted to developing optimal strategies for handling the large number of parameters in the signal modeling and in the statistical analyses. The ATLAS Collaboration has carried out studies for the specific case of Simplified Template Cross Sections (STXS) for Higgs measurements, and last year it presented the results of the first global analysis of this class of observables, using its full Run-2 data set and including 16 SMEFT operators. The same number of degrees of freedom (though of a different nature) has more recently been constrained in the first global analysis of top-quark production processes by the CMS Collaboration.
In the upcoming months, the ATLAS and CMS experiments will start gearing up for their first “super-combination” which, in the long run, could include up to 50-60 free parameters. The LHCb experiment is also likely to take part in this program, contributing in particular with B-physics measurements. It will likely be a long road, along which these analyses will gradually become more sophisticated and more comprehensive. Ultimately, the SMEFT challenge can only be met with a community effort in which theory and experiment join forces to develop the required infrastructure. Many contributions are needed: from the reduction of theoretical uncertainties to the selection of optimal observables, and from matching techniques that connect the SMEFT to BSM models to the implementation of efficient frameworks for simulations and global statistical analyses.
The SMEFT framework also makes it natural to incorporate measurements from non-LHC experiments, at least at the level of a phenomenological study. This has already been done for data collected, for instance, at LEP and the Tevatron. The addition of lower-energy observables, such as electric and magnetic dipole moments (g-2) or meson oscillations and decays, would be crucial for constraining the flavor structure of the SMEFT. New results from upcoming experiments will also be easy to include in the future. In this way, the SMEFT can serve as a flexible theoretical framework for the analysis of HEP measurements and their future re-interpretation.