
Conductive smart hydrogels as battery electrolytes: Promising for lithium, sodium, and zinc-ion chemistries

Hydrogels show promise as battery electrolytes, including for lithium and sodium chemistries, because they are inherently safer than organic alternatives.

From ESS News

Battery research in industry and academia continues to advance ideas in electrodes and electrolytes, covering materials, designs, safety, efficacy, and green credentials. For lithium-ion batteries used in stationary storage, potentially flammable organic electrolytes have been a persistent safety liability, one the industry constantly counters through often complex mitigation efforts and expensive, destructive testing.

A new paper presenting a systematic review of hydrogel research from 2008 to 2025, covering 186 published studies over 17 years, makes the case that conductive hydrogels are a credible electrolyte candidate. The paper notes this is particularly true for flexible and wearable applications, but stationary storage and lithium and sodium chemistries could also benefit. It was published this week in the Journal of Electroanalytical Chemistry by researchers at the University of Limpopo in South Africa.

The safety argument is perhaps the most straightforward: hydrogel electrolytes are water-based, which removes the thermal runaway contribution of conventional organic electrolytes, and their structure means they also do not leak and can self-repair.

While at this stage the commercial aspects are not clear, the performance picture is promising though it varies significantly by chemistry. For lithium-ion, a silicon nanoparticle-polyaniline composite electrode using an in-situ polymerised hydrogel achieved 1,600 mAh/g over 1,000 deep cycles, with 99.8% average coulombic efficiency from the second cycle onward. First-cycle efficiency sat around 70%, a known issue for silicon anodes.
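For readers unfamiliar with the metric, coulombic efficiency is simply the ratio of discharge capacity to charge capacity in a given cycle. A short sketch with illustrative charge capacities (chosen to reproduce the reported percentages, not taken from the paper) shows how the ~70% first-cycle and 99.8% steady-state figures arise:

```python
# Coulombic efficiency (CE) is the ratio of discharge capacity to charge
# capacity in a given cycle. The charge capacities below are illustrative
# values, not data from the paper.

def coulombic_efficiency(discharge_mah_g: float, charge_mah_g: float) -> float:
    """Return CE as a percentage for one cycle."""
    return 100.0 * discharge_mah_g / charge_mah_g

# Silicon anodes typically show a low first-cycle CE because SEI formation
# consumes lithium, then settle near 100% in later cycles.
first_cycle = coulombic_efficiency(1600.0, 2285.0)   # ~70%
later_cycle = coulombic_efficiency(1600.0, 1603.2)   # ~99.8%
print(f"first cycle: {first_cycle:.1f}%, later cycles: {later_cycle:.1f}%")
```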

To continue reading, please visit our ESS News website.

  •  

All emerging cyber threats targeting power infrastructure at a glance

Researchers in Morocco analyzed cybersecurity challenges in smart grids, highlighting AI-driven detection and defense strategies against threats such as distributed denial-of-service, false data injection, replay, and IoT-based attacks. They recommend multi-layered protections, real-time anomaly detection, secure IoT devices, and staff training to enhance resilience and safeguard power system operations.

Researchers at Morocco's Higher School of Technology, Moulay Ismail University, have conducted a comprehensive analysis of emerging cybersecurity challenges in power systems and detailed recent advances in detection and defense strategies.

Their work emphasizes the growing role of AI in enhancing control, protection, and resilience in modern smart grids. It also classifies cyber threats by origin, impact, and affected system layers to provide a structured understanding and reviews machine learning and optimization-based intrusion detection systems (IDSs) for power systems.

The researchers highlighted that renewable smart grids face diverse cyber threats that can disrupt operations and compromise data. Distributed denial-of-service (DDoS) attacks, for example, flood networks with traffic, blocking legitimate access and delaying control actions, while data integrity attacks manipulate sensor or control data, causing incorrect decisions or blackouts.

Additionally, replay attacks retransmit intercepted data to confuse the system, and false data injection attacks subtly alter real-time data to mimic normal operations while disrupting the grid. Covert attacks inject hidden signals that manipulate system behavior without detection, whereas IoT device-based attacks exploit vulnerabilities in meters or sensors to spread malware, steal data, or launch DoS attacks.

Finally, zero dynamics attacks leverage system models to generate hidden signals that leave output measurements unchanged but affect operations, posing sophisticated stealth threats to smart grid security.

 Do you want to strengthen and enhance the cyber security of your solar energy assets to safeguard them against emerging threats?

Join us on Apr. 29 for pv magazine Webinar+ | Decoding the first massive cyberattack on Europe’s solar energy infrastructure – The Poland case and lessons learned

The researchers warned that while smart grids have improved energy efficiency and flexibility through advanced communication tools and distributed energy sources, they have also introduced new cyber vulnerabilities. Threats such as phishing, malware, denial-of-service (DoS) attacks, and false data injection (FDI) can disrupt operations, compromise data, and damage infrastructure.

They recommend implementing defense strategies that maintain confidentiality, integrity, and availability, while also incorporating authentication, authorization, privacy, and reliability. Machine learning and data-driven intrusion detection systems can help identify anomalies and detect FDI attacks in real time, particularly in smart grids and industrial control systems such as SCADA, which rely on accurate sensor measurements for state estimation.
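As a minimal illustration of the kind of real-time, data-driven anomaly detection described here (a toy statistical stand-in, not any specific model from the review), a rolling z-score over a sensor stream can flag readings that break sharply with recent history, the crudest signature of a false data injection:

```python
from collections import deque
from statistics import mean, stdev

class RollingZScoreDetector:
    """Flag sensor readings that deviate sharply from recent history.
    A toy stand-in for the data-driven anomaly detectors the review surveys."""

    def __init__(self, window: int = 50, threshold: float = 4.0):
        self.history = deque(maxlen=window)  # recent readings only
        self.threshold = threshold           # z-score cutoff

    def update(self, reading: float) -> bool:
        """Return True if the reading looks anomalous versus recent history."""
        anomalous = False
        if len(self.history) >= 10:  # wait for a minimal baseline
            mu, sigma = mean(self.history), stdev(self.history)
            if sigma > 0 and abs(reading - mu) / sigma > self.threshold:
                anomalous = True
        self.history.append(reading)
        return anomalous

detector = RollingZScoreDetector()
# 60 normal readings around 50, then one spoofed spike
stream = [50.0 + 0.1 * (i % 5) for i in range(60)] + [80.0]
flags = [detector.update(x) for x in stream]
print("anomaly at index:", flags.index(True))  # flags[60] marks the spoofed value
```

Production IDSs layer far richer features (protocol state, command sequences, cross-sensor consistency) on the same basic idea of modeling normal behavior and scoring deviations.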

The research team also encouraged energy asset owners and grid operators to adopt substation security measures and protocol vulnerability analyses to detect risks at the hardware and network levels. Blockchain, distributed ledgers, and Hilbert-Huang transform methods are highlighted as tools to further strengthen cybersecurity.

IoT devices, including sensors and smart meters, should be secured with strong authentication, safe boot procedures, frequent firmware updates, and standardized security across manufacturers. Sensitive grid data should be protected using techniques such as homomorphic encryption to maintain confidentiality during storage and transmission.

“A multi-tiered security approach that includes firewalls, intrusion detection systems, and network segmentation can enhance grid resilience. Extracting critical elements from vulnerable IoT devices and leveraging redundant control channels ensures operational continuity during attacks,” the researchers stated.

Machine learning and anomaly detection systems should be deployed to enable real-time identification of irregular activities, including FDI and malware propagation. Standardized protocols and rapid incident response measures should also support collaboration among grid operators, IoT manufacturers, and regulators, facilitated by information-sharing platforms.

The researchers emphasize that human-centered attacks, including phishing and social engineering, remain significant threats, but these can be mitigated through regular staff and user training.

The review was presented in “Cybersecurity challenges and defense strategies for next-generation power systems,” published in Cyber-Physical Energy Systems.


  •  

The impact of annealing on copper-plated heterojunction solar cells

A UNSW-led team found that annealing conditions significantly affect stress, strain, and microstructure in copper-plated heterojunction solar cell contacts, with fast annealing increasing microstrain in both copper and indium tin oxide.

A team of scientists led by Australia's University of New South Wales (UNSW) has studied how stress and strain evolve in copper (Cu)-plated contacts on heterojunction (HJT) solar cells under various annealing conditions. Their work specifically examined how annealing affects the material properties of Cu, indium tin oxide (ITO), and silicon (Si).

“We applied multiple characterization methods to understand how annealing conditions influence stress and strain in Cu-plated HJT cells,” co-author Pei-Chieh Hsiao told pv magazine. “Our results show that Cu contacts on HJT cells need careful assessment to balance adhesion with mechanical integrity.”

Hsiao highlighted the importance of controlling the microscopic structure of copper contacts to limit mechanical stress in HJT solar cells. “Ideally, plated Cu with a low defect density and (100) crystal texture is preferred,” he explained. “This reduces stress in Si after annealing because of a lower Young’s modulus. The preferred texture can be achieved by adjusting the electrolyte or plating parameters, and annealing can then be optimized to minimize thermal strain while preserving the (100) orientation.”

The team began with silicon heterojunction G12 half-cut n-type precursors measuring 210 mm × 105 mm. The cells were coated with a resin-based mask to restrict copper plating, with selective openings created via a collimated light source. Copper was then plated onto the exposed ITO surface using an acid-based electroplating solution at a current density of 42 mA/cm².

The team compared three annealing methods. In self-annealing, samples were stored at room temperature in a low-humidity environment. Fast annealing (same day) was carried out in compressed dry air at 205 ± 5 C for 45 seconds under approximately 15 suns of illumination. Fast annealing (next day) used the same conditions but was performed roughly 24 hours after plating.

Cross-sectional focused ion beam (FIB) image of a Cu-plated contact on an HJT cell after self-annealing.

Image: University of New South Wales, Sydney, Solar Energy Materials and Solar Cells, CC BY 4.0

“Due to the limitation of low temperature processing of HJT cells, fast annealing was performed at 200 C, which is lower than the grain growth stage at over 250 C,” Hsiao said. “It means that annealing of plated Cu contacts on HJT cells would perform distinctly from that on PERC or TOPCon cells, where higher annealing temperatures are permitted and improved contact adhesion has been demonstrated.”

The team then examined the samples in a series of tests. First, nanoindentation was used to measure the mechanical strength and stiffness of the plated copper. Second, X-ray diffraction (XRD) was used to examine the crystal structure of the copper and the underlying ITO layer. Finally, Raman spectroscopy was used to map the mechanical stress induced by the copper contacts in the silicon, especially near the contact edges.

The analysis found no significant differences in the yield strength or plastic response of plated Cu, consistent with the comparable Cu grain sizes. XRD patterns showed that fast annealing reduced the Cu lattice parameter and promoted grain growth in the Cu (200) crystallographic orientation, while simultaneously increasing the ITO lattice parameter and full width at half maximum (FWHM).

As a result, microstrains in both Cu and ITO rose under rapid annealing, with the scientists noting that Raman spectroscopy revealed approximately 2 μm-wide regions of high local stress in the silicon along the plated Cu fingers, with stress being lower in self-annealed Cu and higher in fast-annealed Cu.

These results indicate that minimizing defects and promoting a preferential (100) texture in plated Cu can reduce stress transfer to Si and ITO. Maintaining uniform plating conditions and careful surface preparation are also essential for achieving optimal texture and adhesion. Overall, self-annealing is preferred when comparable contact adhesion can be achieved, as it preserves the (100) orientation and minimizes thermal strain.

The research work was described in “Stress and strain analysis of Cu plated contacts on HJT cells under different annealing conditions,” published in Solar Energy Materials and Solar Cells. Scientists from Australia's University of New South Wales and technology company SunDrive Solar have contributed to the research.

In early January, a research team from UNSW and Chinese-Canadian solar module maker Canadian Solar investigated how HJT solar cells are affected by sodium (Na) and moisture degradation under accelerated damp-heat testing, finding that most degradation modes predominantly affect the cells themselves, making cell-level testing the preferred approach.

A month later, another UNSW team assessed the impact of soldering flux on HJT solar cells and found that the composition of this component is key to preventing major cracks and significant peeling.

  •  

New Time targets large-scale perovskite production in Italy

New Time has outlined a four-year roadmap to industrialize perovskite solar cells in Italy, with pilot production planned within three years and full-scale output to follow.

From pv magazine Italia

New Time has outlined plans to industrialize perovskite PV production in Italy, following a two-day strategic meeting in Forlì to advance the project in the Emilia-Romagna region.

The company said the roadmap to commercialization is structured in four phases. The first year will focus on optimizing the perovskite formulation and identifying stabilizing materials. In the second year, the company plans to begin small-scale production for certification purposes.

The third phase will center on developing an industrial solution for large-scale manufacturing, followed by the start of full-scale production in the fourth year. New Time said pilot-scale production with stabilized processes is expected within three years, with large-scale output targeted within four years.

To support the rollout, the company plans to allocate existing industrial facilities to the project, backed by dedicated internal investment. It said funding is already underway and is being sourced through reinvestment of company profits into innovation and research and development.

New Time said current pricing for perovskite PV modules remains influenced by the lack of optimized production processes and ongoing material selection. The project aims to improve cost competitiveness with existing PV technologies while maintaining strong potential for gains in performance and efficiency.

The Forlì meeting, held over two days starting March 31, focused on defining the operational phases of the project and establishing how expertise and technologies will be shared. Participants included researchers from Italy and the Netherlands, including representatives from the Italian National Research Council (CNR), the University of Bari Aldo Moro, and Delft University of Technology.

  •  

How much agrivoltaic shading is enough

Spanish researchers found that semi-transparent silicon PV greenhouses boosted tomato fruit weight by 25% while generating 726.8 kWh over two seasons, outperforming cadmium telluride PV and shaded controls. The PV-Si system balanced sunlight, temperature, and energy, showing strong agrivoltaic potential.

Researchers led by Spain’s Murcian Institute for Agricultural and Environmental Research and Development (IMIDA) have evaluated the impact of different agrivoltaic system designs on tomato crops to determine the level of shading that benefits the plants most.

“The use of four independent, identical greenhouses enables a robust assessment of their respective impacts on microclimate, crop performance, and energy generation,” the team said. “Specifically, the study aimed to evaluate the agronomic and energy performance of two commercially available semi-transparent PV technologies, with distinct light transmission patterns, in comparison with control and shading-net treatments.”

The researchers tested a semi-transparent monocrystalline silicon (PV-Si) greenhouse and a cadmium telluride thin-film (PV-TF) greenhouse against a control greenhouse and one with a shading net.

The study took place in Murcia, Spain, over two tomato-growing seasons: a 120-day winter-spring season from December 2023 to April 2024, and a 98-day spring-summer season from April to July 2024. Murcia’s semi-arid Mediterranean climate features average summer and winter temperatures of 30 C and 12 C, respectively. In both seasons, the team used polyethylene greenhouses measuring 3.9 m long × 2 m wide × 3.1 m high.

Materials under evaluation were installed on the roof and south façade of each greenhouse. The control greenhouse used only the standard polyethylene film, while the shading-control greenhouse added a shading net to selected zones. One solar greenhouse featured monofacial silicon PV modules with 50% transparency, and the other used cadmium telluride (CdTe) modules, also at 50% transparency. Each solar greenhouse had 18 modules—half on the roof, half on the façade—with nominal powers of 59 W for PV-Si and 40 W for PV-TF.

“The microclimatic conditions inside each pilot greenhouse were monitored at two-minute intervals. Measurements included air temperature, relative humidity, solar irradiance, and photosynthetically active radiation,” the team explained. “Additionally, soil temperature and humidity were measured at five-minute intervals at depths ranging from 10 to 60 cm in 10 cm increments.”

The testing showed that the PV-Si technology generated an average daily energy output of 3.92 kWh in winter-spring and 4.07 kWh in spring-summer. PV-TF, meanwhile, produced 2.58 kWh and 2.79 kWh, respectively. Total energy generation across both seasons reached 726.8 kWh for PV-Si and 488.4 kWh for PV-TF.

Daily light integral (DLI), representing total photosynthetically active light received by plants each day, averaged 18.1 mol m⁻² in winter-spring and 25.4 mol m⁻² in spring-summer in the Si greenhouse. In the TF greenhouse, DLI averaged 10.8 mol m⁻² and 17 mol m⁻², respectively.
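DLI is the integral of photosynthetically active photon flux over a day: average PPFD in μmol m⁻² s⁻¹ multiplied by daylight seconds, divided by 10⁶. A quick sketch of the unit conversion, with illustrative inputs rather than the study's measurements:

```python
def daily_light_integral(ppfd_umol_m2_s: float, daylight_hours: float) -> float:
    """Convert average PPFD (umol m^-2 s^-1) to DLI (mol m^-2 day^-1)."""
    return ppfd_umol_m2_s * daylight_hours * 3600 / 1e6

# Illustrative values (not from the study): an average PPFD of 500 umol
# over 10 daylight hours gives a DLI of 18 mol m^-2, roughly the
# winter-spring figure reported for the PV-Si greenhouse.
print(round(daily_light_integral(500, 10), 1))  # 18.0
```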

“During the winter-spring cycle, only the control and PV-Si greenhouses maintained DLI values above the minimum threshold required for optimal crop development,” the researchers reported. “Despite a similar number of fruits, the PV-Si greenhouse produced fruits with a mean weight 25% higher than the control, attributed to more favorable nighttime air temperatures and higher soil moisture.”

In winter-spring, the Si greenhouse yielded 21 fruits with an average weight of 74 g, while the TF greenhouse produced 18 fruits averaging 50 g. During spring-summer, the Si greenhouse produced 30 fruits averaging 93 g, compared with 23 fruits at 79 g in the TF greenhouse.

“Overall, the PV-Si system effectively balanced solar radiation management, thermal regulation, and energy production, demonstrating its potential as a suitable technology for agrivoltaic applications,” the team concluded.

The research findings were presented in “Comparative evaluation of semi-transparent monocrystalline silicon and cadmium telluride photovoltaics for tomato cultivation in Mediterranean agrivoltaic greenhouses,” published in Smart Agricultural Technology. Researchers from Spain’s IMIDA, Miguel Hernández University of Elche, and Italy’s University of Bari Aldo Moro have contributed to the study.

  •  

Agratas advances 20 GWh battery plant in western India

Agratas, the global battery business of the Tata Group, has completed the steel frame of its Sanand facility in India, with production expected to begin in 2027.

From pv magazine India

Agratas, the global battery business of the Tata Group, has completed the steel frame of its Sanand battery manufacturing facility in India, marking a key milestone toward operational readiness. The company expects production to commence in 2027.

The first phase of the project is designed for an annual production capacity of 20 GWh. Once operational, the facility will manufacture advanced battery cells for electric vehicles and energy storage applications.

Deepak Khare, vice president of manufacturing operations at Agratas, said completing the steel frame represents an important step in the company’s progress toward operational readiness. He said the focus now is on developing systems, processes, and capabilities to deliver batteries manufactured in India for global markets, while also building a skilled workforce to support safe and high-quality production.

The steel structure spans 700 meters in length and 150 meters in width, reaching a maximum height of 34 meters and covering a built-up area of 105,000 square meters. More than 24,000 tonnes of steel have been used in the main structure, while associated buildings are being developed in parallel. At peak construction, more than 2,500 skilled workers were active on-site simultaneously.

The project is being executed by Tata Projects Ltd. in collaboration with Tata Consulting Engineers and multiple steel contractors.

  •  

Bauer Solar launches 480 W back-contact solar module

The German manufacturer said its new back-contact solar panel has a power conversion efficiency of up to 23.52%.

From pv magazine Germany

German module manufacturer Bauer Solar is expanding its product portfolio with a new back-contact panel.

Initially, it will launch a full-black glass-glass version with an output of 480 W. It is built on 108 bifacial half-cells and measures 1,800 mm × 1,134 mm × 30 mm, with a listed weight of 24.8 kg. The module's power conversion efficiency is 23.52%.

The company said that both the front and rear glass panes are 2 mm thick and feature anti-reflective coatings. The frame is made of anodized black aluminum alloy. The modules are rated for operating temperatures from –40 C to 85 C and a maximum system voltage of 1,500 V. They can reportedly withstand static loads up to 5,400 Pa and carry a hail resistance rating of HW3. Certifications include fire protection class A.

Bauer Solar is offering a 30-year product and performance warranty on the new modules. The linear performance warranty guarantees a minimum output of 88.85% of the original capacity after 30 years. The company also plans to increase the output of its back-contact modules to 500 W later this year with the “Pure” and “Performance” variants.
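As a back-of-envelope check on what the linear warranty implies, the guaranteed floor can be modeled as a straight line from 100% to 88.85% over 30 years. This is a simplification: real warranties often allow a larger first-year step, which this sketch ignores.

```python
def guaranteed_output(year: int, initial_pct: float = 100.0,
                      final_pct: float = 88.85, term_years: int = 30) -> float:
    """Guaranteed output (% of nameplate) under a purely linear warranty.
    Simplified: real warranties often permit a larger first-year step."""
    annual_loss = (initial_pct - final_pct) / term_years
    return initial_pct - annual_loss * year

print(round(guaranteed_output(30), 2))  # 88.85
print(round((100.0 - 88.85) / 30, 3))   # ~0.372 percentage points lost per year
```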

Alongside back-contact modules, Bauer Solar will continue to focus on its TOPCon technology. This portfolio will be expanded this summer with the “Pure” and “Black” variants, which will reach an output of 465 W. While the company did not disclose pricing, it emphasized that the modules are aimed at the residential rooftop solar market as “an economical solution with an optimal price-performance ratio.”

  •  

New intrusion detection systems boost protection of SCADA systems against cyber threats

An international research team developed two deep learning-based IDS models to enhance cybersecurity in SCADA systems. The hybrid approach reportedly improves detection of complex and novel cyber threats with high accuracy, adaptability, and efficiency, outperforming traditional methods across multiple datasets.

A Saudi-British research team has developed two new deep learning-based intrusion detection systems (IDSs) that reportedly improve the cybersecurity of SCADA networks.

In large-scale solar power plants, SCADA systems play a vital role by overseeing energy generation, monitoring the performance of solar panels, optimizing output, identifying potential faults, and maintaining smooth overall operations. In essence, they act as the central system that converts raw solar data into practical control decisions, ensuring the plant operates safely, efficiently, and profitably.

The scientists explained that current cybersecurity frameworks are often inadequate for SCADA systems because they cannot fully cope with the complexity and constantly evolving nature of modern cyber threats. Most existing approaches rely on signature-based detection, which depends on prior knowledge of attack patterns and therefore fails to detect zero-day exploits or novel intrusion techniques.

To address this limitation, the researchers turned to deep learning methods, as these techniques can process large volumes of data, identify complex patterns, and enable more proactive threat detection.

“Such capability of handling and analyzing big data is particularly useful during scenarios when SCADA systems are generating huge streams of real-time data, including sensor readings, control commands, and other system logs,” they explained. “Furthermore, deep learning methods, especially convolutional neural networks (CNNs) and recurrent neural networks (RNNs), have shown outstanding performances in the detection of complex attack scenarios with sequential or spatial patterns in data.”


The proposed approach integrates two new IDSs, named the Spike Encoding Adaptive Regulation Kernel (SPARK) and the Scented Alpine Descent (SAD) algorithm. By leveraging their complementary strengths, the method reportedly improves spike-threshold accuracy while enhancing adaptability and robustness under dynamic conditions.

The SPARK model introduces adaptive spike encoding by dynamically adjusting thresholds based on input signal characteristics. It uses advanced statistical methods to respond to variations in neural input, improving sensitivity to changes in intensity and frequency. By integrating both temporal and spatial features, SPARK enhances information encoding, especially for complex datasets. Unlike traditional fixed-threshold methods, it provides context-aware thresholding, improving accuracy and reliability.
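The published SPARK update rule is not reproduced here, but the general idea of context-aware thresholding can be sketched as delta-modulation spike encoding whose threshold tracks the recent variability of the input, rather than staying fixed:

```python
import statistics

def adaptive_spike_encode(signal, k=2.0, window=20):
    """Delta-modulation spike encoding with a threshold that adapts to the
    recent magnitude of change in the input. An illustrative sketch of
    context-aware thresholding, not the published SPARK algorithm."""
    spikes, last, deltas = [], signal[0], []
    for x in signal[1:]:
        delta = x - last
        deltas.append(abs(delta))
        # threshold tracks k times the mean recent magnitude of change
        thresh = k * statistics.mean(deltas[-window:])
        if abs(delta) > thresh:
            spikes.append(1 if delta > 0 else -1)
            last = x  # only update the reference on an emitted spike
        else:
            spikes.append(0)
    return spikes

signal = [0.0, 0.1, 0.05, 0.12, 5.0, 5.1, 5.05]  # abrupt jump at index 4
print(adaptive_spike_encode(signal))  # [0, 0, 0, 1, 0, 0]
```

Because the threshold scales with recent activity, small fluctuations around a steady level produce no spikes, while a step change well outside that range does, which is the "context-aware" behavior the fixed-threshold baseline lacks.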

The SAD algorithm complements SPARK by offering an optimization strategy inspired by olfactory navigation (the process by which animals use odor cues to locate food, mates, or home) and Lévy flight behavior (a random search strategy observed in many animal species seeking a target in an unknown environment). This purportedly enables efficient exploration of solution spaces and avoids local minima, ensuring optimal threshold selection.
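The Lévy flight component can be illustrated generically. The sketch below draws heavy-tailed steps with Mantegna's algorithm, the standard generator for such steps in metaheuristics; it illustrates the search behavior SAD is said to draw on, not the SAD algorithm itself:

```python
import math
import random

def levy_step(beta: float = 1.5) -> float:
    """Draw one Levy-flight step via Mantegna's algorithm: mostly small
    moves punctuated by occasional long jumps. Generic sketch of the
    behavior SAD reportedly draws on, not SAD itself."""
    sigma_u = (math.gamma(1 + beta) * math.sin(math.pi * beta / 2) /
               (math.gamma((1 + beta) / 2) * beta *
                2 ** ((beta - 1) / 2))) ** (1 / beta)
    u = random.gauss(0, sigma_u)
    v = random.gauss(0, 1)
    return u / abs(v) ** (1 / beta)

random.seed(0)  # deterministic demo
steps = [levy_step() for _ in range(1000)]
# heavy tail: most steps are small, a few are very large
small = sum(1 for s in steps if abs(s) < 1)
print(f"{small} of 1000 steps have |step| < 1; "
      f"max |step| = {max(map(abs, steps)):.1f}")
```

The occasional long jumps are what let a Lévy-flight search escape local minima that a purely local (Gaussian) random walk would get stuck in.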

According to the scientists, the hybrid approach can dynamically adjust and optimize spike thresholds simultaneously, surpassing conventional static or isolated approaches. They noted that the SPARK model is well-suited for SCADA and IoT systems due to its scalability, real-time adaptability, and efficient data handling. Additionally, its lightweight design reduces computational overhead and false positives, making it effective for resource-constrained environments.

“SAD is complementary to SPARK in the sense that it focuses on improving the detection accuracy while maintaining computational efficiency,” the researchers emphasized. “SAD's anomaly scoring mechanism can be integrated into this framework to add another layer of detection, which can run parallel with SPARK. In effect, integrating the deep learning models into the scoring mechanism means that SAD would enable a much more fine-grained analysis of attack patterns with little noticeable impact on performance for the SCADA system in question.”

The researchers used multiple benchmark datasets to evaluate SCADA intrusion detection performance, including the Secure Water Treatment (SWaT) testbed, Gas Pipeline, WUSTL-IIoT, and Electra. These datasets capture diverse industrial environments, attack types, and operational conditions, enabling comprehensive testing. They also include time-series sensor data, actuator commands, and labeled attack scenarios such as denial-of-service (DoS), distributed denial-of-service (DDoS), malware, and injection attacks.

The diversity of datasets ensured accurate modeling of both normal behavior and complex anomalies in SCADA and IIoT systems, according to the research team. Standardized preprocessing, training, and evaluation procedures also enabled comparison across all tested models. Cross-validation and controlled training conditions, meanwhile, reportedly prevented bias and ensured reliable generalization results. Visualization tools such as histograms, loss curves, and confusion matrices provided insights into model behavior and anomaly detection.

The SPARK model was found to consistently demonstrate “superior” performance, achieving high accuracy, precision, and recall across datasets. It outperformed traditional machine learning and deep learning approaches in detecting diverse intrusion types.

“The findings underline, in summary, that the SPARK and SAD models are basically the final frontier in modern intrusion detection,” the scientists said. “Distinctly designed to provide improved detection capabilities and operational efficiency, the two designs also chart a way into more resilient and intelligent security solutions for modern industrial controlled systems (ICSs) and Internet-of-Things (IoT) networks.”

The novel IDSs have been presented in “SPARK and SAD: Leading-edge deep learning frameworks for robust and effective intrusion detection in SCADA systems,” published in the International Journal of Critical Infrastructure Protection. The research team comprised academics from Leeds Beckett University in the United Kingdom and King Abdulaziz University in Saudi Arabia.

  •  

Copper, indium, selenium micro-islands pave the way for next-gen micro-concentrator solar cells

A German research team has developed CuInSe₂ micro-concentrator solar cells using laser-assisted metal-organic chemical vapor deposition to grow indium islands directly on molybdenum-coated glass, forming absorber arrays without masks or patterning. The not-yet-optimized micro-modules achieved up to 0.65% efficiency under one sun, with gains of up to 250% under concentrated illumination.

A research team in Germany has developed a copper, indium, selenium (CuInSe₂) micro-concentrator solar device composed of vertically grown absorber islands on a molybdenum (Mo) film.

The scientists used laser-assisted metal-organic chemical vapor deposition (LA-MOCVD) to grow indium (In) islands in a bottom-up approach, instead of depositing a continuous thin film and subsequently patterning it. “The primary novelty of our work is the use of a LA-MOCVD method for the bottom-up growth of indium precursor islands,” corresponding author Jan Berger told pv magazine. “This approach proved to be a fast and reliable technique for simultaneous local growth, importantly offering the possibility to add gallium and copper locally using the same method.”

“The most unexpected finding was that the indium precursor islands formed distinct cluster structures that remained pinned in place, refusing to coalesce into a single large island – even after annealing above the melting temperature of indium,” he added. “Furthermore, it was surprising to see that the structural features of these precursor islands remained clearly visible even after the selenization process.”

Device fabrication begins with glass substrates coated with Mo, which are then processed by LA-MOCVD. In this step, a laser array locally heats the substrate. It decomposes the precursor gas only at defined spots, forming a 7 × 7 array of indium islands without the need for masks or patterning. A thin copper layer is subsequently deposited, and the stack is selenized to form CuInSe₂ absorber islands.

Parameters of the micro-modules as a function of light concentration

Image: Universität Duisburg-Essen (UDE), Solar Energy Materials and Solar Cells, CC BY 4.0

Afterward, the samples are etched to remove unwanted material, coated with photoresist for electrical isolation, and patterned with a laser to form openings. The solar cell is then completed by depositing a cadmium sulfide (CdS) buffer layer, followed by intrinsic zinc oxide (i-ZnO) and aluminum-doped zinc oxide (AZO) window layers. Finally, each array of 49 micro-cells is contacted and measured as a single module, with a device structure of glass/Mo/CIS/CdS/i-ZnO/AZO.

Overall, the team produced nine micro-modules and tested four of them. Initial measurements were conducted under one sun, followed by increasing intensities up to 17 suns to simulate concentrator conditions. These not-yet-optimized arrays achieved a conversion efficiency of up to 0.65% under one sun, with efficiency rising under higher illumination—gains of around 60% at lower concentrations and up to 250% at 17 suns.
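To make the relative gains concrete: a 250% gain means 3.5 times the one-sun value. A one-line sketch of the arithmetic:

```python
def efficiency_at_concentration(base_eff_pct: float, gain_pct: float) -> float:
    """Absolute efficiency implied by a relative gain over the one-sun value.
    A '250% gain' means 3.5x the baseline."""
    return base_eff_pct * (1 + gain_pct / 100.0)

# 0.65% at one sun with a 250% gain implies roughly 2.3% absolute at ~17 suns;
# a 60% gain at lower concentrations implies roughly 1.0%.
print(efficiency_at_concentration(0.65, 250))
print(efficiency_at_concentration(0.65, 60))
```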

“Functional devices were successfully produced, but notable key challenges were identified, particularly related to the intensity distribution of the diffractive optical element (DOE), the initial morphology of the indium islands, and process repeatability. Addressing these challenges in terms of material quality and process control is essential,” the team explained. “Once resolved, the LA-MOCVD method holds significant promise as a rapid and resource-efficient production technique for next-generation micro-concentrator photovoltaics.”

The new cell concept was presented in “CuInSe2-based micro-concentrator solar cells fabricated from In islands grown by laser-assisted MO-CVD,” published in Solar Energy Materials and Solar Cells. Scientists from Germany's University of Duisburg-Essen, the Leibniz Institute for Crystal Growth, the Federal Institute for Materials Research and Testing, Brandenburg University of Technology Cottbus-Senftenberg, and the engineering company Bestec have participated in the study.

  •  

Spin-flip emitters could control energy pathways in singlet fission solar cells

Japanese researchers developed a molybdenum-based spin-flip emitter that efficiently harvests triplet excitons from singlet-fission tetracene dimers, producing strong near-infrared emission. This approach could boost solar cell efficiency and enable new quantum technologies by converting otherwise “dark” excitons into usable light.

A research team at Kyushu University in Japan has reported a breakthrough that could steer photovoltaic technology past long‑standing efficiency barriers by harnessing a quantum process known as singlet fission (SF).

Singlet exciton fission is an effect seen in certain materials whereby a single absorbed photon can generate two electron-hole pairs in a solar cell rather than the usual one. The effect has been observed since the 1970s, and though it has become an important research area for some of the world’s leading institutes over the past decade, translating it into a viable solar cell has proved complex.

Singlet fission solar cells can produce two electrons from one photon, making the cell more efficient. This happens through a quantum mechanical process where one singlet exciton (an electron-hole pair) is split into two triplet excitons. By pairing SF with a specially designed spin‑flip molybdenum‑based complex, the scientists demonstrated energy conversion and harvesting in solution with an effective quantum yield of around 130%.
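A quantum yield above 100% simply means more than one usable excitation per absorbed photon, with two as the singlet-fission ceiling. A minimal bookkeeping sketch; the 130% figure is from the study, while the function itself is illustrative:

```python
def effective_quantum_yield(excitations_harvested, photons_absorbed):
    """Usable excitations per absorbed photon; singlet fission allows > 1."""
    return excitations_harvested / photons_absorbed

# Ideal singlet fission: one photon splits into two triplet excitons
ideal = effective_quantum_yield(2, 1)         # 2.0, i.e. a 200% ceiling

# The reported solution-phase result: ~1.3 excitations per photon
reported = effective_quantum_yield(130, 100)  # 1.3, i.e. ~130%
print(ideal, reported)
```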

“The applications of this work in solar cells will require integrating singlet-fission (SF) materials with spin-flip emitters in solid-state systems,” Nobuo Kimizuka, lead author of the study, told pv magazine. “As fundamental research, our first step is to develop high-efficiency SF and spin-flip emitters with well-controlled energy levels and luminescence quantum yields in solid-state environments, and then evaluate the performance of these integrated systems.”

“We are actively working on building a higher-performance solid-state system,” he added. “Achieving robust performance in solid-state solar cells remains a challenge, but we expect the efficiency to surpass that of conventional SF technology alone. This approach, which multiplies photons and converts otherwise ‘dark’ triplet excitons into light, could open the door to new quantum technologies such as quantum sensors and exciton circuits, while also contributing to the design of next-generation quantum materials.”

The team developed a molybdenum-based spin-flip emitter that selectively captures the energy of triplet excitons before they dissipate. Its molecular design allows electron spin to flip during near-infrared (NIR) light absorption or emission, enabling more efficient harvesting of the multiple excitons generated by singlet fission.

Further analysis showed that sensitization efficiency depends heavily on the structure of the linker connecting tetracene units. The linker dictates not only the spatial arrangement and electronic coupling of the chromophores but also the exchange interaction within the correlated triplet pair. Variations in linker length, rigidity, and conjugation can significantly affect the rate and yield of triplet energy transfer to the spin-flip emitter, influencing both efficiency and the dynamics of the singlet fission process.

“The methodology we developed for assessing doublet yields provides a practical way to estimate triplet yields of SF dimers, even in systems with complex energy-transfer pathways involving both correlated and free triplets,” Kimizuka explained. “Reducing losses from correlated triplet-pair recombination requires either rapid separation into long-lived multiexcitons or faster triplet transfer to an acceptor molecule, achievable through careful energy-level design in oligomers or solid-state structures.”

“With a versatile selection of central metals, including chromium, molybdenum, and vanadium, and tunable ligands informed by Tanabe–Sugano diagrams and ligand-field theory, spin-flip emitters show strong potential as NIR-emitting materials for efficient triplet extraction, especially with recent advances in air-stable designs,” he added.

The interface design will be critical for converting triplet excitons generated by tetracene singlet fission into charge carriers on the silicon solar cell surface. “In SF-sensitized silicon cells, one major source of energy loss is transfer from the SF molecule to silicon via its excited singlet state,” Kimizuka noted. “Our proof-of-concept method blocks these loss pathways, enabling selective extraction of the excited triplet states originating from singlet fission.”

The research findings are available in the study “Exploring Spin-State Selective Harvesting Pathways from Singlet Fission Dimers to a Near-Infrared-Emissive Spin-Flip Emitter,” published in the Journal of the American Chemical Society.

  •  

Swiss startup offers lifetime guarantee for second-life batteries

Swiss startup Evolium Technologies’ subscription-based business model offers residential battery owners a lifetime guarantee on second-life batteries. The startup tests and remotely monitors each battery cell so it can alert customers when a cell is underperforming.

From ESS News

Established in 2024 and backed by the Swiss Innovation Association, Evolium Technologies is a Swiss second-life battery startup with its own unique approach to battery recycling. It’s a modular approach, as Evolium’s founder and CEO, Alexandre Staub, told ESS News.

Evolium runs a subscription-based module exchange program, whereby mostly residential customers can exchange old, used modules for fully functional second-life modules. All cells used in its batteries are tested in-house, which Staub said is another unique selling point, as cell testing is an area where many second-life battery providers tend to struggle.

“Most of our team are robotic experts, and they develop robots more than they develop batteries,” he said, explaining that the team develops robots to test the cells at scale. “The robots are fairly cheap, and they are able to execute this task of testing the cells and assessing which cell can go into a second life and which cell cannot.”

Evolium mostly works with INR18650 cylindrical cells; once these cells have passed testing, they can be reassembled into second-life batteries.

To continue reading, please visit our ESS News website.

 

  •  

TNO unveils 12.4%-efficient perovskite solar tile

The Dutch research institute has presented what it describes as the world’s first perovskite-based roof tile, achieving up to 13.8% efficiency on standalone modules and 12.4% when installed on a curved surface. The flexible modules were produced using TNO’s experimental roll-to-roll platform.

The Netherlands Organization for Applied Scientific Research (TNO) has today unveiled a building-integrated photovoltaic (BIPV) tile based on perovskite solar cell technology.

The new product is billed as the world's first perovskite solar tile.

“This demonstrator is supported by the Province of North Brabant through the project ‘Solar manufacturing industry to Brabant, Solliance 2.0’. Additional funding was received from the European Union’s Horizon Europe programme for the Luminosity project,” TNO said in a statement. “The work was also partly funded by the National Growth Fund programme SolarNL.”

The Dutch research institute partnered with Netherlands-based BIPV specialist Asat BV in deploying 10 cm x 10 cm perovskite solar modules built on flexible foil onto a curved composite roof tile. Testing indicates that bending the modules to fit the curved surface has minimal impact on their performance.

Standalone modules reached energy conversion efficiencies of up to 13.8%, while the modules retained an efficiency of 12.4% after installation on the curved roof tile.

The experimental production line used to encapsulate the solar tiles

Image: TNO

The perovskite modules were encapsulated using an experimental roll-to-roll manufacturing platform developed by TNO itself. Roll-to-roll manufacturing – similar to the process used in newspaper printing – enables continuous production of solar cells on long rolls of flexible material. The technique is widely seen as a potential pathway to lower production costs and high-volume manufacturing for emerging thin-film technologies such as perovskites.

More technical details about the solar tile were not disclosed. TNO said it will be commercialized by its spinoff Perovion Technologies, which was launched last month. 

TNO's recent research on perovskite solar cells includes developing roll-to-roll and spatial atomic layer deposition (SALD) processes for the deposition of functional materials, solar cell layers, and flexible foils.

In July, Solarge, a manufacturer of lightweight silicon PV modules based in the Netherlands, and TNO unveiled a 32 cm x 34 cm lightweight prototype perovskite solar panel.

A month earlier, Japan’s Sekisui Solar Film, part of Sekisui Chemical, the Brabant Development Agency (BOM), which serves the Dutch province of Noord-Brabant, and TNO signed a letter of intent in Osaka, Japan to explore collaboration related to flexible perovskite solar PV module technologies.

As pv magazine has reported, Sekisui Solar Film is developing technology for lightweight, flexible perovskite solar module manufacturing using an advanced roll-to-roll process. It is working on a 100 MW plant in Japan for large-scale production, is undertaking field demonstrations, and signed a perovskite solar-related memorandum of understanding with Slovakia.

 

  •  

The impact of microclimate effects on floating PV plants

French researchers have developed a high-resolution computational framework to model microclimate effects of large floating solar PV systems, enabling accurate predictions of heat transfer, ambient temperatures, and water evaporation based on panel configuration and wind conditions. The model can inform assessments of thermal performance and environmental impacts, and help optimize designs for utility-scale floating PV, as well as ground-mounted and agrivoltaic installations.

French researchers have developed a framework to model microclimate effects of large-sized floating PV systems.

The new model can be used to determine wind-dependent convective heat transfer coefficients (CHTC) and ambient temperatures, and to estimate evaporation patterns in partially covered bodies of water for a variety of tilt angles, module heights, and pitch distances.

“The main novelty of this work lies in the numerical methodology we developed, specifically an upscaling method to quantify panel-atmosphere interactions at the module scale and then model the micrometeorology at the power plant scale with a relatively fine resolution of about 4 meters,” Baptiste Amiot, corresponding author of the research, told pv magazine, adding that the resolution is significantly higher than others in this field.

“Applying this methodology enables us to map the thermal performance across utility-scale installations and to provide insights into local environmental effects, such as evaporative losses,” he said.

The precursor model is geometrically adaptable: it can handle various tilt angles, mounting heights, and inter-row spacings, according to Amiot. “It is particularly well-suited for large-scale installations exposed to sufficiently windy conditions,” he added.

The researchers used a computational fluid dynamics (CFD) precursor model, a microclimate CFD model supporting the PV parameterization, and an experimental survey. A wind-tunnel setup typical of a land-based application was used to confirm accuracy of altitude-based wind profiles.

In addition, a geometrical layout of a commercial floating PV (FPV) installation was used for the atmosphere boundary layer parameters. The wind direction effects were assessed using the microclimate CFD model that reproduced the localized conditions of the commercial FPV array.

“The atmospheric component is fundamentally similar to regional climate models (RCMs) but deploying it within a CFD framework offers advantages in terms of surface element parameterization and the spatial discretization we can achieve,” said Amiot.

Findings included temperature gradients ranging between 1.3 C/km and 5.8 C/km, and that headwinds and tailwinds relative to the front surface of the PV modules generate the greatest turbulence levels. Furthermore, the team was able to investigate how turbulent flows influence water-saving gains based on PV coverage of the water surface.

Assessing the results, the researchers noted that the precursor method “readily determines” heat transfer coefficient correlations as a function of wind speed and direction. “This is essential to obtain the thermal U-values that govern panel cooling,” added Amiot.
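Wind-dependent heat transfer correlations of this kind are typically fitted as linear or power-law functions of wind speed. The paper's own coefficients are not given in the article, so the sketch below uses the classic McAdams flat-plate form purely to illustrate how such a correlation feeds a thermal U-value:

```python
def chtc_linear(wind_speed, a=5.7, b=3.8):
    """Illustrative convective heat transfer coefficient, h = a + b*v [W/(m^2 K)].

    The default coefficients are the textbook McAdams flat-plate values,
    NOT the direction-dependent correlations derived in the study.
    """
    return a + b * wind_speed

# Stronger wind -> larger coefficient -> stronger convective panel cooling
for v in (1.0, 3.0, 5.0):
    print(f"v = {v} m/s -> h = {chtc_linear(v):.1f} W/m2K")
```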

The model can be extended to model large ground-mounted systems and agrivoltaics, including dynamic configurations where panels adjust orientation throughout the day, according to Amiot. It is suitable for inland and nearshore FPV, but not offshore FPV.

The work is detailed in “Boundary-layer parameterization for assessing temperature and evaporation in floating photovoltaics at the utility-scale,” published in Renewable Energy. Research participants include Ecole nationale des ponts et chaussees, Electricité de France R&D, and Université Claude Bernard.

The researchers are currently focused on developing CFD models to predict both the energy output and environmental trade-offs of dual-use photovoltaic systems, and on FPV evaporation research at finer spatial scales coupled with in-situ measurements. They are also working on an agrivoltaics CFD-plant model to predict crop response below PV canopies.

  •  

Solar-plus-storage for data centers: not a simple switch

Renewables and storage could reliably power data centers, but success requires active grids, coordinated planning, and the right mix of technologies. Hitachi Energy CTO, Gerhard Salge, tells pv magazine that holistic approaches ensure technical feasibility, economic viability, and energy system resilience.

As data centers grow in size and complexity, supplying them with cheap and reliable power has never been more pressing. Gerhard Salge, chief technology officer (CTO) at Hitachi Energy, a unit of Japanese conglomerate Hitachi, shed light on the relationship between renewable energy and data center operations, noting that while technically feasible, success requires careful planning, the right infrastructure, and a holistic approach.

“When we look at what's happening in the grids, then renewables are an active element on the power generation side, and the data centers are an active element on the demand side,” Salge told pv magazine. “What you need in addition to that is in the dimensions of flexibility, for which we need storage and a grid that can actively act also here in order to bring all these elements together.”


According to Salge, the key is active grids, not passive systems that simply react to conditions. With more renewables, changing demand patterns, new load centers, and storage options like batteries and existing facilities such as pumped hydro, it is crucial to coordinate these resources actively to maintain supply security, power quality, and cost optimization.

“But when you talk about the impact and the correlation between renewables and data centers, you need always to consider this full scope of the flexibility in a power system of all the elements—demand side, generation side, storage side, and the active grid in between,” he said, noting that weak or congested grids would not serve this purpose.

AI data centers

Salge warned that not all data centers are the same. “There are conventional data centers and AI data centers,” he said. “Conventional data centers are essentially high-load systems with some fluctuations on top. They contain many processors handling requests—from search engines or other applications—so the workload is distributed stochastically across them. This creates a baseline load with random ups and downs, which is the typical load pattern of a conventional data center.”

AI workloads, in contrast, rely heavily on GPUs or AI accelerators, which consume significant power continuously. Unlike conventional data centers, AI data centers often run at sustained high load, sometimes close to maximum capacity for long periods.

Hitachi Energy CTO Gerhard Salge

Image: Hitachi Energy

“AI data centers are specifically good in doing parallel computing,” Salge explained. “So many of them are triggered with the same demand pattern at the same time, which creates these spikes up and down in the demand profile, and they come in parallel all together.”

These fluctuations challenge both the power supply and the voltage and frequency quality of the connected grid. “So, you need to transport active power from an energy storage system or a supercapacitor to the demand of the AI data center. And that then needs to involve really the control of the data center’s active power. What you need is the interaction between the storage unit and then the AI data center to provide active power or to absorb it afterwards when the peak goes down. That can be also done by a supercapacitor.”

Batteries can store much more energy than supercapacitors, but the latter can cycle smaller amounts of energy far more frequently. “However, if you put in a battery that is smaller than the load, and you really need to cycle the battery through its full capacity, the battery will not survive very long with your data center. Because the frequency of these bursts is so high, you are aging the battery very, very quickly, so supercapacitors can do more cycles,” Salge emphasized.
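A back-of-the-envelope cycle budget shows why: if every AI burst were to consume one full-depth cycle of an undersized battery, calendar life falls linearly with burst frequency. All numbers below are illustrative, not from Hitachi Energy:

```python
def battery_life_days(rated_cycles, bursts_per_hour, hours_per_day=24):
    """Days until the cycle budget is spent, assuming (pessimistically)
    that each burst consumes one full-depth charge/discharge cycle."""
    return rated_cycles / (bursts_per_hour * hours_per_day)

# A 5,000-cycle battery facing one full cycle per minute lasts days, not years
print(battery_life_days(5_000, bursts_per_hour=60))      # ~3.5 days
# A supercapacitor bank rated for ~1,000,000 cycles fares far better
print(battery_life_days(1_000_000, bursts_per_hour=60))  # ~694 days
```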

He also noted that batteries and supercapacitors are both mature technologies, but the optimal setup—whether one, the other, or a combination with traditional capacitors—depends on storage size, number of racks, voltage levels, and overall system design.

Managing AI training bursts

Salge stressed the importance of complying with grid codes across geographies. “You need to become a good citizen to the power system,” he said. “You have to collaborate with local utilities to make sure that you are not infringing the grid codes and you are not disturbing with the data center back into the grid. A good way to do this, when renewables and data centers are co-located, is to manage renewable energy supply already inside the data center territory. Moreover, having a future-fit developed grid is a clear advantage. Because you have much more of these flexibility elements and the active elements to manage storage and renewable integration and to manage the dynamic loads of the data centers.”

If the grid is not future-fit with modern, actively operating equipment, operators will see significantly more stress. “With holistic planning, instead, you can even use some of the data center flexibility as a controllable and demand response kind of feature,” Salge said, adding that data center operators could coordinate AI training bursts to periods when the power system has more available capacity. This makes the data center a predictable, controllable demand, stressing the grid only when it is prepared.

“In conclusion, regarding technical feasibility: yes, it’s possible, but it requires the right configuration,” Salge said.

Economic feasibility

On economics, Salge believes solar and wind remain the cheapest power sources, even when accounting for the grid flexibility needed to integrate them with data centers. Solar is fastest to deploy, wind complements it well, and both can be scaled in parallel.

“Any increase in data center demand requires investment, whether from renewables or conventional power. Economics depend on the market, and market mechanisms, regulations, and technical grid planning are interconnected, influencing energy flow, pricing, and system stability,” he said.

“We recommend developers to work with all stakeholders—utilities, technology providers, and planners—from the start to ensure reliability, affordability, and social acceptance. Holistic planning avoids reactive fixes and leads to better long-term outcomes,” Salge concluded.

  •  

Reducing PV module temperature with leaf vein–inspired fins

Researchers in Iraq have developed biomimetic leaf vein–inspired fins for photovoltaic panels, with reticulate (RET) venation reducing panel temperature by 33.6 C and boosting efficiency by 18% using passive cooling. Their study combines 3D CFD simulations and electrical evaluations to optimize fin geometry, offering a sustainable alternative to conventional cooling methods.

A research group from Iraq’s Al-Furat Al-Awsat Technical University has numerically investigated the thermal and electrical performance of PV panels integrated with leaf vein–inspired fins. They have simulated four types of venation used by plants, namely pinnate venation (PIN), reticulate venation (RET), parallel venation along the vertical axis (PAR-I), and parallel venation along the horizontal axis (PAR-II).

“The key novelty of our research lies in introducing and systematically optimizing biomimetic leaf vein–inspired fin geometries as passive heat sinks for photovoltaic panels,” corresponding author Yasser A. Jebbar told pv magazine. “While conventional cooling approaches rely on simple straight fins, fluids, or active systems, our study is among the first to directly translate natural leaf venation patterns—particularly RET structures—into manufacturable backside fins specifically tailored for PV thermal and electrical performance.”

The team combined detailed 3D computational fluid dynamics (CFD) modeling with electrical efficiency analysis to identify geometries that maximize heat dissipation without additional energy input or water consumption. Next steps include experimental validation of the leaf vein fin designs under real outdoor conditions, particularly in hot climates.

The simulated PV panel consisted of five layers: glass, two ethylene-vinyl acetate (EVA) layers, a solar cell layer, and a Tedlar layer, with a copper heat sink and fins attached. All fin configurations were initially 0.002 m thick, 0.03 m high, and spaced 0.05 m apart. Panels measured 0.5 m × 0.5 m, with a surrounding air velocity of 1.5 m/s and incident irradiance of 1,000 W/m².

RET fins outperformed all other designs, reducing operating temperature by 33.6 C and increasing electrical efficiency from 12.0% to 14.19%, an 18% relative improvement, compared to uncooled panels.

“This temperature reduction rivals, and in some cases exceeds, water-based or hybrid cooling methods, despite relying solely on passive air cooling,” Jebbar noted. The study also found that fin height has a greater impact on cooling performance than fin spacing or thickness.

The team further optimized the RET fins, varying spacing from 0.02–0.07 m, height from 0.02–0.07 m, and thickness from 0.002–0.007 m. The optimal geometry—0.03 m spacing, 0.05 m height, and 0.006 m thickness—achieved the maximum 33.6 C temperature reduction and 18% efficiency gain.
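Those figures are consistent with the standard linear temperature model for c-Si efficiency, η(T) = η_ref·[1 − β·(T − T_ref)]. The sketch below back-calculates the implied relative temperature coefficient β from the paper's numbers; β itself is our derived estimate, not a value stated in the study:

```python
def relative_gain(eta_cooled, eta_uncooled):
    """Relative efficiency improvement from cooling."""
    return (eta_cooled - eta_uncooled) / eta_uncooled

def implied_temp_coeff(eta_cooled, eta_uncooled, delta_t):
    """Relative temperature coefficient implied by a cooling result, per C."""
    return relative_gain(eta_cooled, eta_uncooled) / delta_t

gain = relative_gain(14.19, 12.0)             # ~0.18, the quoted ~18%
beta = implied_temp_coeff(14.19, 12.0, 33.6)  # ~0.0054 per C (derived estimate)
print(gain, beta)
```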

The novel cooling technique was described in “Improving Thermal and Electrical Performance of PV Panels Using Leaf Vein Fins,” published in Solar Energy. Researchers from Iraq’s Al-Furat Al-Awsat Technical University, University of Kerbala, and Sweden’s University of Gävle have participated in the study.

  •  

UNSW researchers identify new damp heat-induced failure mechanism in TOPCon solar modules

UNSW researchers identified a new damp-heat degradation mechanism in TOPCon modules with laser-fired contacts, driven primarily by rear-side recombination and open-circuit voltage loss rather than series-resistance increase. The study highlights that magnesium in white EVA encapsulants accelerates degradation, guiding improved encapsulant and backsheet selection for more reliable modules in humid environments.

A research team from the University of New South Wales (UNSW) has identified a new damp heat-induced degradation pathway in TOPCon modules fabricated with laser-assisted fired contacts.

“Unlike earlier studies dominated by series-resistance increase, the primary degradation driver here is a reduction in open-circuit voltage, linked to enhanced rear-side recombination,” the research's lead author, Bram Hoex, told pv magazine. “The new degradation mechanism emerged under extended damp-heat (DH) exposure.”

The scientists conducted their analysis on 182 mm × 182 mm TOPCon cells fabricated in 2024 with laser-assisted firing.

The TOPCon solar cells employed a boron-doped p⁺ emitter, along with a front-side passivation stack consisting of unintentionally grown silicon dioxide (SiOₓ), aluminium oxide (Al₂O₃), and hydrogenated silicon nitride (SiNₓ:H), capped with a screen-printed H-pattern silver (Ag) contact grid. On the rear side, the structure comprised a SiO₂/phosphorus-doped n⁺ polycrystalline silicon/SiNₓ:H stack, also contacted by a screen-printed H-pattern Ag grid.

The researchers encapsulated the cells with different bills of materials (BOMs): two types of ethylene vinyl acetate (EVA); two types of polyolefin elastomer (POE); and one type of EVA-POE-EVA (EPE). They also used commercial coated polyethylene terephthalate (PET) composite (CPC) backsheets.

“The mini modules were laminated at 153 C for 8 min under standard industrial lamination conditions,” the academics explained. “All modules underwent DH testing at 85 C and 85% relative humidity (RH) in an ASLi climate chamber for up to 2,000 h to study humidity-induced failures.”

Schematic of the TOPCon solar cells and modules

Image: UNSW, Solar Energy Materials and Solar Cells, CC BY 4.0

The tests showed that maximum power losses ranged from 6% to 16%, depending strongly on the encapsulation BOM.

“The modules with POE on both sides were the most stable at around 8%, while those using white EVA on the rear side, especially in combination with EPE, showed the largest losses at around 16%,” said Hoex. “The primary driver of the degradation was a reduction in open-circuit voltage rather than the increased series resistance after DH testing, which diverges from previous findings that predominantly attributed DH-induced degradation to metallisation corrosion.”
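The distinction matters because module power factors as P = Voc × Isc × FF, so a loss can be attributed to whichever factor actually dropped. A minimal sketch with hypothetical cell values (not from the paper), showing how a Voc-only drop maps one-to-one into power loss:

```python
def power(voc, isc, ff):
    """Maximum power as the product of Voc, Isc, and fill factor."""
    return voc * isc * ff

def relative_power_loss(before, after):
    """Fractional power loss between two (voc, isc, ff) states."""
    return (power(*before) - power(*after)) / power(*before)

# Hypothetical cell: 0.72 V, 13.5 A, FF 0.82; apply an 8% Voc drop only
before = (0.72, 13.5, 0.82)
after = (0.72 * 0.92, 13.5, 0.82)
print(relative_power_loss(before, after))  # ~0.08, i.e. ~8% power loss
```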

The research team explained that higher levels of degradation were attributable to additives containing magnesium (Mg) in white EVA, which migrate under DH, hydrate, and create an alkaline micro-environment. “This alkaline chemistry corrodes the rear SiNx passivation layer, increases interfacial hydrogen concentration, induces local pinhole-like defects, and raises dark saturation current, ultimately reducing open-circuit voltage,” Hoex emphasized.

The scientists also explained that, although Mg in white EVA encapsulants and its role in acetic acid–induced degradation was previously reported, the effect of MgO on performance degradation in TOPCon modules was not explicitly studied.

Their findings are available in the paper “A novel damp heat-induced failure mechanism in PV modules (with case study in TOPCon),” published in Solar Energy Materials and Solar Cells.

“We hope this work helps refine encapsulant and BOM selection strategies for next-generation TOPCon modules, particularly for humid-climate deployment,” Hoex concluded. “It provides clear guidance for controlling Mg content in rear encapsulants and optimising rear-side passivation robustness. The mechanistic insights from this study have already informed upstream design changes, substantially reducing risk in commercial modules.”

Other research by UNSW showed the impact of POE encapsulants on TOPCon module corrosion, the effect of soldering flux on TOPCon solar cell performance, degradation mechanisms of industrial TOPCon solar modules encapsulated with ethylene vinyl acetate (EVA) under accelerated damp-heat conditions, as well as the vulnerability of TOPCon solar cells to contact corrosion and three types of TOPCon solar module failures that were never detected in PERC panels.

Furthermore, UNSW scientists investigated sodium-induced degradation of TOPCon solar cells under damp-heat exposure, the role of ‘hidden contaminants’ in the degradation of both TOPCon and heterojunction devices, and the impact of electron irradiation on PERC, TOPCon solar cell performance.

More recently, another UNSW research team developed an experimentally validated model linking UV-induced degradation in TOPCon solar cells to hydrogen transport, charge trapping, and permanent structural changes in the passivation stack.

  •  

TheStorage launches its first industrial-scale sand-based heat storage system

The Finnish start-up says its sand battery technology is scalable from 20 to 500 MWh with charging power from 1 to 20 MW, depending on industrial needs.

From ESS News

Finnish cleantech startup TheStorage says that its thermal storage technology could reduce industrial energy costs by up to 70% and cut carbon emissions by as much as 90%. The system converts renewable electricity into heat, stores it in sand, and delivers it on-demand for industrial heating.

The concept emerged in Finland in 2023, with engineering work beginning in 2024. In January 2026, TheStorage installed its first industrial-scale pilot at a brewery, putting the technology to the test in a real-world setting. There, it produces fossil-free steam for the brewery’s production lines.

“Producing steam without fossil fuels is a major step toward carbon-neutral production,” says Vesa Peltola, Production Director of the brewery.

TheStorage’s technology captures electricity when it is abundant and inexpensive, converts it into high-temperature heat, and stores it in sand. This stored heat can later be used in industrial processes independently of real-time electricity availability.

To continue reading, please visit our ESS News website.

  •  

Agrivoltaics can help lettuce survive extreme heat

Scientists have grown organic romaine lettuce under 13 different types of PV modules during an unusually hot Canadian summer. Their analysis showed lettuce yields increased by over 400% compared to unshaded control plants.

A research group from Canada’s Western University has investigated the performance of organic romaine lettuce, a heat-sensitive crop, under a broad range of agrivoltaic conditions. The test was conducted in London, Ontario, in the summer of 2025, during which 18 days had temperatures over 30 C.

“Our study explores how agrivoltaic systems can be tailored to optimize crop growth, especially under extreme heat conditions, while contributing to sustainable energy generation,” corresponding researcher Uzair Jamil told pv magazine.

“This becomes especially relevant in the context of climate change, where we are experiencing temperature extremes across the world,” Jamil added. “We examined the performance of organic romaine lettuce under thirteen different agrivoltaic configurations – ranging from crystalline silicon PV to thin-film-colored modules (red, blue, green) – in outdoor, high-temperature stress conditions.”

More specifically, the experiment included c-Si modules with transparency rates of 8%, 44% and 69%; blue c-Si modules with transparency of 60%, 70%, and 80%; green c-Si modules with transparency of 60%, 70%, and 80%; and red c-Si modules with transparency of 40%, 50%, 70%, and 80%.

All agrivoltaic installations had a leading-edge height of 2.0 m and a trailing-edge height of 2.8 m, and the modules were oriented southwards at 34°. Pots with organic romaine lettuce were placed under all configurations, along with three pots fully exposed to ambient sunlight without shading, used as controls.

In addition to measurements against the control, the group compared the results to the national average per-pot yield for 2022, a year with fewer high-temperature days that was therefore considered typical. Those data points were taken from agricultural census data, which later enabled the researchers to create nationwide projections of their results.

“Lettuce yields increased by over 400% compared to unshaded control plants, and 200% relative to national average yields,” Jamil said about the results. “60% transparent blue Cd-Te and 44% transparent crystalline silicon PV modules delivered the highest productivity gains, demonstrating the importance of both shading intensity and spectral quality in boosting plant growth.”

Jamil further added that if agrivoltaics were scaled up to protect Canada’s entire lettuce crop, they could add 392,000 tonnes of lettuce.

“That translates into CAD $62.9 billion (USD $46.6 billion) in revenue over 25 years,” he said. “If scaled across Canada, agrivoltaics could also reduce 6.4 million tonnes of CO2 emissions over 25 years, making it a key player in reducing the agricultural sector’s environmental footprint.”
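A quick back-of-envelope check unpacks the implied lettuce price behind these figures. This assumes the 392,000 tonnes is an annual addition sustained over the full 25-year horizon, which the article does not state explicitly:

```python
# Implied farm-gate price behind the quoted projection.
# Assumption: 392,000 t is added per year for all 25 years.
tonnes_per_year = 392_000
years = 25
revenue_cad = 62.9e9  # CAD over 25 years, as quoted

implied_price = revenue_cad / (tonnes_per_year * years)  # CAD per tonne
print(round(implied_price))  # ≈ 6418 CAD/t, i.e. roughly 6.4 CAD/kg
```

A price of about CAD 6.4/kg is in the plausible range for retail lettuce, suggesting the revenue figure is built on an annual-tonnage reading of the projection.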

The results of the research work were presented in “Enhancing heat stress tolerance in organic romaine lettuce using crystalline silicon and red, blue & green-colored thin film agrivoltaic systems,” published in Solar Energy.


Cubenergy releases energy storage block for utility, C&I applications

Cubenergy has launched FlexCombo 2.0, a scalable battery energy storage system for utility, commercial, and industrial applications, offering up to 16 MWh capacity with LFP batteries. Its modular design, advanced BMS, and cloud-based operations enable easy installation, seamless expansion, and efficient grid integration, according to the manufacturer.

Cubenergy, a Chinese manufacturer of battery energy storage systems (BESS), has introduced a new energy block designed for utility, commercial, and industrial (C&I) applications.

The product, named FlexCombo 2.0, uses the company’s 835 kWh FlexCombo D2 batteries. It is available in three configurations of 10, 12, or 19 batteries, providing total capacities of approximately 8 MWh, 10 MWh, or 16 MWh, respectively.

“With the FlexCombo D2 modular design and parallel architecture, FlexCombo’s core advantage lies in its long-term scalability,” the company said in a statement. “It enables seamless capacity growth and effortless integration with power generation systems (PGS), simplifying deployment and accelerating delivery for ultimate flexibility.”

The FlexCombo D2 batteries feature lithium iron phosphate (LFP) chemistry, offering a lifespan of 8,000 cycles at 70% capacity retention, according to the manufacturer.

Each battery measures 2 m x 1.68 m x 2.55 m and has a weight of up to eight tons. They carry an IP55 protection rating. Each block also comes with a power conversion system (PCS) rated at 430 kW AC with an IP66 protection grade. Optional medium-voltage (MV) transformers are available, with AC power ratings of either 8,800 kVA or 5,250 kVA.

“The FlexCombo 2.0 is designed primarily for utility and C&I applications, including renewable energy arbitrage, stand-alone grid stabilization, factories, and commercial buildings,” the company stated. “This integrated, easy-to-install BESS can be quickly connected and aligned with project requirements, while the advanced Active Balancing battery management system (BMS) and cloud-based operations provide a superior user experience.”


Study finds much lower-than-expected degradation in 1980s and 1990s solar modules

Researchers at SUPSI found that six Swiss PV systems installed in the late 1980s and early 1990s show exceptionally low degradation rates of just 0.16% to 0.24% per year after more than 30 years of operation. The study shows that thermal stress, ventilation, and material design play a greater role in long-term module reliability than altitude or irradiance alone.

A research group led by Switzerland's University of Applied Sciences and Arts of Southern Switzerland (SUPSI) has carried out a long-term analysis of six south-facing, grid-connected PV systems installed in Switzerland in the late 1980s and early 1990s. The researchers found that the systems’ annual power loss rates averaged 0.16% to 0.24%, significantly lower than the 0.75% to 1% per year commonly reported in the literature.

The study examined four low-altitude rooftop systems located in Möhlin (310m-VR-AM55), Tiergarten East and West in Burgdorf (533m-VR-SM55(HO)), and Burgdorf Fink (552m-BA-SM55). These installations use ventilated or building-applied rooftop configurations. The analysis also included a mid-altitude utility-scale plant in Mont-Soleil (1270m-OR-SM55) and two high-altitude, facade-mounted systems in Birg (2677m-VF-AM55) and Jungfraujoch (3462m-VF-SM75).

All systems are equipped with either ARCO AM55 modules manufactured by US-based Arco Solar, which was the world’s largest PV manufacturer with just 1 MW capacity at the time, or Siemens SM55, SM55-HO, and SM75 modules. Siemens became Arco Solar’s largest shareholder in 1990. The modules have rated power outputs between 48 W and 55 W and consist of a glass front sheet, ethylene-vinyl acetate (EVA) encapsulant layers, monocrystalline silicon cells, and a polymer backsheet laminate.

The test setup included on-site monitoring of AC and DC power output, ambient and module temperatures, and plane-of-array irradiance measured using pyranometers. Based on site conditions, the researchers classified the installations into low-, mid-, and high-altitude climate zones.

“For benchmarking purposes, two Siemens SM55 modules have been stored in a controlled indoor environment at the Photovoltaic Laboratory of the Bern University of Applied Sciences since the start of the monitoring campaign,” the researchers said. They also applied the multi-annual year-on-year (multi-YoY) method to determine system-level performance loss rates (PLR).
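The multi-YoY idea can be sketched in a few lines: compare each measurement with measurements taken one or more whole years earlier, annualize each relative change, and take the median of the resulting distribution, which suppresses seasonal effects and outliers. The code below is an illustrative sketch on synthetic data, not the authors' implementation; tools such as NREL's RdTools provide full-featured versions of year-on-year degradation analysis.

```python
import numpy as np
import pandas as pd

def multi_yoy_plr(perf_ratio: pd.Series, max_offset_years: int = 3) -> float:
    """Estimate the performance loss rate (% per year) with a
    multi-annual year-on-year approach: compare each daily value
    with the value 1, 2, ... years earlier and take the median of
    all annualized changes. Sketch only; details are assumptions."""
    rates = []
    for k in range(1, max_offset_years + 1):
        past = perf_ratio.copy()
        past.index = past.index + pd.Timedelta(days=365 * k)  # ~k-year shift
        pair = pd.concat({"now": perf_ratio, "past": past}, axis=1).dropna()
        # Annualized relative change for every pair, in % per year
        rates.append(100.0 * (pair["now"] / pair["past"] - 1.0) / k)
    return float(pd.concat(rates).median())

# Synthetic check: a system losing 0.2% of its performance ratio per year
idx = pd.date_range("2000-01-01", periods=5 * 365, freq="D")
days = np.arange(len(idx))
pr = pd.Series(0.85 * (1 - 0.002) ** (days / 365), index=idx)
print(round(multi_yoy_plr(pr), 3))  # ≈ -0.2
```

Taking the median across all year offsets, rather than fitting a single trend line, is what makes the estimate robust to soiling events, sensor drift, and short data gaps.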

The results show that PLRs across all systems range from -0.12% to -0.55% per year, with averages between -0.16% and -0.24% per year, well below typical degradation rates reported for both older and modern PV systems. The researchers also found that higher-altitude systems generally exhibit higher average performance ratios and lower degradation rates than comparable low-altitude installations, despite exposure to higher irradiance and ultraviolet radiation.

The study further revealed that modules of the same nominal type but with different internal designs show markedly different degradation behaviour. Standard SM55 modules exhibited recurring solder bond failures, leading to increased series resistance and reduced fill factor. By contrast, SM55-HO modules benefited from a modified backsheet design that provides higher internal reflectance and improved long-term stability.

Overall, the findings indicate that long-term degradation in early-generation PV modules is driven primarily by thermal stress, ventilation conditions, and material design, rather than altitude or irradiance alone. Modules installed in cooler, better-ventilated environments demonstrated particularly stable performance over multiple decades.

The test results were presented in the paper “Three decades, three climates: environmental and material impacts on the long-term reliability of photovoltaic modules,” published in EES Solar.

“The study identified the bill-of-material (BOM) as the most critical factor influencing PV module longevity,” the researchers concluded. “Despite all modules belonging to the same product family, variations in encapsulant quality, filler materials, and manufacturing processes resulted in significant differences in degradation rates. Early-generation encapsulants without UV stabilisation showed accelerated ageing, while later module designs with optimised backsheets and improved production quality demonstrated outstanding long-term stability.”
