“Solar Score” — With precise predictions we help you to get the most out of your solar plant.

TechLabs Ruhr
8 min read · May 1, 2022

This project was carried out as part of the TechLabs “Digital Shaper Program” in Dortmund (winter term 2021/22). “Solar Score” was awarded as the best project in the winter term 2021/22.

In a nutshell:

Our project helps solar plant owners use their solar plants more efficiently. Based on the next days’ weather forecast, solar plant owners get a prediction of their plant’s power output. The prediction has hourly resolution, which helps in accurately planning when to use energy-intensive devices, e.g., charging an electric car.


Solar plants are becoming increasingly popular with homeowners and are a great way to generate sustainable energy. However, storing energy is still very costly for private households. Hence, the energy that is not immediately consumed is fed into the power grid, which is less energy- and cost-efficient than using the energy directly. The idea was to find a way to use a solar plant more effectively so that the owner gets the most benefit from it.

One way to do that is to predict the plant’s future power output. However, accurate predictions are often not available to the user, as current predictions are rather general and often do not take the exact location of the solar plant into account. Our approach was to use a machine learning model to predict a solar plant’s power output for the next few days based on the local weather forecast, the exact location of the solar plant, and its maximum power output. To this end, we first used existing solar data of four household solar plants in the UK and the corresponding weather data for that period to train our prediction model. Then we fed the weather forecast for the next few days into this model to predict the upcoming power output. To make the predictions more precise, we included the solar plant’s exact location and maximum power output in our prediction model. The user-specific information is gathered from a website where the user has to register their solar plant. The website is also used to provide the prediction data to the user.


Our approach was threefold.

  1. We had to find suitable weather and solar plant data that we could use for our prediction model.
  2. We had to create a prediction model and train it.
  3. We had to build a website where the user could register and access the prediction.

After finding suitable solar plant and weather data [2], the data had to be cleaned using the data cleansing operations that Pandas offers. The weather data is fetched with the Python module wetterdienst, which retrieves data from the DWD (Deutscher Wetterdienst) [1]. The DWD provides weather forecast data for 5,400 weather stations. The idea is to use the solar plant owner’s address to request weather data from the nearest weather station. The free service MapQuest converts the address into geo-coordinates, and a wetterdienst function then finds the nearest weather station to those coordinates.
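wetterdienst handles the station lookup itself; conceptually, finding the nearest station reduces to a great-circle distance search over the station list. A minimal sketch of that idea (the station IDs and coordinates below are illustrative, not actual DWD data):

```python
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points in kilometres."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 6371.0 * asin(sqrt(a))

def nearest_station(lat, lon, stations):
    """stations: list of (station_id, lat, lon); returns the id of the closest one."""
    return min(stations, key=lambda s: haversine_km(lat, lon, s[1], s[2]))[0]

# Hypothetical stations, queried with geo-coordinates near Dortmund
stations = [
    ("S1", 51.30, 6.77),
    ("S2", 51.43, 6.97),
    ("S3", 51.13, 13.75),
]
print(nearest_station(51.51, 7.47, stations))  # → S2
```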

Machine learning model:

Time series prediction is a long-standing topic in economics and statistics. With the recent acceleration in artificial intelligence, especially in deep neural networks, forecasting of sequences has become possible with AI as well.

Compared to convolutional techniques for image processing, however, this approach is still relatively new, and we used a framework that is under active development: tsai.

After getting used to the library fastai during the deep learning track, we fell in love with tsai because it builds on top of the well-known features of fastai.

In our case, the problem structure presented itself as follows:

  • Predict solar power generation for up to 10 days (based on weather forecasting with 1-hour resolution)
  • The forecasting horizon will consist of 10x24=240 data points
  • Weather is described with 240xf data points (“f” being the number of weather features like temperature, humidity, etc.)

Based on this knowledge, we created training sets, so-called “batches”, that we could feed into the model. We therefore used the InceptionTimePlus model.

To keep the training routine as simple as possible, the given data sets had to be merged into a single training set.

Early tests revealed that the sliding-window approach used for creating batches fails for data spanning fewer than 24 days. During data merging, all intervals that do not meet this criterion were therefore dropped. Figure 1 shows four input weather variables along with the generated power and the data set identification used for preprocessing the batches. In the end, only nine sequences could be used, as can be seen from the number of steps in “unique_id”.
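The window-and-filter logic can be sketched in plain Python (this is a conceptual illustration, not the actual tsai preprocessing; window size, stride, and the 24-day minimum follow the numbers above):

```python
def sliding_windows(seq, window=240, stride=24, min_len=24 * 24):
    """Drop hourly sequences shorter than 24 days, then cut overlapping
    240-step (10-day) windows with a one-day stride."""
    if len(seq) < min_len:
        return []
    return [seq[i:i + window] for i in range(0, len(seq) - window + 1, stride)]

long_seq = list(range(25 * 24))   # 25 days of hourly values: usable
short_seq = list(range(10 * 24))  # only 10 days: dropped entirely

batches = sliding_windows(long_seq) + sliding_windows(short_seq)
print(len(batches))  # → 16
```

Only the long sequence contributes windows; the short one is filtered out even though it is longer than a single window.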

Figure 1: Preprocessed training data

More than 5,000 single batches (240 timesteps, 4 inputs, 1 output) were created and randomly assigned to training or validation. Each slice at the bottom of Figure 1 represents one concrete instance of the problem described above. The blue slices, the overall majority, are used for tuning the model parameters. The green ones are purely for validation, so the model does not see them during training.

After minimizing the mean absolute error, i.e., learning the necessary relationships between weather and solar power, we exported the model for later inference calls from the backend.
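For readers unfamiliar with the objective: mean absolute error is simply the average absolute deviation between predicted and actual power. A plain-Python sketch (the values below are illustrative, not actual model outputs):

```python
def mean_absolute_error(y_true, y_pred):
    """Average absolute deviation between actual and predicted power."""
    return sum(abs(t - p) for t, p in zip(y_true, y_pred)) / len(y_true)

actual    = [0.0, 1.2, 3.5, 2.8]   # kW generated over four hours
predicted = [0.1, 1.0, 3.0, 3.0]   # model forecast for the same hours

print(mean_absolute_error(actual, predicted))  # → 0.25
```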

Web Development:

The aim was to create a website where a user could register and get a precise prediction for their solar plant’s power output.

Figure 2: Branding

To do that, a frontend with a signup/login form and a dashboard was needed. Furthermore, we needed a backend for handling and storing the data.


For the frontend, we chose React, a commonly used JavaScript library for building fast and scalable website frontends. The React app was created with the help of a YouTube video by “JavaScript Mastery” [3]. Parts of the code were taken from the video; however, the structure and design were modified and extended to fit our needs. Our website consists of a landing page, a signup page, a login page, and a dashboard. For setting up the registration process and connecting it to the backend, a template was used [4].

Figure 3: Website Landing Page

A navigation bar is used to navigate between the first three of these pages. The dashboard is shown when the user signs up or logs in. On the dashboard page, the user sees the prediction values for their solar plant.

For this, we use Plotly.js, an open-source JavaScript library for easy data visualization. Additionally, a data table was created that lists the prediction data.

We used the React Data Grid from AG Grid to create the table. React Data Grid comes with various themes and is a great tool for building data tables in React.

The design of the website was customized via the React components’ .css files. However, the focus was on the functionality of the website rather than on the design.


For the backend, we chose Django, a Python web framework with a Model-View-Template architecture. The connection between frontend and backend was built from a template [5] for a simple authentication system. Login, signup, and logout were already part of the template, but modifications had to be made to connect our backend and frontend successfully, among other reasons because of a newer Django version and the connection with our React files.

The authentication process was extended so that, in addition to the e-mail address, the user’s name, address, and maximum solar plant output must be entered and saved in the Django database. The backend stores the user’s information, so we can access and use the data for our prediction model. Furthermore, the backend stores the output of our prediction model. The frontend takes the data from the backend and displays the user-specific prediction data on the website using Plotly.js and React Data Grid.
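One way the registered maximum plant output can make a prediction user-specific is by scaling a normalized model forecast to the plant’s size. A hypothetical sketch of that step (the function name, field meaning, and values are ours, not the project’s actual code):

```python
def scale_prediction(normalized_prediction, max_output_kw):
    """Scale an hourly prediction in the range 0..1 to a user's plant size,
    rounding to two decimals for display in the dashboard table."""
    return [round(p * max_output_kw, 2) for p in normalized_prediction]

# Hourly forecast fractions for a morning, for a plant registered with 6 kW
print(scale_prediction([0.0, 0.1, 0.35, 0.6], 6.0))  # → [0.0, 0.6, 2.1, 3.6]
```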


We have managed to develop a model that uses weather, location, and power data to calculate the potential power output over the next few days. The model is integrated into a website where the user can register by providing information about their solar plant’s maximum power and location. Using this information, a plot and a data table are generated that show the customized power output for the next days. The user can use this information to plan their power usage accordingly and thereby use the electricity of the solar plant more efficiently.

Figure 4: Prediction plot
Figure 5: Prediction table

Using the electricity themselves is not only more energy-efficient for the owner but also more cost-efficient. Often, the price received for feeding electricity into the grid is far lower than the price paid for purchasing it from the grid. In addition, our solution helps solar plant owners become more independent of external energy sources at a time when private storage solutions are still very cost-intensive and energy prices are rising.

A possible next step is to use the user’s actual solar plant data to train the model, so the predictions become even more precise as external factors such as site conditions are also considered. The long-term goal of the project could be an app that connects to smart home solutions, so that energy-intensive processes can be timed to start at times of high solar power production.

Video: Demonstration of the website

GitHub repository:

Team members:

Team mentors:

  • Luise Weickhmann, Web Development
  • Tobias Küper, Data Science