MEASURING ENERGY CONSUMPTION USING REMOTE LAB

The Remote Eco Lab project aims to provide a streamlined process for measuring software energy consumption remotely using a CI/CD pipeline. By automating the measurement process and integrating with the OSCAR tool, developers can make informed decisions to improve code efficiency and work towards software eco-certification.

Mentors

Volker Krause

Benson Muite

Nicolas Fella

Merge Request

The official repository of the project is: https://invent.kde.org/teams/eco/remote-eco-lab

As this project required a lot of CI testing and CI changes, most of the work was done in this test repository: https://invent.kde.org/drquark/kecolabtestci

Most of the changes related to testing on the CI are in: https://invent.kde.org/drquark/kecolabtestci/-/merge_requests/9

To download the latest artifacts and the final report: https://invent.kde.org/drquark/kecolabtestci/-/pipelines/457169

Blog Posts

Google Summer of Code: KEcolab

Timeline

Week 1 and Week 2

During the community bonding period, we set up a communication channel (https://matrix.to/#/#kde-eco-dev:kde.org) to foster open collaboration and maintain a public record of discussions. This also made it easier for other people to get involved in the project. We also set up regular internal meetups (every Thursday) and presented the project to the community at the monthly meetup. During weeks 1 and 2, with the help of my mentors, I set up the infrastructure: SSH keys, the Raspberry Pi, the lab test PC, etc. I worked on configuring the X server on the lab test PC to run GUI applications. I also started working on the Flatpak installation stage of the CI pipeline, and I wrote a blog post to present my project to the larger KDE community and help others get involved.

Week 3 and Week 4

I completed the Flatpak installation stage. During these two weeks, I started working on the energy measurement stage of the CI pipeline. I encountered some errors related to copying files (scp) from the repository to the lab test PC and to setting up environment variables. These were solved with the help of my mentors; the solution was mainly straightforward and amounted to running all the commands in a single SSH session. By the end of week 4, I was able to run and complete the energy measurement stage. We were also able to generate the artifacts, consisting of power meter readings and hardware readings, which are passed on to the next stage for analysis and report generation.

Week 5 and Week 6

I started working on the last stage of the measurement pipeline, i.e. the result stage. It required me to learn the R language, so during my 5th week I learned R and then worked on writing preprocessing scripts for the OSCAR tool (Open source Software Consumption Analysis in R, developed by students at Umwelt-Campus Birkenfeld). OSCAR is a very challenging tool to work with when it comes to formatting, so it was necessary to preprocess the measurement data before providing it to OSCAR. In the preprocessing script, I mainly worked on taking arguments from the user and on converting epoch timestamps into the date and time format OSCAR expects; a sketch of this logic is shown below. I also tried installing OSCAR and R on our Raspberry Pi, but the process requires a lot of resources (dependency and RAM issues), so it wasn't suitable; after discussing this with my mentors, I settled on using the GitLab CI pipeline for installing and running OSCAR.
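
To illustrate, here is a minimal sketch of what such a preprocessing step can look like in R. This is an assumption-laden example, not the actual KEcolab script: the script name, argument order, and the `epoch` column are hypothetical. It only demonstrates reading user arguments with `commandArgs()` and converting epoch timestamps into formatted date and time fields.

```r
#!/usr/bin/env Rscript
# Hypothetical preprocessing sketch -- not the actual KEcolab script.
# Usage: Rscript preprocess.R <input.csv> <output.csv>

args <- commandArgs(trailingOnly = TRUE)
if (length(args) != 2) {
  stop("Usage: Rscript preprocess.R <input.csv> <output.csv>")
}

data <- read.csv(args[1])

# Assume a hypothetical 'epoch' column of Unix timestamps and
# convert it into separate date and time columns for the analysis.
ts <- as.POSIXct(data$epoch, origin = "1970-01-01", tz = "UTC")
data$Date <- format(ts, "%d.%m.%Y")
data$Time <- format(ts, "%H:%M:%S")

write.csv(data, args[2], row.names = FALSE)
```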

Week 7 and Week 8

During these weeks, I modified the OSCAR script to use it in the GitLab CI pipeline. This involved fixing formatting issues: for example, there was an issue with the date format, where DD-mm-yy differs from dd-mm-yy, and this was causing an error in the final result. I debugged the code line by line and got help from Achim (who developed OSCAR) to fix these issues, as most of them caused the report to have NA values in various columns, making it difficult to render it into a report file. I also solved an issue where OSCAR was not able to convert the date and time into POSIX time, which resulted in a null timestamp. One thing worth noting is that if any one of the values turns out to be NULL, every column and row ends up as 0; this issue was similar to the date formatting one above (see the sketch below). Finally, after some rendering errors and debugging, I was able to generate the report by the end of week 8. This report was generated for Okular, an already tested application, so I modified the script to take a general input and generate a report for any application, and I was able to generate one for Kate.
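
As a minimal illustration of the failure mode described above (the format strings here are examples, not OSCAR's actual ones): when the format string passed to `as.POSIXct()` does not match the date text, R returns NA, and that NA then propagates into every value derived from it.

```r
# Illustrative only -- the format strings are examples, not OSCAR's.
stamp <- "18-08-23 20:26:00"

# Mismatched format string: parsing fails and yields NA,
# which then propagates into every column derived from it.
bad <- as.POSIXct(stamp, format = "%d.%m.%Y %H:%M:%S", tz = "UTC")
print(bad)   # NA

# Matching format string (%y = two-digit year): parses correctly.
good <- as.POSIXct(stamp, format = "%d-%m-%y %H:%M:%S", tz = "UTC")
print(good)  # "2023-08-18 20:26:00 UTC"
```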

Week 9 and Week 10

After testing the OSCAR R script locally on my system, I wrote the result stage for the GitLab CI pipeline and tested it using a tidyverse image, which already has most of the dependencies installed and is much more lightweight. The stage uses the updated OSCAR tool from the official repo: it takes the artifacts from the previous stage and runs OSCAR on them to generate the report. I encountered an issue with automating the date in the filename so that the artifact could be generated. The collectl output saves the file as `test1.csv-dd-mm-yy.tab`, so to download the file from the system I needed to know the current date and time and substitute it in. Since I needed to handle this on both sides, on the lab test system and in the GitLab CI pipeline, I defined variables on both ends: on the lab test system I copied the file to a new generic name, `test1.csv`, and in the GitLab pipeline I used a variable to export the artifact with a name that includes the date and time, `test1.csv-dd-mm-yy`. (We need this filename format because the preprocessing script takes the date and time from the filename; a sketch of that parsing is shown below.) I also used a wildcard pattern to successfully export the artifact, like `"test1.csv-joseph-esprimop957-*.tab.gz"`. Another issue was forwarding the artifacts from one job to the next; this was solved by adding the dependencies field to the next job and by first copying the artifacts into the required directory (if we `cd` into another directory, we lose the default directory where the artifacts are placed). Both of these issues were solved with the help of my mentors. After this, all three jobs ran successfully and I was able to generate an energy measurement report.
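
To illustrate how a preprocessing script can recover the date from such a filename, here is a small R sketch. The filename follows the `test1.csv-dd-mm-yy.tab` pattern described above, but the regular expression and the example filename are assumptions, not the actual KEcolab code.

```r
# Assumed layout: <name>.csv-dd-mm-yy.tab (as described above);
# the pattern below is an illustration, not the actual KEcolab code.
filename <- "test1.csv-18-08-23.tab"

# Extract the dd-mm-yy portion between ".csv-" and ".tab".
date_part <- sub(".*\\.csv-([0-9]{2}-[0-9]{2}-[0-9]{2})\\.tab$", "\\1", filename)

# Parse it into a Date object (%y = two-digit year).
measurement_date <- as.Date(date_part, format = "%d-%m-%y")
print(measurement_date)  # "2023-08-18"
```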

Week 11 and Week 12

For now, I am migrating the code from the test repository to the official Remote Eco Lab repository, and work is in progress on adding GitLab custom runners, which requires help from the sysadmin side of KDE. I am also documenting my work and writing a good README for future testers and contributors on the official repo.