Initialize CI for PullRequest Testing #4767
MatthiasBSchaefer wants to merge 1 commit into modelica:master from
Conversation
beutlich left a comment
The log is pretty verbose and it's not obvious what might be the reason for failure at first sight.
I have some ideas:
- Can we reduce the log verbosity, or make it configurable?
- Can we add problem matchers? See for example https://github.com/modelica/ModelicaStandardLibrary/blob/master/.github/moparser.json or https://oneuptime.com/blog/post/2026-01-30-github-actions-problem-matchers/view.
- Can we upload the comparison reports using actions/upload-artifact?
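A rough sketch of what the last two points could look like in a workflow step (the matcher path is taken from the linked example; the artifact name and report path are assumptions, since the actual report location lives on the LTX runner):

```yaml
# Sketch only: registers the Modelica problem matcher and uploads the
# comparison reports. The "path:" value is a placeholder assumption.
steps:
  - name: Register problem matcher
    run: echo "::add-matcher::.github/moparser.json"
  - name: Upload comparison reports
    uses: actions/upload-artifact@v4
    with:
      name: comparison-reports
      path: report/PR_comparison_report.html
```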
```yaml
name: PR_testing
on:
  # Triggers the workflow on push or pull request events but only for the "main" branch
  pull_request:

  # Allows you to run this workflow manually from the Actions tab
  workflow_dispatch:

# A workflow run is made up of one or more jobs that can run sequentially or in parallel
```
```suggestion
name: CI ReSim
on:
  pull_request:
  workflow_dispatch:
```
```yaml
# A workflow run is made up of one or more jobs that can run sequentially or in parallel
jobs:
  prepare:
    runs-on: self-hosted
```
```suggestion
    runs-on: [ self-hosted, linux, regression-testing ]
```
```yaml
  prepare:
    runs-on: self-hosted
    steps:
      - name: checkout_pr_repo
```
```suggestion
      - name: Checkout code
```
```yaml
  testrun_modelica:
    needs: prepare
    runs-on: self-hosted
```
```suggestion
    runs-on: [ self-hosted, linux, regression-testing ]
```
```yaml
    needs: prepare
    runs-on: self-hosted
    environment:
      name: link_modelica
```
```suggestion
      name: Build libraries
```
Not sure what is meant by "link".
```yaml
      name: link_modelica
      url: https://serv.ltx.de/prs/${{ github.event.number }}/Modelica/report/PR_comparison_report.html
    steps:
      - name: run modelica
```
```suggestion
      - name: Run tests
```
```yaml
    needs: prepare
    runs-on: self-hosted
    environment:
      name: link_modelicatest
```
```suggestion
      name: Build libraries
```
The environment is meant to create/show the links to the reports, so "Report modelicatest" might be a good name.
```yaml
      name: link_modelicatest
      url: https://serv.ltx.de/prs/${{ github.event.number }}/ModelicaTest/report/PR_comparison_report.html
    steps:
      - name: run modelicatest
```
```suggestion
      - name: Run tests
```
```yaml
    runs-on: self-hosted
    environment:
      name: link_modelica
      url: https://serv.ltx.de/prs/${{ github.event.number }}/Modelica/report/PR_comparison_report.html
```
Do not use hard-coded URIs for https://serv.ltx.de (here and in other places); instead, prefer variables via https://github.com/modelica/ModelicaStandardLibrary/settings/variables/actions. This way LTX can move its server while the workflow stays untouched.
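For illustration, with a repository variable (the name `LTX_SERVER_URL` is only an assumption) the environment URL could read:

```yaml
# Sketch: LTX_SERVER_URL would be defined under Settings → Secrets and
# variables → Actions → Variables; the variable name is an assumption.
environment:
  name: link_modelica
  url: ${{ vars.LTX_SERVER_URL }}/prs/${{ github.event.number }}/Modelica/report/PR_comparison_report.html
```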
```yaml
    runs-on: self-hosted
    steps:
      - name: checkout_pr_repo
        run: /home/resimuser/regression_testing/docker/work/run_scripts/prepare_pr.sh ${{ github.event.number }}
```
For the sake of transparency, can we add all the CI bash scripts to a new folder ReSim in .CI? Otherwise it will be pure magic for library officers. Usually, and as best practice, the CI scripts are maintained along with the repository under test.
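For illustration, if the entry scripts were moved into the repository as suggested, the step could call them relative to the checkout (the folder name is the proposed `.CI/ReSim`; the script name is taken from the current hard-coded runner path):

```yaml
# Sketch: assumes the entry script has been moved to .CI/ReSim in this repository
steps:
  - name: Checkout code
    uses: actions/checkout@v4
  - name: Prepare PR
    run: ./.CI/ReSim/prepare_pr.sh ${{ github.event.number }}
```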
@beutlich
Thank you for your detailed review!
1.) I know the log is pretty verbose, but it is not really meant to be analyzed. Finding errors (e.g. example Modelica.Blocks.Examples.xxx no longer running due to the changes in the PR) can be done in the published reports. The log is more or less there for us (LTX) to analyze possible errors in the workflow itself. We can reduce the logging once we are sure the workflow is stable (without any changes inside this repository).
2.) We could probably introduce problem matchers, but this may not be necessary due to 1.)
3.) We discussed this in the MAP-LIB meetings. Uploading artifacts would force users to download the reports (which are not that small), unzip them, and then open them in a browser. We want to avoid these steps by publishing the reports on our server, making them directly clickable.
4.) Thank you for the very good hint about using variables instead of hard-coded links!
5.) We can publish the main shell scripts called by the CI, but these mainly run the Modelica tools inside Docker containers. We won't publish our regression testing tool itself, nor the Docker files, for intellectual property reasons. So publishing the scripts is only partially helpful for transparency, and adding a further Modelica tool (e.g. SimulationX) to the regression testing would then require changing these scripts (and thus a change in the MSL).
6.) Your proposed naming of the jobs is fine for me.
This PR adds the yml file to .github/workflows in order to start the CI for testing PullRequests with the ltx-resim tool.
The infrastructure to run the CI tests is currently hosted and maintained by LTX.
See PRs #4725, #4763, #4764 and #4765, which are tests of the CI on a separate branch.