LQICM Benchmark

A repo containing the benchmark program used in the Loop Quasi-Invariant Code Motion (LQICM) project.

Contents

  • LICENSE.md: Legal stuff
  • README.md: This File!
  • benchmark.c: The benchmarking code in question.
  • CodeToTest Folder: Destination folder for all test code
  • Outputs Folder: Destination folder for generated benchmark results

Prerequisites

  • A computer/virtual machine running Linux
    • Note: Further testing is needed on other operating systems before I can confirm that it works on Windows/Mac
  • A C compiler (LLVM or GCC recommended)
  • Code that you want to benchmark

How To Use

Set Up

  1. Download/clone the repo to the desired location.
  2. Open the LQICM-Benchmark folder to make sure all the files from the Contents section are there.
  3. If desired, open the benchmark.c file in a text editor and change the parameters at the top of the program.
    • ITERATIONS - Number of times each file is benchmarked. (default: 20)
    • OUTPUTTYPE - The format in which the data is output: 0 = plain, 1 = table, 2 = CSV (default: 2)
    • OUTPUTTOGETHER - Specifies whether output will be in separate files or in one big file (default: true)
      • At this time, bulk output is only available in CSV format. This doesn't affect the summary output.
    • COMPILER - Specifies what C compiler you're using. (default: "clang-15")
    • COMPILERPATH - Specifies the filepath for the compiler above. (default: "../../../usr/bin/") A sketch of how COMPILERPATH, COMPILER, and OPTLEVEL fit together is shown after this list.
    • INPUTFOLDER - Specifies the filepath for the folder that will contain the C files that you want to benchmark (default: "./CodeToTest/")
    • OUTPUTFOLDER - Specifies the filepath for the folder that will contain all the output files that are generated by the benchmark (default: "./Outputs/")
    • OPTLEVEL - Specifies what optimization level your compiler will apply to the files being benchmarked (default: O0)
      • For more information on optimization levels, see this link
    • MAXFILENAME - The max number of characters in a filename (default: 256)
    • MAXFILEAMN - The max number of files that can be processed by the benchmark (default: 100)
  4. Compile benchmark.c.
    • Note: We recommend using clang since it comes with LLVM and is more efficient than gcc. Ultimately, you can use any C compiler you like.
    • Example using clang-15: clang-15 benchmark.c -o benchmark -lm.
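
To make COMPILER, COMPILERPATH, and OPTLEVEL more concrete, the sketch below shows roughly how they could be combined into the compile command that the benchmark runs for each test file. This is a simplified illustration under assumed behavior, not the actual code in benchmark.c; the helper compile_test_file and the exact command format are made up for the example.

//Simplified sketch (NOT the actual benchmark.c code) of how the parameters
//above could be combined into a compile command for one test file.
#include <stdio.h>
#include <stdlib.h>

#define COMPILER      "clang-15"
#define COMPILERPATH  "../../../usr/bin/"
#define OPTLEVEL      "O0"
#define INPUTFOLDER   "./CodeToTest/"

//Hypothetical helper: builds and runs something like
//  ../../../usr/bin/clang-15 -O0 ./CodeToTest/demo.c -o ./CodeToTest/demo -lm
static int compile_test_file(const char *name)
{
    char cmd[1024];
    snprintf(cmd, sizeof(cmd), "%s%s -%s %s%s.c -o %s%s -lm",
             COMPILERPATH, COMPILER, OPTLEVEL, INPUTFOLDER, name, INPUTFOLDER, name);
    return system(cmd);    //non-zero return means the compile step failed
}

int main(void)
{
    return compile_test_file("demo");    //e.g. compiles ./CodeToTest/demo.c
}

If a command built this way cannot be found by the shell, you will see the sh: 1: ... not found error described under Common Bugs below.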

Inserting Code

  1. In your file explorer, navigate to the CodeToTest folder.
  2. Insert your own test code as .c files in this folder. Each C file is timed separately, and its results are written to its own output file.
    • Note: Make sure your code compiles and runs as expected before attempting to benchmark it; otherwise, the benchmark results may be inaccurate. An example file is provided in the CodeToTest folder, and a minimal sketch of a possible test file is shown below.
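
If you do not yet have code of your own to test, a file along the lines of the sketch below would work. It is a hypothetical stand-in (not the example file shipped in CodeToTest): its loop repeatedly recomputes a value that never changes between iterations, which is the kind of quasi-invariant work that LQICM is meant to hoist.

//loop_test.c - hypothetical test file you could drop into CodeToTest
#include <stdio.h>

int main(int argc, char **argv)
{
    (void)argv;
    long n = 12345 + argc;                    //runtime value, so the compiler cannot fold it away at -O0
    long total = 0;
    for (long i = 0; i < 100000000L; i++) {
        long invariant = (n * n) % 1000;      //does not depend on i: quasi-invariant work LQICM could hoist
        total += invariant + i;
    }
    printf("%ld\n", total);                   //print the result so the loop is not discarded entirely
    return 0;
}

Adjust the iteration count if the run finishes too quickly or too slowly on your machine, and confirm the file compiles and runs on its own before adding it to CodeToTest.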

Running the Benchmark

  1. Make sure you're in the LQICM-Benchmark folder and run the benchmark using the following command: ./benchmark.
  2. View results in the Outputs folder.

Reviewing Results

  • Output files are named in the format [output|overall|summary] YYYY-MM-DD HHMMSS.txt. The name of the C program that was timed appears at the top of the file.
  • Runtimes are reported in seconds, with six decimal places of precision (see the timing sketch below).
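
For context, per-iteration numbers of this shape can be produced with simple wall-clock timing, roughly as sketched below. This is only an illustration of the idea; the actual measurement code in benchmark.c may differ, and the "./CodeToTest/demo" path is just an example.

//Rough sketch of wall-clock timing printed to six decimal places (not the actual benchmark.c code)
#define _POSIX_C_SOURCE 199309L
#include <stdio.h>
#include <stdlib.h>
#include <time.h>

int main(void)
{
    struct timespec start, end;

    clock_gettime(CLOCK_MONOTONIC, &start);
    system("./CodeToTest/demo");              //run the compiled test program (hypothetical path)
    clock_gettime(CLOCK_MONOTONIC, &end);

    double seconds = (double)(end.tv_sec - start.tv_sec)
                   + (double)(end.tv_nsec - start.tv_nsec) / 1e9;
    printf("%.6f\n", seconds);                //seconds with six decimal places, as in the output files
    return 0;
}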

An Example Following The Steps Above

  1. Clone the repository:
git clone https://github.com/jweeks2023/LQICM-Benchmark.git
  2. Navigate into the folder and check its contents:
cd LQICM-Benchmark
ls

Expected output:

>>> benchmark.c  CodeToTest  LICENSE.md  Outputs  README.md  run.sh
  3. Open benchmark.c in a text editor and change the parameters:
//In the benchmark.c file

#define ITERATIONS      20                      //number of times each file is benchmarked (default: 20)
#define OUTPUTTYPE      2                       //sets how results are output; 0 = plain, 1 = table, 2 = CSV format (default: 2)
#define OUTPUTTOGETHER  true                    //specifies whether output is in separate files or in one big file; bulk output is CSV only (default: true)
#define COMPILER        "clang-15"              //specifies what compiler version you're using (default: "clang-15")
#define COMPILERPATH    "../../../usr/bin/"     //filepath of the compiler defined above (default: "../../../usr/bin/")
#define INPUTFOLDER     "./CodeToTest/"         //filepath for folder containing files to test (default: "./CodeToTest/")
#define OUTPUTFOLDER    "./Outputs/"            //filepath for folder containing results of the benchmark (default: "./Outputs/")
#define OPTLEVEL        "ALL"                   //the optimization level applied to the code being benchmarked (default: "O0")
#define MAXFILENAME     256                     //the max number of characters in a filename (default: 256)
#define MAXFILEAMN      100                     //the max number of files that can be processed by the benchmark (default: 100)
  4. Compile benchmark.c:
clang-15 benchmark.c -o benchmark -lm
  5. Run the executable:
./benchmark

Expected Output from Console:

>>> Running benchmark for demo.c with Opt Level O0...Done!✓
>>> Building summary...Done!✓
>>> Check the "Outputs" folder for results.

Example of expected output in a file:

When OUTPUTTYPE is 0:

//In the output file

Benchmark Results of demo.c
Iteration Runtimes:
0.455837
0.440500
0.447806
0.445283
0.446175
0.444677
0.448382
0.441302
0.442823
0.456849
0.441625
0.450338
0.438839
0.453270
0.447044
0.451858
0.461983
0.462734
0.449385
0.441156
Mean runtime: 0.448393
Median runtime: 0.448094

When OUTPUTTYPE is 1:

//In the output file

Benchmark Results of demo.c
Iteration Runtimes:
 ID|Runtime (sec)
---|-------------
  1|0.455837
  2|0.440500
  3|0.447806
  4|0.445283
  5|0.446175
  6|0.444677
  7|0.448382
  8|0.441302
  9|0.442823
 10|0.456849
 11|0.441625
 12|0.450338
 13|0.438839
 14|0.453270
 15|0.447044
 16|0.451858
 17|0.461983
 18|0.462734
 19|0.449385
 20|0.441156
Mean runtime: 0.448393
Median runtime: 0.448094

When OUTPUTTYPE is 2:

//In the output file

demo.c
Iterations, Runtime
1,0.430756
2,0.431765
3,0.437988
4,0.438918
5,0.440099
6,0.440395
7,0.441202
8,0.441451
9,0.442192
10,0.442760
11,0.442773
12,0.444527
13,0.445683
14,0.447284
15,0.447338
16,0.447595
17,0.448961
18,0.450032
19,0.450199
20,0.454781
Mean:, 0.443335
Median:, 0.443650

When OUTPUTTOGETHER is true:

//In the output file

Filename,demo.c,
Opt Level,O0
1,0.524322,
2,0.531562,
3,0.538903,
4,0.550172,
5,0.552873,
6,0.553342,
7,0.557133,
8,0.571227,
9,0.573769,
10,0.576032,
11,0.594302,
12,0.595006,
13,0.609555,
14,0.624552,
15,0.642659,
16,0.649558,
17,0.662863,
18,0.717895,
19,0.723979,
20,0.752221,
Avg Runtime (sec),0.605096,
Median Runtime (sec),0.594654,

Common Bugs

  • sh: 1: [OBJECT OR FILE PATH]: not found - Most likely the compilation of the C file being benchmarked failed. Check the COMPILERPATH parameter to make sure it points to the folder that COMPILER is installed in (a quick standalone check like the sketch at the end of this section can confirm this).

  • File skipped due to errors/warnings - This is most likely an error with your C file. Make sure you can compile and run your code through your compiler before putting it in the CodeToTest folder.

  • Command '[COMPILER]' not found, but can be installed with: - You are referencing a compiler that is not installed and/or does not exist in the path defined by COMPILERPATH. This is common after installing LLVM, since many of its commands must be suffixed with -[VERSION NUMBER]; this is why the default value for COMPILER is clang-15. Verify that a C compiler is installed and that COMPILER and COMPILERPATH reference the correct version and location.
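
If you are unsure whether COMPILERPATH and COMPILER together point at a real executable, a quick standalone check along these lines can confirm it before you re-run the benchmark. This snippet is only an illustration and is not part of benchmark.c.

//Standalone check (not part of benchmark.c) that COMPILERPATH + COMPILER names an executable
#include <stdio.h>
#include <unistd.h>

#define COMPILER      "clang-15"
#define COMPILERPATH  "../../../usr/bin/"

int main(void)
{
    char path[512];
    snprintf(path, sizeof(path), "%s%s", COMPILERPATH, COMPILER);
    if (access(path, X_OK) == 0)
        printf("Found executable compiler at %s\n", path);
    else
        perror(path);    //e.g. "No such file or directory" if the path or version is wrong
    return 0;
}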

References

Moyen, J.-Y. et al. 2017. Loop Quasi-Invariant Chunk Detection. Automated Technology for Verification and Analysis, 91–108.
