
Why is there a large drop in performance between 2.0.5 and 2.0.6? #8019


Closed
1 task done
technoblogy opened this issue Apr 1, 2023 · 13 comments
Assignees
Labels
Area: Performance (issue related to performance problems and improvements)
Type: Regression (result of unforeseen consequences of a previous change)

Comments

@technoblogy

Board

ESP32

Device Description

Adafruit ESP32 Feather

Hardware Configuration

No.

Version

v2.0.7

IDE Name

Arduino IDE

Operating System

macOS 10.13.6

Flash frequency

80 MHz

PSRAM enabled

no

Upload speed

921600

Description

I've compared benchmarks for three application programs running on the ESP32 version of uLisp, my Lisp interpreter for microcontrollers, and there's a big drop in performance between arduino-esp32 versions 2.0.5 and 2.0.6 (the figures are execution times, so shorter times are better):

| Benchmark | 2.0.5  | 2.0.6  | 2.0.7  |
|-----------|--------|--------|--------|
| Tak       | 8.1 s  | 14.2 s | 13.8 s |
| Q2        | 21.2 s | 38.1 s | 37.2 s |
| FFT       | 241 ms | 416 ms | 408 ms |

As you can see from the figures, 2.0.7 is a slight improvement, but it's still about 70% slower than 2.0.5.

Is there a reason for these differences, and is there anything I can do about it?

Sketch

For details of the benchmarks see [Benchmarks](http://www.ulisp.com/show?1EO1).

Debug Message

There is no error.

Other Steps to Reproduce

No response

I have checked existing issues, online documentation and the Troubleshooting Guide

  • I confirm I have checked existing issues, online documentation and Troubleshooting guide.
technoblogy added the Status: Awaiting triage label on Apr 1, 2023
@mrengineer7777
Collaborator

That's interesting. Not sure what your benchmark is doing, but I'm guessing it's processor-bound work. The differences are probably due to updates in the underlying IDF.

@mrengineer7777
Collaborator

It's also possible the compiler optimization flags have changed.
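
One quick way to see what the sketch's own translation units (and, as far as I know, the core sources built alongside them) were actually compiled with is to print the standard GCC predefined macros at startup. This is only a rough check; the precompiled IDF libraries are built separately and won't be reflected:

void setup() {
  Serial.begin(115200);
  delay(1000);
  Serial.println(__VERSION__);                          // compiler version string
#if defined(__OPTIMIZE_SIZE__)
  Serial.println("Sketch compiled with -Os");           // size optimization
#elif defined(__OPTIMIZE__)
  Serial.println("Sketch compiled with -O1/-O2/-O3");   // speed optimization
#else
  Serial.println("Sketch compiled without optimization (-O0)");
#endif
}

void loop() {}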

mrengineer7777 added the Area: Performance and Status: Community help needed labels and removed the Status: Awaiting triage label on Apr 1, 2023
@technoblogy
Author

Is there some way I can check that? I assume it's in the platform.txt file.

@mrengineer7777
Collaborator

mrengineer7777 commented Apr 1, 2023 via email

mrengineer7777 closed this as not planned (won't fix, can't repro, duplicate, stale) on Apr 12, 2023
@technoblogy
Author

Sorry, why is this closed? Is there no plan to investigate what's causing it?

@mrengineer7777
Collaborator

If you can specify which library is causing the performance issue and can provide a simple code example to replicate it, then we can investigate. While interesting, your issue is not a bug and would take a lot of time to research.

@technoblogy
Author

I've tried a pure C implementation of the first benchmark I used for the above comparison:

// Takeuchi function, used here as a pure recursion benchmark
int tak (int x, int y, int z) {
  if (!(y < x)) return z;
  else return tak(tak(x-1, y, z), tak(y-1, z, x), tak(z-1, x, y));
}

The calculation of tak(24, 16, 8) took exactly the same time (170 ms) on 2.0.5, 2.0.6, and 2.0.7.
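
(For anyone wanting to reproduce this, a minimal millis()-based harness along these lines is enough; this is just a sketch, not my exact test code:)

void setup() {
  Serial.begin(9600);
  unsigned long start = millis();
  int r = tak(24, 16, 8);                  // printing r keeps the call from being optimised away
  unsigned long elapsed = millis() - start;
  Serial.print("tak(24,16,8) = "); Serial.println(r);
  Serial.print(elapsed); Serial.println(" ms");
}

void loop() {}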

I'll have to think about what my Lisp interpreter might be doing that's affected by the different versions of the ESP32 core.

@mrengineer7777
Collaborator

Sounds good. FYI I'm not an Espressif employee. I volunteered to help with managing their issues because they are way behind on issues and PRs. I have even fixed some issues by submitting PRs.

@dragoncoder047

> I'll have to think about what my Lisp interpreter might be doing that's affected by the different versions of the ESP32 core.

My guess from a user standpoint is that the Arduino core runs FreeRTOS background tasks, and the uLisp evaluator calls yield() somehow (although I can't figure out where), allowing other non-uLisp work to run in the middle of uLisp execution. The C example clearly doesn't call yield(), so no task switches happen.

@technoblogy if you want to get rid of this issue, I would maybe add a NOYIELD flag, set it inside (time), and then only ever call yield() when the NOYIELD flag is cleared, similar to how testescape() respects the NOESC flag; a rough sketch follows below.
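
Very roughly something like this (just a sketch; the flag name, bit layout, and helper here are my guesses, not the real uLisp source):

#define NOYIELD 1                      // hypothetical bit position for the new flag

volatile uint8_t Flags = 0;            // assuming a global flags byte like the one NOESC uses

void checkyield () {
  // Only hand control to the other FreeRTOS tasks when NOYIELD is clear.
  if (!(Flags & (1 << NOYIELD))) yield();
}

// Inside the implementation of (time ...):
//   Flags |= (1 << NOYIELD);    // suspend yielding while the timed form runs
//   ... evaluate the form ...
//   Flags &= ~(1 << NOYIELD);   // restore normal behaviour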

@technoblogy
Author

technoblogy commented Apr 13, 2023

@dragoncoder047 thanks for the suggestions, but I think I've tracked down what the problem is. My uLisp interpreter periodically calls Serial.read() to allow the user to escape from a runaway program by entering the '~' character.

Here's a test program:

void testescape () {
  if (Serial.read() == '~') { Serial.println("escape!"); for(;;); }
}

void setup() {
  Serial.begin(9600);
  Serial.println();
  Serial.println("Start");
}

void loop() {
  unsigned long start = millis();
  for (int i=0; i<10000; i++) testescape();
  Serial.print(millis()-start); Serial.println("ms");
  for(;;);
}

and here are the timings (the figures are execution times, so shorter times are better):

| Version | 2.0.5 | 2.0.6  | 2.0.7  |
|---------|-------|--------|--------|
| Time    | 71 ms | 133 ms | 133 ms |

So the change in performance between versions of the ESP32 Arduino core could have something to do with the execution of Serial.read().

Changing the testescape() routine to:

void testescape () {
  if (Serial.available() && Serial.read() == '~') { Serial.println("escape!"); for(;;); }
}

makes the execution time for the test program 68 ms on all three versions, i.e. it eliminates the difference.

Apologies for reporting this as a more general scare before I had fully tracked down what was causing it.

@mrengineer7777
Collaborator

I wonder if it's due to this PR? #7525

mrengineer7777 added the Type: Regression and Status: Needs investigation labels and removed the Status: Community help needed label on Apr 13, 2023
@mrengineer7777
Collaborator

@SuGlider How hard would it be to fix this issue?

Parsaabasi removed the Status: Needs investigation label on Jan 16, 2025
@Parsaabasi

Hello,

Due to the overwhelming volume of issues currently being addressed, we have decided to close the previously received tickets. If you still require assistance or if the issue persists, please don't hesitate to reopen the ticket.

Thanks.
