Empty cache #491

Closed
wants to merge 15 commits into from
Changes from 8 commits
17 changes: 13 additions & 4 deletions executorlib/interactive/shared.py
@@ -632,7 +632,16 @@ def _execute_task_with_cache(
data_dict["output"] = future.result()
dump(file_name=file_name, data_dict=data_dict)
else:
_, result = get_output(file_name=file_name)
future = task_dict["future"]
future.set_result(result)
future_queue.task_done()
exe_flag, result = get_output(file_name=file_name)
if exe_flag:
future = task_dict["future"]
future.set_result(result)
future_queue.task_done()
else:
_execute_task(
interface=interface,
task_dict=task_dict,
future_queue=future_queue,
)
data_dict["output"] = future.result()
dump(file_name=file_name, data_dict=data_dict)
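
For context, the reworked branch assumes `get_output` returns a two-element tuple: a flag indicating whether the cached execution actually finished, plus the stored result. The real `get_output`/`dump` pair lives elsewhere in executorlib; the pickle-based stand-in below is only a hypothetical sketch of that assumed contract, not the library's implementation.

```python
import os
import pickle


def get_output_sketch(file_name: str):
    """Hypothetical stand-in for executorlib's get_output().

    Returns (True, result) when the cache file holds a finished result,
    and (False, None) when the file exists but no output was stored yet.
    """
    if not os.path.exists(file_name):
        return False, None
    with open(file_name, "rb") as f:
        data_dict = pickle.load(f)
    if "output" in data_dict:
        return True, data_dict["output"]
    return False, None
```

With this contract, the `exe_flag` gate above falls back to re-executing the task whenever a cache file exists but does not yet contain a usable result.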
Comment on lines +635 to +647

⚠️ Potential issue

Add error handling for cache operations and implement atomic file operations.

The cache validation logic needs improvement in several areas:

  1. File operations (dump, get_output) should have proper error handling
  2. Cache file creation/updates should be atomic to prevent race conditions
  3. Invalid cache files should be cleaned up to prevent disk space issues

Consider implementing these improvements:

-        exe_flag, result = get_output(file_name=file_name)
-        if exe_flag:
-            future = task_dict["future"]
-            future.set_result(result)
-            future_queue.task_done()
-        else:
-            _execute_task(
-                interface=interface,
-                task_dict=task_dict,
-                future_queue=future_queue,
-            )
-            data_dict["output"] = future.result()
-            dump(file_name=file_name, data_dict=data_dict)
+        try:
+            exe_flag, result = get_output(file_name=file_name)
+            if exe_flag:
+                future = task_dict["future"]
+                future.set_result(result)
+                future_queue.task_done()
+            else:
+                # Remove invalid cache file
+                os.remove(file_name)
+                _execute_task(
+                    interface=interface,
+                    task_dict=task_dict,
+                    future_queue=future_queue,
+                )
+                data_dict["output"] = future.result()
+                # Use temporary file for atomic write
+                temp_file = file_name + '.tmp'
+                dump(file_name=temp_file, data_dict=data_dict)
+                os.replace(temp_file, file_name)
+        except Exception as e:
+            future.set_exception(e)
+            future_queue.task_done()
+            raise
📝 Committable suggestion

‼️ IMPORTANT
Carefully review the code before committing. Ensure that it accurately replaces the highlighted code, contains no missing lines, and has no issues with indentation. Thoroughly test & benchmark the code to ensure it meets the requirements.
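
A side note on the atomic-write recommendation above: writing to a temporary file in the same directory and then calling `os.replace` means a concurrent reader never observes a half-written cache file, because the rename is a single atomic step on the same filesystem. The helper below is a minimal standalone sketch of that idiom; the name and the pickle serialization are illustrative, not executorlib's actual `dump`.

```python
import os
import pickle
import tempfile


def atomic_dump(file_name: str, data_dict: dict) -> None:
    """Write data_dict to file_name so readers never see a partial file."""
    dir_name = os.path.dirname(os.path.abspath(file_name))
    # Create the temporary file next to the target so os.replace stays
    # a same-filesystem (and therefore atomic) rename.
    fd, tmp_path = tempfile.mkstemp(dir=dir_name, suffix=".tmp")
    try:
        with os.fdopen(fd, "wb") as f:
            pickle.dump(data_dict, f)
        os.replace(tmp_path, file_name)
    except BaseException:
        # Clean up the temporary file if anything went wrong.
        if os.path.exists(tmp_path):
            os.remove(tmp_path)
        raise
```

Keeping the temporary file in the cache directory (rather than the system temp dir) is what preserves the atomicity guarantee, since a cross-filesystem rename would fall back to copy-and-delete.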
