
[BUG] 400 Bad Request when working with a larger BOM #2568

Closed
1 of 2 tasks
alexisvl opened this issue Jan 24, 2022 · 8 comments · Fixed by #2603
Labels
api Relates to the API bug Identifies a bug which needs to be addressed import / export Data importing, exporting and processing
Milestone

Comments

@alexisvl

Describe the bug

"Bad Request (400)" when trying to upload a BOM with 100 lines. (At least, I think it's size-related).

Steps to Reproduce

Steps to reproduce the behavior:

  1. Upload a large-ish (100 lines) BOM from CSV
  2. Make part selections
  3. Submit Selections

I can share the CSV, but I'm not sure how much use it would be without the matching part database.

Expected behavior

The BOM is uploaded.

Instead, I get a "Bad Request (400)".

If the server is run in the foreground, I get

inventree-server    | The number of GET/POST parameters exceeded settings.DATA_UPLOAD_MAX_NUMBER_FIELDS.
inventree-server    | Bad Request: /part/42/bom-upload

It feels like the fix is obvious (DATA_UPLOAD_MAX_NUMBER_FIELDS needs to be increased), but I don't even know where to find it. I'm not a sysadmin; I just want to track inventory.

Frankly, I think this is two bugs: no context is delivered with this 400 Bad Request, and no errors are logged to the admin interface. Without running the server in the foreground there is zero feedback. A large chunk of error-reporting tooling seems to be missing.

Deployment Method

  • Docker
  • Bare Metal

Version Information

InvenTree-Version: 0.5.4
Django Version: 3.2.5
Commit Hash: 66a08fe
Commit Date: 2021-11-03
Database: postgresql
Debug-Mode: False
Deployed using Docker: True

@alexisvl alexisvl added bug Identifies a bug which needs to be addressed question This is a question labels Jan 24, 2022
@github-actions
Contributor

Welcome to InvenTree! Please check the contributing docs on how to help.
If you experience setup / install issues please read all install docs.

@Zontex

Zontex commented Jan 24, 2022

Following, I'm having the same issue on a bare metal installation with a 142-line BOM.

@matmair
Member

matmair commented Jan 24, 2022

Hi there, this is loosely connected to #2331 and is being worked on. If you want to increase the number of possible fields, increase the setting. This might open you up to an increased risk of DDoS attacks, so please do not do this if you expose your instance to the web directly.

@matmair
Member

matmair commented Jan 24, 2022

@alexisvl as this is a security concern, the error response is kept as general as possible. The message would appear in the logs if the verbosity level of the logs is set accordingly.
You can add the desired value to InvenTree/settings.py with the key DATA_UPLOAD_MAX_NUMBER_FIELDS, set to however many fields you would like to allow.

Some background for this setting is provided here:
https://docs.djangoproject.com/en/4.0/ref/settings/#data-upload-max-number-fields
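For anyone who wants to apply the override, a minimal sketch of what it could look like in the settings module; the exact file location depends on the deployment, and 10000 is only an illustrative value, not a recommendation:

# InvenTree/settings.py
# Django's default limit is 1000 form fields per request; raising it lets
# larger BOM upload forms through, at the cost of a larger attack surface
# for malicious requests with huge numbers of fields.
DATA_UPLOAD_MAX_NUMBER_FIELDS = 10000  # illustrative value only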

@matmair matmair added api Relates to the API import / export Data importing, exporting and processing and removed question This is a question labels Jan 24, 2022
@matmair matmair added this to the 0.6.0 milestone Jan 24, 2022
@eeintech
Contributor

@matmair Can we make it a higher number by default?

@matmair
Member

matmair commented Jan 24, 2022

@eeintech I do not recommend that, due to the security risk of accepting that many multipart form fields. I am refactoring right now; it is a bit trickier than I thought, and I might start over.
Users that cannot wait a week can adjust the setting themselves; I will try to get the proper new code into 0.6.0.

@Zontex

Zontex commented Jan 24, 2022

@eeintech I do not recommend that, due to the security risk of accepting that many multipart form fields. I am refactoring right now; it is a bit trickier than I thought, and I might start over. Users that cannot wait a week can adjust the setting themselves; I will try to get the proper new code into 0.6.0.

Thank you for your contribution. I figured a quicker way would be to write a Python script that pushes the BOM lines one by one through the API; that approach has no field limit and works pretty well.
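
For context, a rough sketch of that kind of workaround, assuming token authentication, a /api/bom/ endpoint accepting part, sub_part and quantity fields, and made-up CSV column names; check the API schema of your InvenTree version before relying on any of these details:

import csv
import requests

API_URL = "http://localhost:8000/api/bom/"  # assumed BOM item endpoint
TOKEN = "your-api-token"                    # placeholder credential
ASSEMBLY_PK = 42                            # parent assembly primary key

headers = {"Authorization": f"Token {TOKEN}"}

with open("bom.csv", newline="") as f:
    for row in csv.DictReader(f):
        # One POST per BOM line, so Django's per-request field limit never applies.
        payload = {
            "part": ASSEMBLY_PK,
            "sub_part": int(row["sub_part_pk"]),  # assumed CSV column name
            "quantity": float(row["quantity"]),   # assumed CSV column name
        }
        requests.post(API_URL, json=payload, headers=headers).raise_for_status()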

@matmair
Member

matmair commented Jan 24, 2022

@Zontex the rewrite is already on the roadmap but glad you found a way to get it going that fast!
