
Favorites export failed #1372
Closed
forvalak opened this issue Nov 15, 2016 · 12 comments

forvalak commented Nov 15, 2016

When I try to export my favorites, only an empty starred*.json file is downloaded.
I use v1.6.1 on SQLite.
I would like to switch to MySQL/MariaDB and keep my favorites.

There are 2 users: one of them has fewer than 50 favorites (the export works there).
The other one has about 5000 favorites, and for that user only an empty file can be downloaded.

Is the export limited to a certain number of favorites?
I also tried to export from the command line; the starred file was empty there as well.

Alkarex (Member) commented Nov 15, 2016

Hello,
There should not be any limit, especially not from the command line. There might be a PHP memory limit though, in which case you should see an error. Could you take a look at your logs?

Alkarex added this to the 1.7.0 milestone Nov 15, 2016
forvalak (Author) commented Nov 15, 2016

Thank you for your reply.
I executed the following command:
/bin/php ./cli/export-zip-for-user.php --user snoop > /share/temp/snoop.zip

FreshRSS exporting ZIP for user “snoop”…
Result: success

After unzipping, there are several JSON files. All of them have content except the starred file, which is empty.
There are no PHP errors at all.
Server version: Apache/2.4.23 (Unix)
PHP version: PHP 7.0.9 (cli)

Alkarex (Member) commented Nov 15, 2016

If the data is not too private, you could send me the problematic SQLite database so I can use it to fix the problem. Of course, I understand if this is not an option.

Could you please check:

php -i | grep memory_limit

The export/import would need refactoring to be more memory-efficient (using streaming instead of in-memory copies).
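
To illustrate the idea, here is a minimal sketch of a streaming export (hypothetical names, not FreshRSS's actual code): each entry is encoded and written out individually, so peak memory stays roughly proportional to one entry instead of to all of them:

<?php
// Minimal streaming-export sketch (hypothetical, not FreshRSS's real code):
// write one JSON-encoded entry at a time instead of building the whole
// document in memory with a single json_encode() call.
function exportStarred($entries, $out) {
    fwrite($out, "[\n");
    $first = true;
    foreach ($entries as $entry) {
        if (!$first) {
            fwrite($out, ",\n"); // separator in exactly one place, only between items
        }
        $first = false;
        fwrite($out, json_encode($entry));
    }
    fwrite($out, "\n]\n");
}

// Example: $entries could just as well be a generator over a database cursor.
exportStarred(array(array('id' => 1), array('id' => 2)), fopen('php://output', 'w'));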

Alkarex added a commit to Alkarex/FreshRSS that referenced this issue Nov 15, 2016: “Avoid large in-memory copies” (FreshRSS#1372)
Alkarex (Member) commented Nov 15, 2016

@forvalak I have made a patch for you in #1373, which greatly reduces the in-memory representations. Could you please give it a try?

The import will probably have to be updated accordingly :-)

Notes for later:

forvalak (Author) commented
Thank you Alkarex!

php -i|grep memory_limit
memory_limit => 512M => 512M

I replaced the files you committed, and the export now works for me.
I got a 34 MB JSON file with all the favorites inside.

The next problem is importing them. When I try it with the CLI:
/opt/Qapache/bin/php ./cli/import-for-user.php --user snoop --filename /share/Temp/freshrss_starred_2016-11-16.json
I get errors and nothing happens:
FreshRSS importing ZIP/OPML/JSON for user “snoop”…
FreshRSS error trying to import a non-JSON file
FreshRSS error during JSON stars import
Result: fail

On import in the GUI I get an error too: “Error during JSON stars import”.
I tried reducing the JSON file size to import the favorites partially, without success.

Alkarex (Member) commented Nov 16, 2016

Thanks for trying, @forvalak
That is a good start :-)
Yes, the native PHP function json_decode() probably fails for something that large. I will have to refactor the import logic, similarly to what I have just done for export.
In the meantime, you can try doubling the memory allocated to PHP.
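
For example (reusing the paths from your earlier command; adjust the value as needed), the limit can be raised for a single CLI run, without editing php.ini, via PHP's -d option:

/opt/Qapache/bin/php -d memory_limit=1024M ./cli/import-for-user.php --user snoop --filename /share/Temp/freshrss_starred_2016-11-16.json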

Alkarex (Member) commented Nov 16, 2016

P.S.: Could you please check with another tool that the generated JSON is valid?
E.g. using https://stedolan.github.io/jq/

cat freshrss_starred_2016-11-16.json | jq .
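
jq will report a parse error with the offending line and column if the file is invalid. To run the same check without echoing the whole 34 MB file back, jq's empty filter can be used:

jq empty freshrss_starred_2016-11-16.json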

forvalak (Author) commented
I parsed the JSON file with jq and found two errors: there was a second comma between items, like below:
{
ITEM
},
, <- the extra comma
{
ITEM
}
After I removed these extra commas and tried the import in the GUI, it worked fine (without increasing memory_limit)!
Thank you for your time!

forvalak (Author) commented
P.S.: I checked all the exported JSON files for the errors I described above and found the same errors in all the feed_*.json files too.

Alkarex (Member) commented Nov 16, 2016

Strange with the comma. Could you please try this additional patch and see whether it solves the problem? 1d5006d
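
For context, a hypothetical illustration of the bug class (not the actual FreshRSS code): a doubled separator typically appears when entries already carry a trailing comma and a second comma is inserted when the pieces are joined:

<?php
// Hypothetical illustration only: every entry gets a trailing comma, and
// implode() then adds a second comma between the chunks, producing ",,".
$entries = array(array('id' => 1), array('id' => 2), array('id' => 3));
$parts = array();
foreach (array_chunk($entries, 2) as $chunk) {
    $json = '';
    foreach ($chunk as $entry) {
        $json .= json_encode($entry) . ','; // comma after every entry, even the last of the chunk
    }
    $parts[] = $json;
}
echo '[' . implode(',', $parts) . ']'; // → [{"id":1},{"id":2},,{"id":3},]
// The fix is to emit the separator in exactly one place, only between items.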

forvalak (Author) commented
This commit solved the comma issue :)

Thank you very much!

Alkarex self-assigned this Nov 17, 2016
Alkarex (Member) commented Nov 17, 2016

Merged in /dev :-)

Alkarex closed this as completed Nov 17, 2016
Alkarex modified the milestones: 1.7.0, 1.6.2 Dec 21, 2016
javerous pushed a commit to javerous/FreshRSS that referenced this issue Jan 20, 2020: “Avoid large in-memory copies” (FreshRSS#1372)