Remove keys.php #18
As the no-key service has been using more than the combined quota of all keys over the last two days, this issue is prioritized. Giving tools and tips for searching keys on the web may be useful. Could show how to contribute, with a short video, using a Google account. |
Such a tool would also be useful for the instance host, as it would allow him to cleanly add a YouTube Data API v3 key; currently this has to be done by hand, paying attention not to screw up. |
Once done, change this StackOverflow answer to propose my no-key service; as it is currently running out of quota, I am not advertising it. Done. |
Note that a fresh instance will display for the no-key service: Related to #19. |
Could make metrics. Have added https://yt.lemnoslife.com/metrics/ for the moment. #19 is a bit blocking this. |
Can test many keys with this Python script, `test_youtube_data_api_v3_keys.py`:

```python
import requests
from tqdm import tqdm

# Assume keys to be unique.
with open('keys.txt') as f:
    keys = f.read().splitlines()

URL = 'https://www.googleapis.com/youtube/v3/channels'
params = {
    'forHandle': '@MrBeast',
    'fields': 'items/id',
}

workingKeys = []
for key in tqdm(keys):
    params['key'] = key
    data = requests.get(URL, params=params).json()
    try:
        # MrBeast's channel id, used as the known-good expected answer.
        if data['items'][0]['id'] == 'UCX6OQ3DkcsbYNE6H8uQQuVA':
            workingKeys.append(key)
    except KeyError:
        pass
    '''
    quotaExceeded = 'quota' in str(data)
    if 'error' not in data or quotaExceeded:
        print(key, quotaExceeded)
    '''

print(workingKeys)
```

The equivalent parallel algorithm, to be faster:

```python
import requests
from tqdm import tqdm
from multiprocessing import Pool

# Assume keys to be unique.
with open('keys.txt') as f:
    keys = f.read().splitlines()

URL = 'https://www.googleapis.com/youtube/v3/channels'

def testYouTubeDataApiV3Key(youTubeDataApiV3Key):
    params = {
        'forHandle': '@MrBeast',
        'fields': 'items/id',
        'key': youTubeDataApiV3Key,
    }
    data = requests.get(URL, params=params).json()
    try:
        if data['items'][0]['id'] == 'UCX6OQ3DkcsbYNE6H8uQQuVA':
            return youTubeDataApiV3Key
    except KeyError:
        return None

if __name__ == '__main__':
    with Pool(10) as p:
        workingKeys = set(tqdm(p.imap(testYouTubeDataApiV3Key, keys), total=len(keys)))
    # `discard` instead of `remove`, as `remove` raises KeyError if every key works.
    workingKeys.discard(None)
    print(workingKeys)
```
|
I set up a test at 9:01 AM UTC+2 (as at 9:00 AM we aren't running out of quota anymore) to test all YouTube Data API v3 keys that have currently exceeded their quota. If all keys pass this test, then maybe we could allow keys having exceeded their quota to be added. However, someone could fill keys with a manually set 0 quota limit... The test at 9:01 AM UTC+2 only returned exceeded quota. Will give it a try at 10:01 AM UTC+2; otherwise should try every minute, and if a key has not passed the test once, then it is definitely useless. Started the every-minute test for all keys at Sat Oct 22 17:34:23 CEST 2022. None of the keys were useful for a single request during 24 hours. |
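The every-minute retest described above can be sketched with a small helper (hypothetical `retest_keys` and `test_key` names; the real test function would issue an API request as in the key-testing scripts earlier in this issue):

```python
import time

def retest_keys(keys, test_key, interval_seconds=60, max_rounds=24 * 60):
    """Retest each key every `interval_seconds` until it passes once, or
    `max_rounds` rounds elapse; a key that never passes is deemed useless."""
    pending = set(keys)
    passed = set()
    for _ in range(max_rounds):
        for key in list(pending):
            if test_key(key):
                pending.discard(key)
                passed.add(key)
        if not pending:
            break
        time.sleep(interval_seconds)
    return passed
```

With `max_rounds=24 * 60`, this matches one test per minute for the 24-hour experiment mentioned above.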
Could advertise the possibility to share a YouTube Data API v3 key when the no-key service is running out of quota. This should be done at this line of code. |
Setting up a notification system for myself in case one or multiple check failures happen may make sense. Adding to metrics the delta of logs since the last retrieval (requiring authentication). Could precise the error on. I added a notification system for each fail for the moment. However, if for some reason, such as not enough disk space, the system becomes unable to write any more logs, my check doesn't take such an absence of additional logs into account. |
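The blind spot mentioned above (no new log lines at all, e.g. when the disk is full) could be covered by checking that the log file keeps growing between checks; a minimal sketch, with hypothetical function and file names:

```python
import os

def log_grew(log_path, state_path):
    """Return True if `log_path` grew since the last call; a False result
    means no new log lines were written, which itself deserves an alert."""
    size = os.path.getsize(log_path)
    try:
        with open(state_path) as f:
            last_size = int(f.read())
    except (FileNotFoundError, ValueError):
        # First run: no recorded size yet, so any size counts as growth.
        last_size = -1
    with open(state_path, 'w') as f:
        f.write(str(size))
    return size > last_size
```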
Check the Apache 2 logs to see if some people shared their API keys by mistake:

```sh
find -name 'yt.lemnoslife.com-ssl--access.log*'
(gunzip -c 'yt.lemnoslife.com-ssl--access.log.*.gz' && cat yt.lemnoslife.com-ssl--access.log{,.1}) | grep AIzaSy | grep -v addKey
```

It is safe to add non-existing files to the command above, as there is a warning on
|
When adding a new key, make sure to make a backup: if there isn't any space left on the device, we lose them all. It just happened... Adding a tool to monitor disk space usage would make sense. https://yt.lemnoslife.com/noKey/videos?part=snippet&id=B-gHb2gPGIs for instance returns `The request is missing a valid API key.`:

```json
{
  "error": {
    "code": 403,
    "message": "The request is missing a valid API key.",
    "errors": [
      {
        "message": "The request is missing a valid API key.",
        "domain": "global",
        "reason": "forbidden"
      }
    ],
    "status": "PERMISSION_DENIED"
  }
}
```
|
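The disk space monitoring tool mentioned above could start as a minimal standard-library sketch (the 10% threshold is an arbitrary assumption, not a measured requirement):

```python
import shutil

def disk_alert(path='/', min_free_fraction=0.10):
    """Return a warning string when free space on `path` drops below
    `min_free_fraction` of total capacity, else None."""
    usage = shutil.disk_usage(path)
    free_fraction = usage.free / usage.total
    if free_fraction < min_free_fraction:
        return f'{path}: only {free_fraction:.1%} free'
    return None
```

Could be run from cron and feed the per-fail notification system mentioned earlier.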
Incident temporarily resolved, as I brought back a set of keys, but haven't restored all keys yet.

```python
import requests

def getURLContent(url):
    return requests.get(url).text

# `keys` is loaded as in the scripts above.
with open('keys.txt') as f:
    keys = f.read().splitlines()

for key in keys:
    print(key)
    url = f'https://yt.lemnoslife.com/addKey.php?key={key}'
    result = getURLContent(url)
    print(result)
```
|
Isn't there a way in PHP to keep a variable alive across user HTTPS requests? That way we wouldn't read and write a file every time we switch from one key to another, and so we wouldn't have faced this problem. |
Note that the disk space seems mostly used by errors in. Example of filled logs (in decreasing file size order):
Moved from. Have to wait for logs to be rotated, then download and use fresh empty files, to see if my modification was a good change. |
From Google account credentials, can one generate a YouTube Data API v3 key from a random project just by using curl? I think that due to 2FA (enabled by default with Google) etc. it isn't worth it. |
May think about re-coding some YouTube Data API v3 features by reverse-engineering the YouTube UI, if we aren't able to face the many requests using quota on the no-key service. |
Could add an email linked to the added key, in case we need to contact the key holder about a future modification in the policy. |
Could use a supervariable persisting from one HTTPS request to the next, or something like that, to avoid reading a file on each request (for instance for counting no-key service keys or the git commit version used), or could at least simplify the file content down to what we really need, like:

```php
$keysCountFile = '/var/www/ytPrivate/keysCount.txt';
$keysCount = file_get_contents($keysCountFile);
```
|
As described in #48, proceeded at 11:40 PM UTC+1 to |
Next time we are really running out of quota advertise with a |
Should add a mechanism to |
At 20:43 I got:
Just after this event, I tested the no-key endpoint on the three instances and everything was working fine. Logging what's wrong could be interesting in case it happens again. |
Once will have |
Could also make the web server log search for YouTube Data API v3 keys be executed on private instances, as all of their users don't seem to be comfortable with this subject. |
Should clean up the inter-instance key synchronization and other instance synchronization; otherwise, disabling the ability for anyone to provide a key seems to make sense. |
https://web.archive.org/web/20160828004328/https://developers.google.com/youtube/v3/getting-started https://web.archive.org/web/20160404033352/https://developers.google.com/youtube/v3/getting-started is the most recent snapshot up to April 20, 2016, but does not mention how much quota is provided by default. |
Does the API Explorer provide unlimited quota?

```sh
curl -s "https://content-youtube.googleapis.com/youtube/v3/search?part=snippet&q=test&key=AIzaSyBXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX"
```

Output:

```json
{
  "error": {
    "code": 403,
    "message": "Requests from referer \u003cempty\u003e are blocked.",
    "errors": [
      {
        "message": "Requests from referer \u003cempty\u003e are blocked.",
        "domain": "global",
        "reason": "forbidden"
      }
    ],
    "status": "PERMISSION_DENIED",
    "details": [
      {
        "@type": "type.googleapis.com/google.rpc.ErrorInfo",
        "reason": "API_KEY_HTTP_REFERRER_BLOCKED",
        "domain": "googleapis.com",
        "metadata": {
          "consumer": "projects/292824132082",
          "service": "youtube.googleapis.com"
        }
      }
    ]
  }
}
```

`minimizeCURL curl.sh 'youtube#searchResult'` Output:
https://console.cloud.google.com/apis/api/youtube.googleapis.com/quotas?project=my-project-XXXXXXXXXXXXX is not updated in real time, so let us make as many requests as possible and count them. |
Maybe it expires quickly, but thanks to web scraping one can easily recreate one. |
```sh
counter=0
while true
do
    echo "counter: $counter"
    curl -s "https://content-youtube.googleapis.com/youtube/v3/search?part=snippet&q=$counter&key=AIzaSyBXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX" -H 'X-Origin: https://explorer.apis.google.com' | jq '.items | length'
    ((counter++))
done
```

This leads to a counter of more than a few hundred, while the returned length is still the default 5. Same with https://www.googleapis.com/youtube/v3/search. If necessary, could also investigate OAuth, and maybe use an account for each of these 4 cases (OAuth/key and URL) because of the quota display delay. |
As we may add a form in the future to enable people to share their YouTube Data API v3 developer keys, this webpage could be used for this, even if a short advertisement for it could be added to index.php. Should proceed with #17 before proceeding with this issue, as adding keys may not be necessary with current quota usage.