Project dependencies may have API risk issues #99

Open
PyDeps opened this issue Oct 26, 2022 · 0 comments
PyDeps commented Oct 26, 2022

Hi, in MillionHeroAssistant, inappropriate dependency version constraints can introduce risks.

Below are the dependencies and version constraints that the project is currently using:

--index-url https://pypi.tuna.tsinghua.edu.cn/simple
beautifulsoup4==4.6.0
certifi==2017.11.5
chardet==3.0.4
future==0.16.0
idna==2.6
jieba==0.39
lxml==4.1.1
macholib==1.9
pefile==2017.11.5
Pillow==5.0.0
PyYAML==3.12
requests==2.18.4
scikit-image==0.13.1
scipy==1.0.0
selenium==3.8.1
six==1.11.0
urllib3==1.22
numpy==1.14.0
baidu-aip==2.0.0.1

The version constraint == introduces a risk of dependency conflicts, because pinning each dependency to a single version leaves no room for other packages in the same environment that need a different one.
A constraint with no upper bound (or *) introduces a risk of missing-API errors, because the latest version of a dependency may remove APIs that the project calls.
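For illustration, here is a minimal sketch of how the two constraint styles behave. It uses the third-party packaging library, which is an assumption for this example only and not a dependency of the project:

```python
# Minimal sketch (assumes the third-party `packaging` library is installed;
# it is not a dependency of this project).
from packaging.specifiers import SpecifierSet

pinned = SpecifierSet("==2.18.4")             # exact pin, as in requirements.txt
bounded = SpecifierSet(">=2.18.4,<=2.24.0")   # bounded range

print("2.18.4" in pinned)    # True  - only this exact version satisfies the pin
print("2.24.0" in pinned)    # False - any other version required elsewhere conflicts
print("2.24.0" in bounded)   # True  - newer versions inside the bound are accepted
print("2.28.1" in bounded)   # False - versions above the bound are rejected,
                             #         guarding against removed APIs
```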

After further analysis of this project, the following constraint changes are suggested (several candidate ranges are listed for some packages):
The version constraint of dependency beautifulsoup4 can be changed to >=4.10.0,<=4.11.1.
The version constraint of dependency jieba can be changed to >=0.36,<=0.36.2.
The version constraint of dependency Pillow can be changed to ==9.2.0.
The version constraint of dependency Pillow can be changed to >=2.0.0,<=9.1.1.
The version constraint of dependency requests can be changed to >=0.2.1,<=0.2.3.
The version constraint of dependency requests can be changed to >=0.7.0,<=2.24.0.
The version constraint of dependency requests can be changed to ==2.26.0.
The version constraint of dependency scikit-image can be changed to >=0.8.0,<=0.19.3.
The version constraint of dependency baidu-aip can be changed to >=1.6.2.0,<=4.16.1.

The above suggestions reduce the risk of dependency conflicts as far as possible while still allowing the newest versions that do not break any call in the project; one possible revised requirements.txt snippet is sketched below.
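As one possible way to apply these suggestions (picking one of the candidate ranges where several are listed above; the exact choice is up to the maintainers), the affected lines of requirements.txt could look like this:

```
--index-url https://pypi.tuna.tsinghua.edu.cn/simple
beautifulsoup4>=4.10.0,<=4.11.1
jieba>=0.36,<=0.36.2
Pillow>=2.0.0,<=9.1.1
requests>=0.7.0,<=2.24.0
scikit-image>=0.8.0,<=0.19.3
baidu-aip>=1.6.2.0,<=4.16.1
```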

The current project invokes all of the following methods; a post-upgrade smoke-test sketch based on this list follows the full listing.

Calling methods from beautifulsoup4:
bs4.BeautifulSoup
Calling methods from jieba:
jieba.enable_parallel
jieba.initialize
jieba.posseg.cut
jieba.load_userdict
Calling methods from Pillow:
PIL.Image.open
Calling methods from requests:
requests.post
requests.get
Calling methods from scikit-image:
skimage.morphology.remove_small_objects
Calling methods from baidu-aip:
aip.AipOcr.basicAccurate
aip.AipOcr
aip.AipOcr.basicGeneral
All calling methods used in the project:
logging.getLogger.addHandler
json.loads
core.crawler.crawl.kwquery
aip.AipOcr.basicGeneral
core.crawler.html_tools.get_html_bingwd
threading.Thread
os.system
requests.get
results.find.find.get_text
utils.backup.upload_to_cloud
get_rid_of_x
results.find.find_all.find_all
core.crawler.crawl.jieba_initialize
final.append
parse_args
prompt_message
get_text_from_image
operator.itemgetter
resp.text.count.values
platform.system
list
multiprocessing.Event.set
open
jieba.initialize
sys.exit
platform.system.upper
soup_baidu
keyword_queue.get
urllib.parse.quote
jieba.posseg.cut
sorted_lists2.append
requests.post
open.close
soup_bing.find.get_text
core.android.get_adb_tool
utils.backup.save_question_answers_to_file
image.convert.convert
baidu_count
zip
core.crawler.html_tools.get_html_bingwd.find
int
headers.url.requests.get.content.decode
noticer.clear
queue.Queue
region.resize.resize
bl.get_text
soup_bing.find.find
core.crawler.html_tools.get_html_baidu
results.find.find
join.replace
datetime.datetime.today
baker.write
utils.stdout_template.BAIDU_TPL.format
selenium.webdriver.ChromeOptions.add_argument
core.crawler.html_tools.get_html_zhidao.find
logging.handlers.WatchedFileHandler
image.convert.crop
data.append
selenium.webdriver.Chrome
join.split
core.android.analyze_current_screen_text
writer.send
aip.AipOcr
s.extract
soup_zhidao
datetime.datetime.now.strftime.join
multiprocessing.Process
multiprocessing.Event.clear
subprocess.Popen
best.span.get_text
noticer.wait
datetime.datetime.now
aip.AipOcr.basicAccurate
results.attrs.__contains__
kwquery
process.stdout.read
resp.text.count
time.time
len
utils.stdout_template.KNOWLEDGE_TPL.format
r.get_text.strip.replace.replace
bl.find.find
range
r.get_text.replace.strip.get_text
question.replace.replace
os.path.join
self.read
core.crawler.html_tools.get_html_baike
bl.find
logging.handlers.WatchedFileHandler.setFormatter
exchage_queue.get
str
logging.getLogger
sentences.append
question.split
r.get_text.replace.strip.find
sorted
skimage.morphology.remove_small_objects
argparse.ArgumentParser
functools.partial
fp.read
core.crawler.html_tools.get_html_sougo.find_all
multiprocessing.Process.start
noticer.is_set
reader.read
datetime.datetime.now.strftime
r.get_text.strip.replace
e.strip
self.__message_queue.get
get_adb_tool
aip.AipOcr.setConnectionTimeoutInMillis
soup_bing.find.find_all
core.check_words.parse_false
random.choice
answer.append
core.crawler.html_tools.get_html_zhidao
capture_screen
r.get_text.replace
core.crawler.pmi.baidu_count
browser_init.find_element_by_id
browser_init.quit
argparse.ArgumentParser.parse_args
chr
self.__message_queue.put
results.find.find.get_text.__contains__
parse_answer_area
all
jieba.load_userdict
bingbaike.find_all.find
outputqueue.put
keywords.append
format
r.get_text.strip
os.path.splitext
bs4.BeautifulSoup
obj_dtec_img.np.where.max
numpy.sum
browser_search
core.crawler.text_process.postag
core.crawler.html_tools.get_html_baidu.find
input
closer.is_set
qa_li.items
os.path.isfile
sougou_count
bingwd_soup.find.find
map
multiprocessing.Event
real_question.replace.replace
writer.write
results.attrs.replace
textwrap.wrap
argparse.ArgumentParser.add_argument
soup_bingwd
reversed
line.strip.split
r.get_text.replace.strip
bl.get_text.__contains__
jieba.enable_parallel
region.resize.save
save_shot_filename.Image.open.load
k.flag.__contains__
a.rsplit
reader.close
capture_screen_v2
soup_baike
obj_dtec_img.np.where.min
browser_init.get
enumerate
writer.close
browser.find_element_by_id.clear
__inner_job
final_none.append
yaml.safe_load
resp.text.index
element.find_all
threading.Thread.start
print
best.find
resp.text.index.items
keyword_exchange.recv
browser.find_element_by_id.send_keys
binary_screenshot.replace.replace
platform.system.upper.startswith
time.sleep
utils.process_stdout.ProcessStdout
join
PIL.Image.open
core.crawler.html_tools.get_html_baike.find
logging.Formatter
multiprocessing.Pipe
browser_init
answers.split.split
check_screenshot
zhidao_soup.find.find
line.strip.strip
core.crawler.html_tools.get_html_sougo
core.android.save_screen
utils.backup.get_qa_list
open.readlines
shutil.copyfile
main
parse_question_and_answer
core.crawler.pmi.baidu_count.items
os.remove
logging.getLogger.error
multiprocessing.cpu_count
core.crawler.html_tools.get_html_bing.find
text.count
stdoutpipe.put
numpy.where
multiprocessing.freeze_support
resp.text.count.items
core.crawler.html_tools.get_html_bing
selenium.webdriver.ChromeOptions
target_list.items
results.find.find_all
get_area_data
core.android.check_screenshot
numpy.array
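Since the suggested ranges are only safe if the APIs listed above still resolve, a hypothetical post-upgrade smoke test could iterate over module/attribute pairs taken from that list (the pairs and the helper below are illustrative, not part of the project):

```python
# Hypothetical smoke test: confirm that the attributes this project calls
# (taken from the list above) still exist in the installed dependency versions.
import importlib

CALLED_APIS = [
    ("bs4", "BeautifulSoup"),
    ("jieba", "enable_parallel"),
    ("jieba", "load_userdict"),
    ("PIL.Image", "open"),
    ("requests", "get"),
    ("requests", "post"),
    ("skimage.morphology", "remove_small_objects"),
    ("aip", "AipOcr"),
]

def check_apis(apis):
    """Return the modules or attributes that fail to import or resolve."""
    missing = []
    for module_name, attr in apis:
        try:
            module = importlib.import_module(module_name)
        except ImportError:
            missing.append(module_name)
            continue
        if not hasattr(module, attr):
            missing.append(f"{module_name}.{attr}")
    return missing

if __name__ == "__main__":
    missing = check_apis(CALLED_APIS)
    if missing:
        print("Missing after upgrade:", ", ".join(missing))
    else:
        print("All listed APIs resolve with the installed versions.")
```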

@developer
Could you please help me check this issue?
May I open a pull request to fix it?
Thank you very much.
