
Unexpected Error #16

Closed
edzob opened this issue Aug 6, 2024 · 4 comments

Labels
bug Something isn't working

Comments

@edzob

edzob commented Aug 6, 2024

Not certain what this is; it happened and the tool said to report it :)

UNEXPECTED ERROR:
| Uncaught exception when checking URL resolution
| Entry = dietz_enterprise_2015
| URL = https://dspace.cvut.cz/bitstream/10467/62442/1/Technick%c3%a1%20zpr%c3%a1va_Dietz_Hoogervorst_2015.pdf
|
| As a result, this URL will NOT be added to the entry
|
| Traceback (most recent call last):
| File "/root/.local/share/pipx/venvs/bibtexautocomplete/lib/python3.12/site-packages/bibtexautocomplete/bibtex/fields.py", line 136, in slow_check
| return checker.query() is not None
| ^^^^^^^^^^^^^^^
| File "/root/.local/share/pipx/venvs/bibtexautocomplete/lib/python3.12/site-packages/bibtexautocomplete/lookups/abstract_base.py", line 114, in query
| return super().query()
| ^^^^^^^^^^^^^^^
| File "/root/.local/share/pipx/venvs/bibtexautocomplete/lib/python3.12/site-packages/bibtexautocomplete/lookups/abstract_base.py", line 92, in query
| data = self.get_data()
| ^^^^^^^^^^^^^^^
| File "/root/.local/share/pipx/venvs/bibtexautocomplete/lib/python3.12/site-packages/bibtexautocomplete/lookups/https.py", line 234, in get_data
| data = super().get_data()
| ^^^^^^^^^^^^^^^^^^
| File "/root/.local/share/pipx/venvs/bibtexautocomplete/lib/python3.12/site-packages/bibtexautocomplete/lookups/https.py", line 150, in get_data
| connection.request(
| File "/usr/lib/python3.12/http/client.py", line 1336, in request
| self._send_request(method, url, body, headers, encode_chunked)
| File "/usr/lib/python3.12/http/client.py", line 1347, in _send_request
| self.putrequest(method, url, **skips)
| File "/usr/lib/python3.12/http/client.py", line 1181, in putrequest
| self._validate_path(url)
| File "/usr/lib/python3.12/http/client.py", line 1281, in _validate_path
| raise InvalidURL(f"URL can't contain control characters. {url!r} "
| http.client.InvalidURL: URL can't contain control characters. '/bitstream/handle/10467/62442/Technická zpráva_Dietz_Hoogervorst_2015.pdf;jsessionid=E20F76D164A50204529D5BE73B7A4447?sequence=1' (found at least ' ')
|
| You can report this bug at https://github.com/dlesbre/bibtex-autocomplete/issues
|

@article{dietz_enterprise_2015,
  title = {Enterprise Engineering theories Introduction and overview},
  abstract = {In order to illustrate the basic idea of the Ciao! Network concerning the development of the discipline of enterprise engineering ({EE}), the tree metaphor is presented. The roots of the tree are the theories, the trunk contains the methodologies built on these roots, and the leafs and flowers stand for the flourishing enterprises that are achieved by applying the methodologies. The common theoretical basis for establishing {EE}, is the Ciao! paradigm that has its origins in the communication-centric view on information systems (engineering) that emerged around 2000. It replaces the information-centric view, which increasingly fails to support the theory and practice of information systems engineering effectively. The Ciao! paradigm provides a coherent and integrated understanding of these four core notions: communication, information, action, and organisation. After the discussion of the paradigm, the current seven {EE} theories are discussed briefly, after having been ordered in an appropriate classification scheme.},
  pages = {1--16},
  journaltitle = {unknown journal},
  author = {Dietz, Jan L.G. and Hoogervorst, Jan A.P.},
  date = {2015-01},
  file = {Dietz and Hoogervorst - 2015 - Enterprise Engineering theories Introduction and o.pdf:/home/edzob/Zotero/storage/XN4QBRTH/Dietz and Hoogervorst - 2015 - Enterprise Engineering theories Introduction and o.pdf:application/pdf},
}
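For context: the crash comes from Python's `http.client`, which refuses to send a request whose path contains spaces or control characters. The server here answers with a redirect whose Location path is not percent-encoded, and that raw path then reaches `connection.request(...)`. Below is a minimal sketch of one plausible workaround, assuming the path only needs percent-encoding before the follow-up request; `sanitize_path` is a hypothetical helper, not bibtexautocomplete's actual fix.

```python
from urllib.parse import quote, urlsplit, urlunsplit

def sanitize_path(url: str) -> str:
    """Percent-encode the path and query of a (possibly relative) URL."""
    parts = urlsplit(url)
    # Keeping '%' in `safe` makes this idempotent: already-encoded
    # sequences like %20 pass through unchanged.
    path = quote(parts.path, safe="/;=%")
    query = quote(parts.query, safe="=&%")
    return urlunsplit((parts.scheme, parts.netloc, path, query, parts.fragment))

# The offending path from the traceback (shortened), with a literal space
# and a non-ASCII character:
raw = "/bitstream/handle/10467/62442/Technická zpráva.pdf;jsessionid=E20F?sequence=1"
print(sanitize_path(raw))
# -> /bitstream/handle/10467/62442/Technick%C3%A1%20zpr%C3%A1va.pdf;jsessionid=E20F?sequence=1
```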

@edzob
Author

edzob commented Aug 6, 2024

Another error; I'm not certain if this is related.
It might be a new issue.

collaboration_estimating_2015: UNEXPECTED ERROR:
| Uncaught exception when trying to autocomplete entry
| Entry = collaboration_estimating_2015
| Website = s2
|
| Traceback (most recent call last):
| File "/root/.local/share/pipx/venvs/bibtexautocomplete/lib/python3.12/site-packages/bibtexautocomplete/core/threads.py", line 67, in run
| result = lookup.query()
| ^^^^^^^^^^^^^^
| File "/root/.local/share/pipx/venvs/bibtexautocomplete/lib/python3.12/site-packages/bibtexautocomplete/lookups/multiple_mixin.py", line 35, in query
| value = super().query()
| ^^^^^^^^^^^^^^^
| File "/root/.local/share/pipx/venvs/bibtexautocomplete/lib/python3.12/site-packages/bibtexautocomplete/lookups/abstract_base.py", line 94, in query
| return self.process_data(data)
| ^^^^^^^^^^^^^^^^^^^^^^^
| File "/root/.local/share/pipx/venvs/bibtexautocomplete/lib/python3.12/site-packages/bibtexautocomplete/lookups/search_mixin.py", line 91, in process_data
| score = self.match_score(entry, res)
| ^^^^^^^^^^^^^^^^^^^^^^^^^^^^
| File "/root/.local/share/pipx/venvs/bibtexautocomplete/lib/python3.12/site-packages/bibtexautocomplete/lookups/search_mixin.py", line 103, in match_score
| return self.entry.matches(entry)
| ^^^^^^^^^^^^^^^^^^^^^^^^^
| File "/root/.local/share/pipx/venvs/bibtexautocomplete/lib/python3.12/site-packages/bibtexautocomplete/bibtex/entry.py", line 242, in matches
| score = self.get_field(field).matches(other.get_field(field))
| ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
| File "/root/.local/share/pipx/venvs/bibtexautocomplete/lib/python3.12/site-packages/bibtexautocomplete/bibtex/base_field.py", line 97, in matches
| return self.match_values(self.value, other.value)
| ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
| File "/root/.local/share/pipx/venvs/bibtexautocomplete/lib/python3.12/site-packages/bibtexautocomplete/bibtex/base_field.py", line 243, in match_values
| return cls.match_values_fast(a, b)
| ^^^^^^^^^^^^^^^^^^^^^^^^^^^
| File "/root/.local/share/pipx/venvs/bibtexautocomplete/lib/python3.12/site-packages/bibtexautocomplete/bibtex/base_field.py", line 291, in match_values_fast
| set_b = set(b)
| ^^^^^^
| TypeError: unhashable type: 'Author'
|
| You can report this bug at https://github.com/dlesbre/bibtex-autocomplete/issues
|

@article{collaboration_estimating_2015,
  title = {Estimating the reproducibility of psychological science},
  volume = {349},
  issn = {0036-8075},
  url = {http://eprints.lse.ac.uk/65159/1/__lse.ac.uk_storage_LIBRARY_Secondary_libfile_shared_repository_Content_Kappes%2C%20H_Estimating%20reproducibility_Kappes_Estimating%20the%20reproducibility_2016.pdf},
  doi = {10.1126/science.aac4716},
  abstract = {One of the central goals in any scientific endeavor is to understand causality. Experiments that seek to demonstrate a cause/effect relation most often manipulate the postulated causal factor. Aarts et al. describe the replication of 100 experiments reported in papers published in 2008 in three high-ranking psychology journals. Assessing whether the replication and the original experiment yielded the same result according to several criteria, they find that about one-third to one-half of the original findings were also observed in the replication study.Science, this issue 10.1126/science.aac4716INTRODUCTIONReproducibility is a defining feature of science, but the extent to which it characterizes current research is unknown. Scientific claims should not gain credence because of the status or authority of their originator but by the replicability of their supporting evidence. Even research of exemplary quality may have irreproducible empirical findings because of random or systematic error.{RATIONALE} There is concern about the rate and predictors of reproducibility, but limited evidence. Potentially problematic practices include selective reporting, selective analysis, and insufficient specification of the conditions necessary or sufficient to obtain the results. Direct replication is the attempt to recreate the conditions believed sufficient for obtaining a previously observed finding and is the means of establishing reproducibility of a finding with new data. We conducted a large-scale, collaborative effort to obtain an initial estimate of the reproducibility of psychological science.{RESULTSWe} conducted replications of 100 experimental and correlational studies published in three psychology journals using high-powered designs and original materials when available. There is no single standard for evaluating replication success. Here, we evaluated reproducibility using significance and P values, effect sizes, subjective assessments of replication teams, and meta-analysis of effect sizes. The mean effect size (r) of the replication effects (Mr = 0.197, {SD} = 0.257) was half the magnitude of the mean effect size of the original effects (Mr = 0.403, {SD} = 0.188), representing a substantial decline. Ninety-seven percent of original studies had significant results (P \< .05). Thirty-six percent of replications had significant results; 47\% of original effect sizes were in the 95\% confidence interval of the replication effect size; 39\% of effects were subjectively rated to have replicated the original result; and if no bias in original results is assumed, combining original and replication results left 68\% with statistically significant effects. Correlational tests suggest that replication success was better predicted by the strength of original evidence than by characteristics of the original and replication teams.{CONCLUSIONNo} single indicator sufficiently describes replication success, and the five indicators examined here are not the only ways to evaluate reproducibility. Nonetheless, collectively these results offer a clear conclusion: A large portion of replications produced weaker evidence for the original findings despite using materials provided by the original authors, review in advance for methodological fidelity, and high statistical power to detect the original effect sizes. Moreover, correlational evidence is consistent with the conclusion that variation in the strength of initial evidence (such as original P value) was more predictive of replication success than variation in the characteristics of the teams conducting the research (such as experience and expertise). The latter factors certainly can influence replication success, but they did not appear to do so here.Reproducibility is not well understood because the incentives for individual scientists prioritize novelty over replication. Innovation is the engine of discovery and is vital for a productive, effective scientific enterprise. However, innovative ideas become old news fast. Journal reviewers and editors may dismiss a new test of a published idea as unoriginal. The claim that “we already know this” belies the uncertainty of scientific evidence. Innovation points out paths that are possible; replication points out paths that are likely; progress relies on both. Replication can increase certainty when findings are reproduced and promote innovation when they are not. This project provides accumulating evidence for many findings in psychological research and suggests that there is still more work to do to verify whether we know what we think we know.Original study effect size versus replication effect size (correlation coefficients).Diagonal line represents replication effect size equal to original effect size. Dotted line represents replication effect size of 0. Points below the dotted line were effects in the opposite direction of the original. Density plots are separated by significant (blue) and nonsignificant (red) effects.Reproducibility is a defining feature of science, but the extent to which it characterizes current research is unknown. We conducted replications of 100 experimental and correlational studies published in three psychology journals using high-powered designs and original materials when available. Replication effects were half the magnitude of original effects, representing a substantial decline. Ninety-seven percent of original studies had statistically significant results. Thirty-six percent of replications had statistically significant results; 47\% of original effect sizes were in the 95\% confidence interval of the replication effect size; 39\% of effects were subjectively rated to have replicated the original result; and if no bias in original results is assumed, combining original and replication results left 68\% with statistically significant effects. Correlational tests suggest that replication success was better predicted by the strength of original evidence than by characteristics of the original and replication teams.},
  number = {6251},
  journaltitle = {Science},
  author = {Collaboration, Open Science and Aarts, Alexander A. and Anderson, Joanna E. and Anderson, Christopher J. and Attridge, Peter R. and Attwood, Angela and Axt, Jordan and Babel, Molly and Bahnik, Stepan and Baranski, Erica and Barnett-Cowan, Michael and Bartmess, Elizabeth and Beer, Jennifer and Bell, Raoul and Bentley, Heather and Beyan, Leah and Binion, Grace and Borsboom, Denny and Bosch, Annick and Bosco, Frank A. and Bowman, Sara D. and Brandt, Mark J. and Braswell, Erin and Brohmer, Hilmar and Brown, Benjamin T. and Brown, Kristina and Bruening, Jovita and Calhoun-Sauls, Ann and Callahan, Shannon P. and Chagnon, Elizabeth and Chandler, Jesse and Chartier, Christopher R. and Cheung, Felix and Christopherson, Cody D. and Cillessen, Linda and Clay, Russ and Cleary, Hayley and Cloud, Mark D. and Cohn, Michael and Cohoon, Johanna and Columbus, Simon and Cordes, Andreas and Costantini, Giulio and Hartgerink, Chris and Krijnen, Job and Nuijten, Michele B. and van 't Veer, Anna E. and Van Aert, Robbie and van Assen, M.A.L.M. and Wissink, Joeri and Zeelenberg, Marcel and Rahal, R.M.},
  date = {2015},
  note = {Publisher: American Association for the Advancement of Science ({AAAS})},
  keywords = {Science},
  file = {Full Text:/home/edzob/Zotero/storage/ZFTSHVBX/Collaboration et al. - 2015 - Estimating the reproducibility of psychological sc.pdf:application/pdf},
}
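For context: `set(b)` in `match_values_fast` requires its elements to be hashable, and in Python a class that defines `__eq__` without also defining `__hash__` has its hash set to None. Here is a stand-in reproduction (this `Author` is a hypothetical simplification, not the library's actual class), together with one plausible fix via a frozen dataclass; the actual fix shipped in v1.3.3 may well differ.

```python
from dataclasses import dataclass

@dataclass  # eq=True by default, which sets __hash__ = None
class Author:
    lastname: str
    firstname: str

try:
    set([Author("Dietz", "Jan L.G.")])
except TypeError as err:
    print(err)  # unhashable type: 'Author'

# Making the value type immutable restores hashability, since
# frozen=True generates a __hash__ consistent with __eq__:
@dataclass(frozen=True)
class FrozenAuthor:
    lastname: str
    firstname: str

assert len({FrozenAuthor("Dietz", "J."), FrozenAuthor("Dietz", "J.")}) == 1
```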

dlesbre added the bug label Aug 7, 2024
@dlesbre
Owner

dlesbre commented Aug 7, 2024

Thanks for reporting these! They are indeed two different bugs, but they shouldn't be too hard to fix. I'll look into them.

@edzob
Author

edzob commented Aug 7, 2024

If I can help with test runs or more debugging, let me know.
Love your work!

dlesbre added a commit that referenced this issue Aug 7, 2024
dlesbre added a commit that referenced this issue Aug 7, 2024
@dlesbre
Owner

dlesbre commented Aug 7, 2024

Both of these bugs were fixed in the latest release (v1.3.3).

dlesbre closed this as completed Aug 7, 2024
dlesbre added a commit that referenced this issue Aug 7, 2024