
feat: added contributors page #1122

Closed · wants to merge 2 commits

Conversation

@AceTheCreator (Member) commented Nov 21, 2022

The contributors page implementation.

Link to preview https://deploy-preview-1122--asyncapi-website.netlify.app/community/contributors

@netlify netlify bot commented Nov 21, 2022

Deploy Preview for asyncapi-website ready!

Built without sensitive environment variables

Name                 | Link
🔨 Latest commit      | e80c68b
🔍 Latest deploy log  | https://app.netlify.com/sites/asyncapi-website/deploys/638e0c10283f10000a8649c8
😎 Deploy Preview     | https://deploy-preview-1122--asyncapi-website.netlify.app

@Mayaleeeee (Member)

Hey @AceTheCreator, great work on this! I feel we should add a short introduction on how to become a contributor and get started; what do you think? Also, the spacing between the cards is too small.

Well done.

@derberg (Member) commented Nov 24, 2022

Is it really ready for review?

  • it doesn't match the initial design
  • we really need a backend first. This is not simple data that you can just grab with the GitHub API. There are dedicated archive projects that we should reuse

@AceTheCreator (Member, Author)

> it doesn't match the initial design

I'm waiting on @Mayaleeeee to finalize the design she's working on; as far as I know from the design discussion, she's not done with it yet.

> we really need a backend first. This is not simple data that you can just grab with the GitHub API. There are dedicated archive projects that we should reuse

@derberg I need more context on what you're talking about here.

@Mayaleeeee (Member)

Hey @AceTheCreator

I'll update the design today and send it for review in the afternoon/evening.

@derberg (Member) commented Nov 28, 2022

> @derberg I need more context on what you're talking about here

@AceTheCreator let me answer with a question: what was your plan for getting the data for the contributors list? Where did you want to get the list from?

@AceTheCreator (Member, Author)

> what was your plan for getting the data for the contributors list? Where did you want to get the list from?

I believe with the GitHub API.

@derberg (Member) commented Dec 5, 2022

@AceTheCreator the GitHub API is very "expensive" when it comes to such complicated queries. To list all the contributors, plus the totals of their forks/issues/PRs, you need to make a lot of API calls. We have over 60 repositories, so that is already 60 individual requests just to cover every repo. Even if done with the GraphQL API, the rate limit can be affected a lot.

Of course I might be wrong about the above 🤷🏼 but I suggest you first check what is really possible.

What I know is that for access to such a big data set, people use projects like https://www.gharchive.org/
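
For context on what GH Archive offers: it publishes each hour of public GitHub events as a gzipped JSON-lines file. A minimal sketch of reading one hour of it, assuming Node 18+ (for the global fetch); the URL pattern is the one documented on gharchive.org, and the event fields follow GitHub's public event schema:

```js
// Collect the logins of everyone who opened a PR against an asyncapi/* repo
// during one hour of archived events, e.g. hour = "2022-12-05-15".
const zlib = require("zlib");

async function asyncapiActorsInHour(hour) {
  const res = await fetch(`https://data.gharchive.org/${hour}.json.gz`);
  const gzipped = Buffer.from(await res.arrayBuffer());
  // Hourly archives are tens of MB compressed; holding one in memory is fine
  // for a sketch, but a real job would stream and decompress incrementally.
  const lines = zlib.gunzipSync(gzipped).toString("utf8").split("\n");

  const actors = new Set();
  for (const line of lines) {
    if (!line) continue;
    const event = JSON.parse(line);
    // Keep PR events against asyncapi/* repos; other event types work similarly.
    if (event.type === "PullRequestEvent" && event.repo.name.startsWith("asyncapi/")) {
      actors.add(event.actor.login);
    }
  }
  return actors;
}

asyncapiActorsInHour("2022-12-05-15").then((a) => console.log(a.size, "actors"));
```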

@AceTheCreator (Member, Author)

> https://www.gharchive.org/

Thanks for the recommendation. Let's see how we can use it.

@akshatnema (Member)

> @AceTheCreator the GitHub API is very "expensive" when it comes to such complicated queries.

What if we make simple GitHub API calls inside an array of Promises and then resolve them later in the execution (see the sketch below)? We already know that this computation won't be done at runtime or build time; better, we can make a GitHub Action that runs it on a weekly basis. And requesting the contributors list doesn't require an API key either, so we are good to make as many API calls as we want. Here's the GET request for the website repository: https://api.github.com/repos/asyncapi/website/contributors
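
A minimal sketch of that Promise-array approach, assuming Node 18+ (global fetch) and only unauthenticated calls to the public REST endpoints named in this thread; pagination and rate-limit handling are deliberately left out:

```js
async function getAllContributors() {
  // One call for the repo list (first page only; a real job would paginate).
  const reposRes = await fetch("https://api.github.com/orgs/asyncapi/repos?per_page=100");
  const repos = await reposRes.json();

  // One request per repo, fired together and resolved as a single array.
  const perRepo = await Promise.all(
    repos.map(async (repo) => {
      const res = await fetch(
        `https://api.github.com/repos/asyncapi/${repo.name}/contributors?per_page=100`
      );
      return res.ok ? res.json() : [];
    })
  );

  // Merge and de-duplicate by login, summing per-repo contribution counts.
  const byLogin = new Map();
  for (const c of perRepo.flat()) {
    const seen = byLogin.get(c.login);
    byLogin.set(c.login, seen ? { ...c, contributions: seen.contributions + c.contributions } : c);
  }
  return [...byLogin.values()];
}

getAllContributors().then((list) => console.log(list.length, "unique contributors"));
```

Note that firing all requests at once only changes latency, not the number of requests counted against any quota; whether this stays within the anonymous limit is exactly the question debated below.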

@derberg (Member) commented Dec 5, 2022

@akshatnema take the rate limit into account: 60 requests per hour. There are 63 repos and over 1000 contributors in total. Your GitHub Action would run for at least 18-20 hours to get all the data.

Review comment (Member) on the diff:

    return (
      <GenericLayout
        title="AsyncAPI Ambassador Program"
        description="The home for developer communities"

Suggested change
-        description="The home for developer communities"
+        description="Mentorship and OSS Community"

I don't like only focusing on saying this is all for "developers." Developers are not the only part of our OSS project; there are many kinds of contributions and contributors.

@akshatnema (Member)

> take the rate limit into account: 60 requests per hour. There are 63 repos and over 1000 contributors in total. Your GitHub Action would run for at least 18-20 hours to get all the data.

@derberg But we don't require any API key or account for these requests. We are just making simple GET requests, and there is no limit to that. And we don't even need extra calls to get contributor details: a single API call per repository already returns detailed JSON for its contributors. If we make an array of Promises to fetch each repo's data, it will take at most about 10 seconds to run all the calls; even if we instead make sequential async/await calls per repo, it will hardly take 2-3 minutes to complete the execution.
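
To make the two call patterns being compared concrete, a toy sketch (Node 18+; the three repo names are just an illustrative subset):

```js
const repos = ["website", "spec", "generator"];

const contributorsUrl = (repo) =>
  `https://api.github.com/repos/asyncapi/${repo}/contributors`;

// One await per repo: total time is roughly the sum of all round trips.
async function sequential() {
  const results = [];
  for (const repo of repos) {
    const res = await fetch(contributorsUrl(repo));
    results.push(await res.json());
  }
  return results;
}

// All requests in flight at once, resolved later as one array of Promises:
// total time is roughly the slowest single request, not the sum.
function parallel() {
  return Promise.all(
    repos.map((repo) => fetch(contributorsUrl(repo)).then((res) => res.json()))
  );
}

parallel().then((r) => console.log(r.map((c) => c.length)));
```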

@derberg (Member) commented Dec 19, 2022

> But we don't require any API key or account for the requests

Not sure what you mean; even if you make an anonymous call, you have rate limits per IP. These have to be there to prevent DDoS attacks, for example.

We have over 60 repos (63), the rate limit is 60 requests, and it resets every hour.

  • you need 1 call to get the list of repos
  • you need 63 calls to get the list of contributors
  • you need a call per contributor (over 1000 requests) to get their details so you can:
    • check their forks
    • check the issues they created (not sure it is doable; probably another call per repo and then user matching)
    • check the PRs they created (not sure it is doable; probably another call per repo and then user matching)

And I have not even included the increase in request count due to pagination.

Optimistically you need 1100 API requests to build the list. With a 60-request limit, that is a GitHub Action running for about 18 hours.

Our calculations are very different 😀
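
The anonymous quota is easy to verify: GET /rate_limit is a public endpoint that reports the current limits without counting against them. A quick sketch (Node 18+); for unauthenticated requests the "core" limit is 60 per hour per IP:

```js
async function checkRateLimit() {
  const res = await fetch("https://api.github.com/rate_limit");
  const { resources } = await res.json();
  const { limit, remaining, reset } = resources.core;
  console.log(`limit=${limit}, remaining=${remaining}`);
  // `reset` is a Unix timestamp (seconds) for the start of the next window.
  console.log(`window resets at ${new Date(reset * 1000).toISOString()}`);
  // The estimate above in these terms: Math.ceil(1100 / 60) = 19 hourly windows.
}

checkRateLimit();
```

At 60 requests per hour, roughly 1100 requests means ceil(1100 / 60) = 19 hourly windows, which lines up with the "about 18 hours" estimate above.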

@derberg (Member) commented Feb 28, 2023

@AceTheCreator please close this PR, as it targets the community branch that we want to close as soon as possible. You will probably open a new PR once asyncapi/community#593 is sorted; we have designs and a clear answer on how to collect the relevant data.

@AceTheCreator (Member, Author)

> please close this PR, as it targets the community branch that we want to close as soon as possible.

Yeah, you're correct.
