Add Validator_Monitoring_Service proposal #1791
Conversation
Thanks for the application @avtishin, I appreciate you comparing your implementation to the existing solutions, you certainly did a lot of research. A couple of initial questions:
Hey @keeganquigley, thank you for your questions. So,
Does that answer your questions?
Thanks for your thorough answers @avtishin yes that definitely answers my questions! I'll reply here:
That being said, I will go ahead and mark the application as ready for review and ping the committee internally for questions/comments.
@keeganquigley thank you again very much for your questions. Regarding 2: rules are the rules, we will prepare code for deploying the Telegram bot. However, it's not possible to make the deployment fully automated, because the bot relies on a personal Telegram token and GitHub Actions (we described this in the proposal, btw), which need to be configured manually. So we will prepare instructions for all of it.
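As a rough illustration of the manual configuration involved, the sketch below is not the project's actual deployment code: it reads a hypothetical `TELEGRAM_BOT_TOKEN` and `TELEGRAM_CHAT_ID` from environment variables, which a GitHub Actions secret or a local shell would have to populate, and posts a test message via the Telegram Bot API's `sendMessage` method.

```python
# Minimal sketch, not the grant's deployment code. TELEGRAM_BOT_TOKEN and
# TELEGRAM_CHAT_ID are hypothetical environment variables that must be
# configured manually (e.g. as GitHub Actions secrets) before deployment.
import os

import requests

SEND_MESSAGE_URL = "https://api.telegram.org/bot{token}/sendMessage"


def send_test_message(text: str) -> None:
    """Post a message through the Telegram Bot API using manually supplied credentials."""
    token = os.environ["TELEGRAM_BOT_TOKEN"]   # personal bot token, never committed
    chat_id = os.environ["TELEGRAM_CHAT_ID"]   # target chat, also set manually
    response = requests.post(
        SEND_MESSAGE_URL.format(token=token),
        json={"chat_id": chat_id, "text": text},
        timeout=10,
    )
    response.raise_for_status()


if __name__ == "__main__":
    send_test_message("Validator Monitoring Service: deployment smoke test")
```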
Thanks for the grant application. It looks good to me, and I'm happy to support it. You might want to consider applying for treasury funding in the future to maintain and further improve the tool.
@avtishin thanks for your answers and sorry for the delay. I am also happy to go forward with it. I think this will be very helpful for validators and I'm aware of your team's influence in our validator ecosystem. Can't wait to see the results!
Hey @keeganquigley, am I correct that we should receive one more approval?
That's correct. You need one more approval. I'll ping the rest of the team again.
Congratulations and welcome to the Web3 Foundation Grants Program! Please refer to our Milestone Delivery repository for instructions on how to submit milestones and invoices, our FAQ for frequently asked questions and the support section of our README for more ways to find answers to your questions.
Hi @avtishin how is milestone 1 coming along?
Hey @keeganquigley w3f/Grant-Milestone-Delivery#971 -- we are working on the external evaluator's comments.
Whoops, sorry @avtishin, I didn't realize you had submitted a delivery. Thanks!
No problem, it's a good experience. You have exceptional evaluators, with very detailed questions and remarks. We (as a team) highly appreciate it. We are still committed to delivering everything.
Hi @avtishin how is milestone 2 coming along?
@keeganquigley hey, planning to publish by the end of the year.
Hi @avtishin still planning to submit M2 soon?
Yes, we are on it.
Hi @avtishin are you able to provide an update? If you don't think a delivery will be submitted in the next couple of weeks, we would ask for an amendment PR to extend the timeline.
Let's say end of next week; otherwise we will do a PR to extend.
@keeganquigley good news, we are going to publish in a couple of days. We are fixing the last bugs, etc.
@keeganquigley w3f/Grant-Milestone-Delivery#1155 here we are!
Project Abstract
Validator Monitoring Service is a monitoring platform designed to track the performance of validators on the Polkadot and Kusama networks. It has several key features that make it an essential tool for validator operators and nominators on those networks.
We provide a live dashboard that users can access through an easy-to-use Telegram bot (@p2pvalidator_monitoring_bot). The dashboard allows users to track essential metrics such as session/era progress, reward points, consensus participation, position in the active set, and more. We collect data on all active validators and store it for up to one month, giving users historical data on validator performance. This allows users to analyse their validator's behaviour over time and identify any potential issues.
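To illustrate the kind of on-chain data such metrics are derived from, the following sketch queries the current era's reward points for a validator using the public py-substrate-interface library; it is not the service's own code, and the RPC endpoint and validator address are placeholders.

```python
# Illustrative sketch only, using the public py-substrate-interface library;
# this is not the Validator Monitoring Service codebase. The validator stash
# address below is a placeholder.
from substrateinterface import SubstrateInterface

substrate = SubstrateInterface(url="wss://rpc.polkadot.io")

# Current era index from the Staking.ActiveEra storage item.
active_era = substrate.query("Staking", "ActiveEra").value["index"]

# Reward points earned by every active validator in this era.
era_points = substrate.query("Staking", "ErasRewardPoints", [active_era]).value

validator = "1PLACEHOLDER_STASH_ADDRESS"  # replace with a real stash address
individual = dict(era_points["individual"])

print(f"Era {active_era}: total points {era_points['total']}, "
      f"points for {validator}: {individual.get(validator, 0)}")
```

A monitoring service like the one described above would persist readings of this kind over time (the abstract mentions up to one month of history) rather than print a single snapshot.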
Grant level
Application Checklist