Add SendBatch API #202
Conversation
Force-pushed from 0a4d9e2 to 71f965d.
Codecov Report
Patch coverage:
@@ Coverage Diff @@
##           master     #202      +/-   ##
==========================================
+ Coverage   69.54%   70.35%   +0.80%
==========================================
  Files          27       27
  Lines        3596     3714     +118
==========================================
+ Hits         2501     2613     +112
- Misses        979      983       +4
- Partials      116      118       +2
Initial pass
Force-pushed from 08d5895 to 00358c1.
Force-pushed from 49d2be8 to a9e6988.
Latest version changes the API of SendBatch and gets rid of the Batch type. New API:
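The signature itself is only described indirectly in this thread, so the following is a rough sketch of the new surface rather than a quote from the PR; the interface name is made up, and the assumption that RPCResult lives in the hrpc package and pairs a result with an error follows the description further down.

```go
// Sketch only, reconstructed from the discussion in this thread.
// Batcher is an illustrative interface name; parameter and result names
// (batch, res, allOK) follow the thread.
package sketch

import (
	"context"

	"github.com/tsuna/gohbase/hrpc"
)

type Batcher interface {
	// SendBatch sends every RPC in batch and returns one RPCResult per RPC,
	// in the same order as batch. allOK is false if any result carries a
	// non-nil error, meaning that RPC failed or was never attempted.
	SendBatch(ctx context.Context, batch []hrpc.Call) (res []hrpc.RPCResult, allOK bool)
}
```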
Progress update: The new API feels better than my first attempt. This change is still missing tests.
Force-pushed from 32f41a9 to 6ffc0ac.
Added unit tests for SendBatch. I think there are just a couple of small changes I still want to make. The change is ready for review.
Force-pushed from 35f258d to 90be048.
Changes the rpcs chan from holding hrpc.Call to []hrpc.Call. This is required for batch support.
Allows passing a batch of RPCs to the region client; the batch is added to the in-progress multi and then sent to HBase. Note that a multi can therefore grow beyond the configured size. This feels like the right way to go: the user of the Batch API has specifically requested that these RPCs be batched together, so splitting the batch across multiple multis would break that bit of atomicity (see the sketch below).
Removes a type assertion in SendBatch.
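Roughly, the queuing change looks like the sketch below. This is not the actual gohbase region client; the struct layout and the QueueBatch name are assumptions, but it shows why carrying []hrpc.Call on the channel keeps a batch together in a single multi.

```go
// Sketch of the queuing change described above; not the real region client.
package sketch

import "github.com/tsuna/gohbase/hrpc"

type regionClient struct {
	// rpcs previously carried individual hrpc.Call values; it now carries
	// slices, so a batch travels through the channel as one unit.
	rpcs chan []hrpc.Call
}

// QueueRPC queues a single RPC as a one-element batch.
func (c *regionClient) QueueRPC(rpc hrpc.Call) {
	c.rpcs <- []hrpc.Call{rpc}
}

// QueueBatch queues a whole batch at once. The receiving side folds the
// entire slice into the in-progress multi, which is why a multi can grow
// beyond the configured size: splitting the batch across multis would give
// up the "these RPCs go together" property the caller asked for.
func (c *regionClient) QueueBatch(batch []hrpc.Call) {
	c.rpcs <- batch
}
```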
Overall, LGTM, Aaron.
	return res, allOK
}

rpcByClient, ok := c.findClients(ctx, batch, res)
It could be interesting to add a histogram tracking how many regions are "targeted" by a batch of RPCs. Maybe to be added in a follow-up PR.
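Something along these lines could work; Prometheus is just an assumed choice of metrics library here, the metric name and buckets are made up, and none of this is part of the PR.

```go
// Illustrative follow-up only, not part of this PR.
package sketch

import "github.com/prometheus/client_golang/prometheus"

var regionsPerBatch = prometheus.NewHistogram(prometheus.HistogramOpts{
	Name:    "gohbase_sendbatch_regions_targeted", // hypothetical metric name
	Help:    "Number of region clients targeted by a single SendBatch call.",
	Buckets: prometheus.ExponentialBuckets(1, 2, 8), // 1, 2, 4, ..., 128
})

func init() {
	prometheus.MustRegister(regionsPerBatch)
}

// observeBatchFanout would be called with len(rpcByClient) right after
// findClients groups the batch by region client.
func observeBatchFanout(numRegionClients int) {
	regionsPerBatch.Observe(float64(numRegionClients))
}
```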
Added the API:
res holds results in the same order as the RPCs in batch. Each RPCResult contains a result and an error. If the error is non-nil, the RPC failed or was not attempted; if it's nil, that RPC succeeded. allOK is false if any of the results contain an error.
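As a usage sketch: the table, column family, and quorum address below are placeholders, and the SendBatch signature and the Error field on RPCResult follow the description above rather than being quoted from the PR.

```go
// Usage sketch only; names are placeholders.
package main

import (
	"context"
	"fmt"
	"log"

	"github.com/tsuna/gohbase"
	"github.com/tsuna/gohbase/hrpc"
)

func main() {
	client := gohbase.NewClient("zookeeper.example.com") // placeholder quorum
	defer client.Close()

	ctx := context.Background()
	var batch []hrpc.Call
	for i := 0; i < 3; i++ {
		put, err := hrpc.NewPutStr(ctx, "my-table", fmt.Sprintf("row-%d", i),
			map[string]map[string][]byte{"cf": {"col": []byte("value")}})
		if err != nil {
			log.Fatal(err)
		}
		batch = append(batch, put)
	}

	// res comes back in the same order as batch; allOK is false if any
	// individual result carries a non-nil error.
	res, allOK := client.SendBatch(ctx, batch)
	if !allOK {
		for i, r := range res {
			if r.Error != nil {
				log.Printf("rpc %d failed: %v", i, r.Error)
			}
		}
	}
}
```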
closes #68, closes #117