The Wayback Machine - https://web.archive.org/web/20200911235537/https://github.com/cli/cli/pull/1102

GHE compatibility for `pr list/status` #1102

Open
wants to merge 3 commits into base: trunk

Conversation

@mislav
Member

mislav commented Jun 4, 2020

This uses GraphQL introspection to query the API functionality of the server and adjust the query accordingly (i.e. never include fields or filters that are unsupported). This approach allows us to conditionally adopt new or feature-flagged APIs in CLI without breaking compatibility with GitHub instances that don't offer the same APIs.

This PR addresses:

  • pr list
  • provides assignee and author filtering via the feature-flagged pullRequests.filterBy API #641
    • cleans up the previous Search-powered hack to query by assignee
    • avoids querying isDraft if it's not available
  • pr status
    • avoids querying isDraft if it's not available
    • omits reviewDecision info if not available
    • omits statusCheckRollup info if not available

My findings and feelings:

  • this works well and was easier to implement than I initially thought
  • a slightly modified approach will need to be taken for queries done via the githubv4 GraphQL library

If we were to go this way, then the remaining API surface to cover wouldn't be large:

  1. we would need to stop passing draft input to pr create unless it's supported
  2. upcoming issue templates work #875 could make use of the conditional approach

To resolve before this can be merged:

  • is the 500–700ms overhead of API introspection acceptable for commands such as pr list/status? should we cache the result locally?
  • the pullRequests.filterBy API isn't yet ready for adoption due to inconsistencies with different filtering combinations

Ref. #823 (comment) #273

mislav added 3 commits Jun 4, 2020
This is for compatibility with older GHE.

This reverts commit 7c11f76.
This makes use of the new `pullRequests.filterBy` API currently scoped
behind the `gh_cli` feature flag.
@mislav mislav requested review from vilmibm and probablycorey Jun 4, 2020
Member

vilmibm left a comment

Thanks for exploring this. It's a creative solution to a hard problem and the resulting code could certainly be a lot more complex.

I do think this will be difficult to maintain, but no more so than any similar approach; I really appreciate how this frees us from having to actually track what is available in each Enterprise release. I also assume that only a minority of our queries will ultimately require the feature-detection dance.

I'm 👍 on continuing down this path but do want us to find a way to isolate the feature detection code in helpers to make reading the actual querying functions easier.

a slightly modified approach will need to be taken for queries done via the githubv4 GraphQL library

Do you have an example of what that might look like?

is the 500–700ms overhead of API introspection acceptable for commands such as pr list/status? should we cache the result locally?

Caching makes sense; is there a simpler compromise here of not bothering with feature detection if we notice we're on github.com? Until we have Enterprise adoption it seems premature to worry about the extra 0.5 seconds.

the pullRequests.filterBy API isn't yet ready for adoption due to inconsistencies with different filtering combinations

The rest of this approach could be merged though, right?

Contributor

probablycorey left a comment

is the 500–700ms overhead of API introspection acceptable for commands such as pr list/status? should we cache the result locally?

I'm a little worried about this. It's probably OK, but it is noticeable and annoying. Caching the result would work, but I'm not sure when we would invalidate the cache:

  • Never? This seems like a bad idea: what if the enterprise server is updated?
  • Manually? Letting people do this via a command or config setting works, but relies on users knowing how and when to do it.
  • Periodically? This might work; a 500–700ms delay once a month is not a huge deal.
  • Automatically? This would be great, but I'm not sure how unless we can detect the enterprise version number automatically.

This isn't as gnarly as I thought it would be! I was trying to figure out if there was a way to pull out the feature detection code into a separate function that would be easy to run for other commands. But it looks like each call will be pretty specific?

But maybe pulling out the feature detection code for each function into a new function would make it a little easier to follow? There are a lot of GraphQL query structs in each function!

// Load a canned GraphQL response from a fixture file and stub it in.
jsonFile, _ := os.Open("../test/fixtures/prStatus.json")
defer jsonFile.Close()
http.StubResponse(200, jsonFile)
// Match introspection queries (those containing `__type` as a whole word)
// and serve the introspection fixture for them instead.
http.Register(httpmock.GraphQL(`\b__type\b`), httpmock.FileResponse("../test/fixtures/prIntrospection.json"))


@probablycorey
Contributor

probablycorey Jun 4, 2020

What does that \b do? I'm assuming it's a regex word boundary?

@mislav
Member Author

mislav commented Jun 23, 2020

Do you have an example of what that might look like?

@vilmibm No, but that's a good question. I suspect we might need to either: 1) fork the library or 2) generate structs dynamically using Go's reflect.
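A minimal sketch of option 2, building the query struct at runtime with `reflect.StructOf` (the field set and function name here are hypothetical, not the actual githubv4 wiring):

```go
package main

import (
	"fmt"
	"reflect"
)

// buildQueryType assembles a struct type for a GraphQL query at runtime,
// including the IsDraft field only when the server supports it.
func buildQueryType(supportsIsDraft bool) reflect.Type {
	fields := []reflect.StructField{
		{Name: "Number", Type: reflect.TypeOf(0)},
		{Name: "Title", Type: reflect.TypeOf("")},
	}
	if supportsIsDraft {
		fields = append(fields, reflect.StructField{
			Name: "IsDraft", Type: reflect.TypeOf(false),
		})
	}
	return reflect.StructOf(fields)
}

func main() {
	prType := buildQueryType(true)
	// A pointer to a value of this type could then be passed to the
	// githubv4 client, e.g. client.Query(ctx, q, vars).
	q := reflect.New(prType).Interface()
	fmt.Printf("%d fields, %T\n", prType.NumField(), q)
}
```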

is there a simpler compromise here of not bothering with feature detection if we notice we're on github.com?

That's not a bad idea, but we have to consider that, before firing off a request, we don't know whether an authentication token belongs to the "GitHub CLI" OAuth app or is a Personal Access Token. PATs don't have access to feature flags, so we still get slightly different versions of the API even while on github.com.

That's the main reason why I would vote for doing feature detection always, but caching it per hashed combo of host+token.

Caching the result would work, but I'm not sure when we would invalidate the cache?

@probablycorey I think caching the results of feature detection queries for 24h would be the sweet spot! It's long enough that you don't feel the delay repeatedly during the day, but short enough to pick up new features if github.com or one's GHE server gets upgraded overnight.

But maybe pulling out the feature detection code for each function into a new function would make it a little easier to follow?

That's a good call! I will clean this up once this PR is unblocked on the API front.

@vilmibm
Member

vilmibm commented Aug 25, 2020

@mislav do you still intend to see this one through or can it be closed?

@mislav
Member Author

mislav commented Aug 26, 2020

@vilmibm I still intend to see this one through!

@mislav mislav mentioned this pull request Sep 3, 2020
3 participants