
POC: Ast Merging and Scoped Cache Invalidation#2980

Draft
dantleech wants to merge 13 commits into master from pseudo-incremental-parser

Conversation


@dantleech dantleech commented Dec 6, 2025

Experimenting with merging ASTs to preserve the object IDs of the nodes which might help with far more efficient caching techniques.

The idea would be that the revised AST (e.g. after the user has made a text edit) is merged with the previous one. This means that unchanged nodes are not replaced, so we are able to cache nodes by their object ID across multiple requests - this simulates the behavior of an incremental parser (and the parser is absolutely not the bottleneck currently).
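The merge idea can be sketched with a toy AST (this is illustrative only - the `Node` class and `merge()` function below are not Phpactor's or tolerant-parser's API): walk the old and new trees together and, wherever a subtree is unchanged, keep the *old* node object so that `spl_object_id()` stays stable across re-parses.

```php
<?php
// Toy sketch of AST merging that preserves object identity.
// Assumption: a node is "unchanged" when its kind, text and child
// count match; real parsers would compare source ranges instead.

final class Node
{
    /** @param Node[] $children */
    public function __construct(
        public string $kind,
        public string $text,
        public array $children = [],
    ) {
    }
}

function merge(Node $old, Node $new): Node
{
    // Subtree changed: take the freshly parsed node.
    if ($old->kind !== $new->kind || $old->text !== $new->text) {
        return $new;
    }
    if (count($old->children) !== count($new->children)) {
        return $new;
    }
    // Same shape: recurse, grafting merged children onto the OLD
    // node so its identity (and any cache keyed on it) survives.
    foreach ($new->children as $i => $newChild) {
        $old->children[$i] = merge($old->children[$i], $newChild);
    }
    return $old;
}
```

With this, a cache keyed on `spl_object_id($node)` stays warm for every subtree the edit did not touch.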

The other part of this would be scoped cache invalidation: given that the user changes some text within a scoped area (e.g. a method block), we would only invalidate the nodes within that block, and not those in the rest of the document (as we mostly do currently). This would have a dramatic impact on performance in large files.
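Scoped invalidation could look something like the following sketch (names are illustrative, not Phpactor's): cache entries are keyed by node object ID and remember the node's byte range, and an edit only evicts entries whose range overlaps the edited range.

```php
<?php
// Toy scoped cache: invalidateRange() evicts only entries whose
// stored range overlaps the edit, leaving the rest of the document's
// cached results intact.

final class ScopedCache
{
    /** @var array<int, array{start: int, end: int, value: mixed}> */
    private array $entries = [];

    public function set(object $node, int $start, int $end, mixed $value): void
    {
        $this->entries[spl_object_id($node)] = [
            'start' => $start, 'end' => $end, 'value' => $value,
        ];
    }

    public function get(object $node): mixed
    {
        return $this->entries[spl_object_id($node)]['value'] ?? null;
    }

    /** Evict only entries overlapping the edited range [$start, $end). */
    public function invalidateRange(int $start, int $end): void
    {
        foreach ($this->entries as $id => $entry) {
            if ($entry['start'] < $end && $entry['end'] > $start) {
                unset($this->entries[$id]);
            }
        }
    }
}
```

An edit inside one method block would then evict that method's entries while every other method's cached results survive.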


  • The merging parser seems to work (or at least I haven't succeeded in breaking it in isolation)
  • But the LS is becoming corrupted, possibly similar to the issues with Per document cache #2882

@dantleech dantleech marked this pull request as draft December 6, 2025 18:10
@dantleech dantleech force-pushed the pseudo-incremental-parser branch 2 times, most recently from 77510cd to e6c05fa on December 6, 2025 18:18
Experimenting with merging ASTs to preserve the object IDs of the
nodes which might help with far more efficient caching techniques.
@dantleech dantleech force-pushed the pseudo-incremental-parser branch from e6c05fa to bc42d76 on December 6, 2025 18:18

if ($node2Child instanceof Node) {
// recurse on the listed node:
$this->doMerge($node1Child, $node2Child);
but node1...

{
$source = $this->fileSource1;
$source->fileContents = TextEdits::one($edit)->apply($source->getFileContents());
self::reindex($this->fileSource1);
@dantleech dantleech Dec 7, 2025


note that, depending on the number of edits, this could probably be done far more efficiently by creating a TextEdits collection, applying them all at once, and reindexing once after updating the AST.
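The batching idea from the comment above can be sketched like this (a toy `applyEdits()`, not Phpactor's TextEdits API): collect the edits and apply them in one pass, back to front, so earlier offsets are never shifted by later replacements.

```php
<?php
// Toy batched text-edit application. Assumption: edits are given as
// start/length/replacement against the ORIGINAL text and do not overlap.

/** @param array{start: int, length: int, text: string}[] $edits */
function applyEdits(string $source, array $edits): string
{
    // Sort descending by start offset so each replacement leaves the
    // offsets of the not-yet-applied edits untouched.
    usort($edits, fn (array $a, array $b) => $b['start'] <=> $a['start']);

    foreach ($edits as $edit) {
        $source = substr($source, 0, $edit['start'])
            . $edit['text']
            . substr($source, $edit['start'] + $edit['length']);
    }

    return $source;
}
```

The point being that the (presumably expensive) reindex then happens once per batch rather than once per edit.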


private function copyNode(Node|Token $node): Node|Token
{
return $node;

here to abstract the place where we could deep_copy the node, but performance-wise we're probably happy to corrupt the second AST.
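The trade-off mentioned above can be made concrete with a toy sketch (the `SimpleNode` class and `deepCopy()` function are illustrative, not tolerant-parser's): returning the node as-is is free but lets the merge mutate (corrupt) the second AST, while a recursive copy keeps it intact at some cost.

```php
<?php
// Toy recursive deep copy: the alternative to returning the node
// unchanged. Every node and child is a fresh object, so mutating the
// copy can never corrupt the original tree.

final class SimpleNode
{
    /** @param SimpleNode[] $children */
    public function __construct(
        public string $text,
        public array $children = [],
    ) {
    }
}

function deepCopy(SimpleNode $node): SimpleNode
{
    return new SimpleNode(
        $node->text,
        array_map(deepCopy(...), $node->children),
    );
}
```

Keeping the copy behind a `copyNode()` seam, as the diff does, means the cheap identity version can be swapped for this safe version without touching the merge logic.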

return parent::parseSourceFile($source);
}

if (!isset($this->documents[$uri])) {
@dantleech dantleech Dec 7, 2025


we should probably, for now, use a TTL caching mechanism, but should ultimately use the approach from #2882
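A minimal TTL cache for that stop-gap could look like this sketch (illustrative only; the `TtlCache` class is hypothetical and Phpactor would plug in its own cache abstraction). The optional `$now` parameter makes expiry testable without sleeping.

```php
<?php
// Toy TTL cache: entries expire a fixed number of seconds after being
// set; expired entries are dropped lazily on read.

final class TtlCache
{
    /** @var array<string, array{expires: float, value: mixed}> */
    private array $entries = [];

    public function __construct(private float $ttlSeconds)
    {
    }

    public function set(string $key, mixed $value, ?float $now = null): void
    {
        $now ??= microtime(true);
        $this->entries[$key] = [
            'expires' => $now + $this->ttlSeconds,
            'value' => $value,
        ];
    }

    public function get(string $key, ?float $now = null): mixed
    {
        $now ??= microtime(true);
        $entry = $this->entries[$key] ?? null;
        if ($entry === null || $entry['expires'] <= $now) {
            unset($this->entries[$key]);
            return null;
        }
        return $entry['value'];
    }
}
```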
