Plug/ValuePlug : Allow interleaving edit/compute within DirtyPropagationScopes

This means calling `ValuePlug::dirty()` before the scope closes, so that the hash cache is invalidated before the compute is performed.
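
As a minimal sketch of the interleaving pattern this enables (in the spirit of the PythonEditor use case from #1971; the node and values are illustrative only) :

```python
import Gaffer
import GafferTest

script = Gaffer.ScriptNode()
script["add"] = GafferTest.AddNode()

# An UndoScope pushes a dirty propagation scope for the duration of the edit.
with Gaffer.UndoScope( script ) :
	for i in range( 0, 10 ) :
		# Edit...
		script["add"]["op1"].setValue( i )
		# ...then compute before the scope closes. Without this change, the
		# hash cache could hand back a stale value here.
		assert script["add"]["sum"].getValue() == i
```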

This was first requested way back in 2017 in issue #1971, where the use case was edit/compute loops in the PythonEditor. It has become a high priority more recently though, as it turned out we were actually interleaving edits and computes in the TransformTools, and noticed incorrect results while working on the LightTool in #5218. The problem was as follows, and is demonstrated in `testMultipleSelectionWithEditScope` :

1. We push a dirty propagation scope using UndoScope.
2. We create a transform edit for `/cube1`, which dirties some plugs, incrementing their dirty count.
3. We create a transform edit for `/cube`, but as part of doing that we need to compute the _current_ transform. This "bakes" that transform into the (hash) cache, keyed by the current dirty count.
4. We propagate dirtiness for this new transform edit, but don't increment the dirty count again, because we've already visited those plugs in step 2.
5. After closing the scope, we query the transform for `/cube` and end up with the cached value from step 3. This is incorrect, because it isn't accounting for the edits made subsequently.

The problem is demonstrated even more simply by `ComputeNodeTest.testInterleavedEditsAndComputes`.

There is an overhead to calling `flushDirtyPropagationScope()` for each `getValue()` call, and it is measurable in `ValuePlugTest.testCacheOverhead()`. But taking into account the performance improvements made in #5362, we still come out with an 18% reduction in runtime even with the overhead.
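
As a rough sketch of how that overhead might be measured, in the spirit of `ValuePlugTest.testCacheOverhead()` (this is not the benchmark behind the quoted figure; the node and iteration count are arbitrary) :

```python
import time
import GafferTest

node = GafferTest.AddNode()
node["sum"].getValue() # Prime the hash and value caches.

iterations = 1000000
start = time.perf_counter()
for _ in range( iterations ) :
	# Each call now flushes any pending dirty propagation before looking up
	# the cached hash.
	node["sum"].getValue()
elapsed = time.perf_counter() - start
print( "Per-call cost : {:.0f}ns".format( elapsed / iterations * 1e9 ) )
```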

Fixes #1971
johnhaddon committed Jun 28, 2023
1 parent 23ac460 commit 4ab1b84
Showing 6 changed files with 81 additions and 3 deletions.
1 change: 1 addition & 0 deletions Changes.md
@@ -57,6 +57,7 @@ Fixes
- Expression : Fixed parsing of Python expressions combining subscripts (`[]`) and `context` methods (#3088, #3613, #5250).
- ConnectionCreatorWrapper : Fixed bug which forced PlugAdder derived classes to implement `updateDragEndPoint()` unnecessarily.
- Plug : Fixed bug which caused stale values to be retrieved from the cache for plugs that had been renamed.
- ValuePlug : Fixed results when graph edits and computes are interleaved within a DirtyPropagationScope (#1971).
- OpenColorIOTransform : Fixed error processing deep image tiles containing no samples.
- Seeds :
- Fixed duplicate points at triangle edges.
10 changes: 10 additions & 0 deletions include/Gaffer/Plug.h
@@ -245,6 +245,16 @@ class GAFFER_API Plug : public GraphComponent
/// connected to the signal.
virtual void dirty();

/// Makes any pending calls to `dirty()` that would otherwise be
/// delayed by a DirtyPropagationScope. May be called by plugs
/// that wish to allow edits and computes to be interleaved within
/// a single DirtyPropagationScope, requiring caches to be invalidated
/// before the scope is closed.
///
/// > Note : Never calls `plugDirtiedSignal()` - that is always
/// > deferred until the DirtyPropagationScope closes.
static void flushDirtyPropagationScope();

private :

static void propagateDirtinessAtLeaves( Plug *plugToDirty );
36 changes: 36 additions & 0 deletions python/GafferSceneUITest/TranslateToolTest.py
@@ -1225,5 +1225,41 @@ def testInteractionWithParentConstraints( self ) :
imath.V3f( 1, 2, 3 )
)

def testMultipleSelectionWithEditScope( self ) :

script = Gaffer.ScriptNode()

script["cube1"] = GafferScene.Cube()
script["cube2"] = GafferScene.Cube()
script["cube3"] = GafferScene.Cube()

script["parent"] = GafferScene.Parent()
script["parent"]["parent"].setValue( "/" )
script["parent"]["children"][0].setInput( script["cube1"]["out"] )
script["parent"]["children"][1].setInput( script["cube2"]["out"] )
script["parent"]["children"][2].setInput( script["cube3"]["out"] )

script["editScope"] = Gaffer.EditScope()
script["editScope"].setup( script["parent"]["out"] )
script["editScope"]["in"].setInput( script["parent"]["out"] )

view = GafferSceneUI.SceneView()
view["in"].setInput( script["editScope"]["out"] )
view["editScope"].setInput( script["editScope"]["out"] )

GafferSceneUI.ContextAlgo.setSelectedPaths( view.getContext(), IECore.PathMatcher( [ "/cube", "/cube1", "/cube2" ] ) )
GafferSceneUI.ContextAlgo.setLastSelectedPath( view.getContext(), "/cube" )

tool = GafferSceneUI.TranslateTool( view )
tool["active"].setValue( True )

with Gaffer.UndoScope( script ) :
tool.translate( imath.V3f( 10, 0, 0 ) )

with view.getContext() :
self.assertEqual( script["editScope"]["out"].transform( "/cube" ).translation(), imath.V3f( 10, 0, 0 ) )
self.assertEqual( script["editScope"]["out"].transform( "/cube1" ).translation(), imath.V3f( 10, 0, 0 ) )
self.assertEqual( script["editScope"]["out"].transform( "/cube2" ).translation(), imath.V3f( 10, 0, 0 ) )

if __name__ == "__main__":
unittest.main()
9 changes: 9 additions & 0 deletions python/GafferTest/ComputeNodeTest.py
@@ -740,5 +740,14 @@ def compute( self, plug, context ) :
self.assertNotEqual( h2, h1 )
self.assertEqual( n.plug.getValue(), "out2" )

def testInterleavedEditsAndComputes( self ) :

n = GafferTest.AddNode()

with Gaffer.DirtyPropagationScope() :
for i in range( 0, 100 ) :
n["op1"].setValue( i )
self.assertEqual( n["sum"].getValue(), i )

if __name__ == "__main__":
unittest.main()
25 changes: 23 additions & 2 deletions src/Gaffer/Plug.cpp
@@ -760,6 +760,8 @@ class Plug::DirtyPlugs
return;
}

m_flushPending = true;

if( !insertVertex( plugToDirty ).second )
{
// Previously inserted, so we'll already
@@ -803,11 +805,25 @@ class Plug::DirtyPlugs
{
if( !m_emitting ) // see comment in emit()
{
flush();
emit();
}
}
}

void flush()
{
if( !m_flushPending )
{
return;
}
m_flushPending = false;
for( const auto &[ plug, vertex ] : m_plugs )
{
plug->dirty();
}
}

static DirtyPlugs &local()
{
static tbb::enumerable_thread_specific<Plug::DirtyPlugs> g_dirtyPlugs;
@@ -827,7 +843,7 @@ class Plug::DirtyPlugs
using VertexDescriptor = Graph::vertex_descriptor;
using EdgeDescriptor = Graph::edge_descriptor;

using PlugMap = std::unordered_map<const Plug *, VertexDescriptor>;
using PlugMap = std::unordered_map<Plug *, VertexDescriptor>;

// Equivalent to the return type for map::insert - the first
// field is the vertex descriptor, and the second field is
@@ -854,7 +870,6 @@ class Plug::DirtyPlugs
VertexDescriptor result = add_vertex( m_graph );
m_graph[result] = plug;
m_plugs[plug] = result;
plug->dirty();

// Insert parent plug.
if( auto parent = plug->parent<Plug>() )
@@ -962,6 +977,7 @@ class Plug::DirtyPlugs
Graph m_graph;
PlugMap m_plugs;
size_t m_scopeCount;
bool m_flushPending;
bool m_emitting;

};
@@ -982,6 +998,11 @@ void Plug::popDirtyPropagationScope()
DirtyPlugs::local().popScope();
}

void Plug::flushDirtyPropagationScope()
{
DirtyPlugs::local().flush();
}

void Plug::dirty()
{
}
3 changes: 2 additions & 1 deletion src/Gaffer/ValuePlug.cpp
@@ -235,6 +235,8 @@ class ValuePlug::HashProcess : public Process
// one per context, computed by ComputeNode::hash(). Pull the value from our cache, or compute it
// using a HashProcess instance.

Plug::flushDirtyPropagationScope(); // Ensure any pending calls to `dirty()` are made before we look up `m_dirtyCount`.

const ComputeNode *computeNode = IECore::runTimeCast<const ComputeNode>( p->node() );
const ThreadState &threadState = ThreadState::current();
const Context *currentContext = threadState.context();
@@ -313,7 +315,6 @@ class ValuePlug::HashProcess : public Process
// HashCacheMode::Legacy
HashProcessKey legacyProcessKey( processKey );
legacyProcessKey.dirtyCount = g_legacyGlobalDirtyCount + DIRTY_COUNT_RANGE_MAX + 1;

return threadData.cache.get( legacyProcessKey, currentContext->canceller() );
}
}
