fix merge conflict post hoc
rcoreilly committed May 2, 2024
1 parent 1ef5da0 commit 8f6f24a
Showing 4 changed files with 10 additions and 12 deletions.
README.md: 0 additions & 4 deletions

```diff
@@ -43,11 +43,7 @@ See [python README](https://github.com/emer/leabra/blob/master/python/README.md)
 
 # Design / Organization
 
-<<<<<<< HEAD
-* The `emergent` repository contains a collection of packages supporting the implementation of biologically based neural networks. The main package is `emer` which specifies a minimal abstract interface for a neural network. The `etable` `etable.Table` data structure (DataTable in C++) is in a separate repository under the overall `emer` project umbrella, as are specific algorithms such as `leabra` which implement the `emer` interface.
-=======
 * The `emergent` repository contains a collection of packages supporting the implementation of biologically-based neural networks. The main package is `emer` which specifies a minimal abstract interface for a neural network. The `table` `table.Table` data structure (DataTable in C++) is in a separate repository under the overall `emer` project umbrella, as are specific algorithms such as `leabra` which implement the `emer` interface.
->>>>>>> 5874d12 (updated to core v0.1.2 with tensor/table etc)
 
 * Go uses `interfaces` to represent abstract collections of functionality (i.e., sets of methods). The `emer` package provides a set of interfaces for each structural level (e.g., `emer.Layer` etc) -- any given specific layer must implement all of these methods, and the structural containers (e.g., the list of layers in a network) are lists of these interfaces. An interface is implicitly a *pointer* to an actual concrete object that implements the interface.
```
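The README context above describes the interface-based design: one abstract interface per structural level, with containers holding lists of interface values. As a minimal stand-alone sketch of that pattern (hypothetical `Layer`/`LeabraLayer` names, not the actual `emer` API):

```go
package main

import "fmt"

// Layer is an illustrative abstract interface for one structural level,
// in the spirit of emer.Layer (names here are hypothetical).
type Layer interface {
	Name() string
	NumUnits() int
}

// LeabraLayer is a hypothetical concrete type implementing Layer.
type LeabraLayer struct {
	Nm    string
	Units int
}

func (ly *LeabraLayer) Name() string  { return ly.Nm }
func (ly *LeabraLayer) NumUnits() int { return ly.Units }

func main() {
	// A network's layer list is a slice of interfaces; each element is
	// implicitly a pointer to a concrete layer that implements them.
	layers := []Layer{
		&LeabraLayer{Nm: "Input", Units: 25},
		&LeabraLayer{Nm: "Hidden", Units: 100},
	}
	for _, ly := range layers {
		fmt.Printf("%s: %d units\n", ly.Name(), ly.NumUnits())
	}
}
```

Any algorithm-specific layer type satisfies the interface simply by defining all of its methods; no explicit declaration is needed.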
egui/plots.go: 1 addition & 0 deletions

```diff
@@ -34,6 +34,7 @@ func (gui *GUI) AddPlots(title string, lg *elog.Logs) {
 
 	plt := gui.NewPlotTab(key, mode+" "+time+" Plot")
 	plt.SetTable(lt.Table)
+	plt.UpdatePlot()
 	plt.Params.FromMetaMap(lt.Meta)
 
 	ConfigPlotFromLog(title, plt, lg, key)
```
elog/README.md: 8 additions & 8 deletions

````diff
@@ -209,7 +209,7 @@ Overall summary performance statistics have multiple Write functions for differe
 	etime.Scope(etime.AllModes, etime.Trial): func(ctx *elog.Context) {
 		ctx.SetStatFloat("TrlUnitErr")
 	}, etime.Scope(etime.AllModes, etime.Epoch): func(ctx *elog.Context) {
-		ctx.SetAgg(ctx.Mode, etime.Trial, agg.AggMean)
+		ctx.SetAgg(ctx.Mode, etime.Trial, stats.Mean)
 	}, etime.Scope(etime.AllModes, etime.Run): func(ctx *elog.Context) {
 		ix := ctx.LastNRows(ctx.Mode, etime.Epoch, 5)
 		ctx.SetFloat64(agg.Mean(ix, ctx.Item.Name)[0])
@@ -251,7 +251,7 @@ Iterate over layers of interest (use `LayersByClass` function). It is *essential
 	Type:   reflect.Float64,
 	Plot:   false,
 	FixMax: false,
-	Range:  minmax.F64{Max: 1},
+	Range:  minmax.F32{Max: 1},
 	Write: elog.WriteMap{
 		etime.Scope(etime.Train, etime.Epoch): func(ctx *elog.Context) {
 			ly := ctx.Layer(clnm).(axon.AxonLayer).AsAxon()
@@ -268,13 +268,13 @@ Here's how to log a projection variable:
 	Name: clnm + "_FF_AvgMaxG",
 	Type: reflect.Float64,
 	Plot: false,
-	Range: minmax.F64{Max: 1},
+	Range: minmax.F32{Max: 1},
 	Write: elog.WriteMap{
 		etime.Scope(etime.Train, etime.Trial): func(ctx *elog.Context) {
 			ffpj := cly.RecvPrjn(0).(*axon.Prjn)
 			ctx.SetFloat32(ffpj.GScale.AvgMax)
 		}, etime.Scope(etime.AllModes, etime.Epoch): func(ctx *elog.Context) {
-			ctx.SetAgg(ctx.Mode, etime.Trial, agg.AggMean)
+			ctx.SetAgg(ctx.Mode, etime.Trial, stats.Mean)
 		}}})
 ```
@@ -293,7 +293,7 @@ A log column can be a tensor of any shape -- the `SetLayerTensor` method on the
 	Type:      reflect.Float64,
 	CellShape: cly.Shape().Shp,
 	FixMax:    true,
-	Range:     minmax.F64{Max: 1},
+	Range:     minmax.F32{Max: 1},
 	Write: elog.WriteMap{
 		etime.Scope(etime.Test, etime.Trial): func(ctx *elog.Context) {
 			ctx.SetLayerTensor(clnm, "Act")
@@ -326,7 +326,7 @@ Here's how you record the data and log the resulting stats, using the `Analyze`
 	Type:      reflect.Float64,
 	CellShape: cly.Shape().Shp,
 	FixMax:    true,
-	Range:     minmax.F64{Max: 1},
+	Range:     minmax.F32{Max: 1},
 	Write: elog.WriteMap{
 		etime.Scope(etime.Analyze, etime.Trial): func(ctx *elog.Context) {
 			ctx.SetLayerTensor(clnm, "ActM")
@@ -357,13 +357,13 @@ This item creates a tensor column that records the average error for each catego
 	CellShape:   []int{20},
 	DimNames:    []string{"Cat"},
 	Plot:        true,
-	Range:       minmax.F64{Min: 0},
+	Range:       minmax.F32{Min: 0},
 	TensorIndex: -1, // plot all values
 	Write: elog.WriteMap{
 		etime.Scope(etime.Test, etime.Epoch): func(ctx *elog.Context) {
 			ix := ctx.Logs.IndexView(etime.Test, etime.Trial)
 			spl := split.GroupBy(ix, []string{"Cat"})
-			split.AggTry(spl, "Err", agg.AggMean)
+			split.AggTry(spl, "Err", stats.Mean)
 			cats := spl.AggsToTable(table.ColumnNameOnly)
 			ss.Logs.MiscTables[ctx.Item.Name] = cats
 			ctx.SetTensor(cats.Columns[1])
````
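The elog snippets in this diff all follow the same pattern: a `WriteMap` keyed by a (mode, time) scope, where trial-level entries record raw values and epoch-level entries aggregate over trials with a mean. A stand-alone analogy of that structure, using only hypothetical stdlib-based names (not the real elog API):

```go
package main

import "fmt"

// Scope mimics a (mode, time) key, analogous to etime.Scope.
type Scope struct{ Mode, Time string }

// WriteFunc computes the value to log at a given scope from the
// trial-level values collected so far.
type WriteFunc func(trials []float64) float64

func mean(xs []float64) float64 {
	if len(xs) == 0 {
		return 0
	}
	sum := 0.0
	for _, x := range xs {
		sum += x
	}
	return sum / float64(len(xs))
}

func main() {
	// A map from scope to write function, in the spirit of elog.WriteMap.
	writeMap := map[Scope]WriteFunc{
		{"Train", "Trial"}: func(trials []float64) float64 {
			return trials[len(trials)-1] // record the latest trial value
		},
		{"Train", "Epoch"}: func(trials []float64) float64 {
			return mean(trials) // aggregate over trials, like a mean stat
		},
	}
	trials := []float64{0.5, 0.25, 0.0, 0.25}
	fmt.Println(writeMap[Scope{"Train", "Epoch"}](trials)) // prints 0.25
}
```

Keying write functions by scope lets one item define different behavior per level (trial, epoch, run) without duplicating the item declaration.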
patgen/configpats.go: 1 addition & 0 deletions

```diff
@@ -14,6 +14,7 @@ import (
 
 // InitPats initiates patterns to be used in MixPats
 func InitPats(dt *table.Table, name, desc, inputName, outputName string, listSize, ySize, xSize, poolY, poolX int) {
+	dt.DeleteAll()
 	dt.SetMetaData("name", name)
 	dt.SetMetaData("desc", desc)
 	dt.AddStringColumn("Name")
```
