From ecb27cebb0393622684087135fc7ccc0098ea368 Mon Sep 17 00:00:00 2001
From: John Andersen
Date: Tue, 5 Sep 2023 19:02:20 +0000
Subject: [PATCH] docs: discussions: Alice Engineering Comms:
 2023-09-05T19:02:19+00:00

Signed-off-by: John Andersen
---
 .../alice_engineering_comms/0000/index.md | 7 +
 .../0000/reply_0000.md | 23 +
 .../alice_engineering_comms/0001/index.md | 7 +
 .../0001/reply_0000.md | 121 ++
 .../alice_engineering_comms/0002/index.md | 18 +
 .../0002/reply_0000.md | 43 +
 .../0002/reply_0001.md | 121 ++
 .../alice_engineering_comms/0003/index.md | 5 +
 .../alice_engineering_comms/0004/index.md | 3 +
 .../alice_engineering_comms/0005/index.md | 5 +
 .../alice_engineering_comms/0006/index.md | 18 +
 .../0006/reply_0000.md | 29 +
 .../0006/reply_0001.md | 102 ++
 .../alice_engineering_comms/0007/index.md | 18 +
 .../0007/reply_0000.md | 63 +
 .../alice_engineering_comms/0008/index.md | 10 +
 .../0008/reply_0000.md | 351 +++++
 .../alice_engineering_comms/0009/index.md | 1 +
 .../0009/reply_0000.md | 519 ++++++++
 .../alice_engineering_comms/0010/index.md | 16 +
 .../0010/reply_0000.md | 994 +++++++++++++++
 .../alice_engineering_comms/0011/index.md | 7 +
 .../0011/reply_0000.md | 12 +
 .../alice_engineering_comms/0012/index.md | 17 +
 .../0012/reply_0000.md | 14 +
 .../0012/reply_0001.md | 1 +
 .../alice_engineering_comms/0013/index.md | 32 +
 .../0013/reply_0000.md | 132 ++
 .../alice_engineering_comms/0014/index.md | 1 +
 .../0014/reply_0000.md | 126 ++
 .../0014/reply_0001.md | 1 +
 .../0014/reply_0002.md | 1 +
 .../alice_engineering_comms/0015/index.md | 4 +
 .../0015/reply_0000.md | 110 ++
 .../alice_engineering_comms/0016/index.md | 16 +
 .../0016/reply_0000.md | 71 ++
 .../0016/reply_0001.md | 10 +
 .../alice_engineering_comms/0017/index.md | 27 +
 .../0017/reply_0000.md | 247 ++++
 .../0017/reply_0001.md | 1 +
 .../alice_engineering_comms/0018/index.md | 2 +
 .../alice_engineering_comms/0019/index.md | 8 +
 .../0019/reply_0000.md | 41 +
 .../0019/reply_0001.md | 3 +
 .../0019/reply_0002.md | 1 +
 .../alice_engineering_comms/0020/index.md | 1 +
 .../0020/reply_0000.md | 16 +
 .../0020/reply_0001.md | 30 +
 .../alice_engineering_comms/0021/index.md | 3 +
 .../0021/reply_0000.md | 33 +
 .../alice_engineering_comms/0022/index.md | 1 +
 .../0022/reply_0000.md | 4 +
 .../0022/reply_0001.md | 14 +
 .../0022/reply_0002.md | 5 +
 .../alice_engineering_comms/0023/index.md | 1 +
 .../0023/reply_0000.md | 2 +
 .../alice_engineering_comms/0024/index.md | 1 +
 .../0024/reply_0000.md | 12 +
 .../alice_engineering_comms/0025/index.md | 14 +
 .../0025/reply_0000.md | 50 +
 .../alice_engineering_comms/0026/index.md | 8 +
 .../0026/reply_0000.md | 21 +
 .../0026/reply_0001.md | 56 +
 .../alice_engineering_comms/0027/index.md | 1 +
 .../0027/reply_0000.md | 39 +
 .../0027/reply_0001.md | 28 +
 .../alice_engineering_comms/0028/index.md | 3 +
 .../alice_engineering_comms/0029/index.md | 1 +
 .../0029/reply_0000.md | 4 +
 .../alice_engineering_comms/0030/index.md | 0
 .../0030/reply_0000.md | 2 +
 .../alice_engineering_comms/0031/index.md | 4 +
 .../0031/reply_0000.md | 25 +
 .../0031/reply_0001.md | 21 +
 .../alice_engineering_comms/0032/index.md | 41 +
 .../0032/reply_0000.md | 2 +
 .../alice_engineering_comms/0033/index.md | 8 +
 .../0033/reply_0000.md | 12 +
 .../alice_engineering_comms/0034/index.md | 22 +
 .../0034/reply_0000.md | 21 +
 .../0034/reply_0001.md | 356 ++++++
 .../0034/reply_0002.md | 81 ++
 .../0034/reply_0003.md | 26 +
 .../0034/reply_0004.md | 99 ++
 .../alice_engineering_comms/0035/index.md | 1 +
 .../0035/reply_0000.md | 5 +
 .../0035/reply_0001.md | 643 ++++++++++
 .../alice_engineering_comms/0036/index.md | 7 +
 .../0036/reply_0000.md | 92 ++
 .../alice_engineering_comms/0037/index.md | 1 +
 .../0037/reply_0000.md | 114 ++
 .../alice_engineering_comms/0038/index.md | 1 +
 .../0038/reply_0000.md | 226 ++++
 .../0038/reply_0001.md | 15 +
 .../alice_engineering_comms/0039/index.md | 10 +
 .../0039/reply_0000.md | 66 +
 .../0039/reply_0001.md | 45 +
 .../alice_engineering_comms/0040/index.md | 25 +
 .../0040/reply_0000.md | 65 +
 .../0040/reply_0001.md | 3 +
 .../alice_engineering_comms/0041/index.md | 1 +
 .../0041/reply_0000.md | 237 ++++
 .../0041/reply_0001.md | 123 ++
 .../alice_engineering_comms/0042/index.md | 1 +
 .../0042/reply_0000.md | 66 +
 .../0042/reply_0001.md | 35 +
 .../alice_engineering_comms/0043/index.md | 1 +
 .../0043/reply_0000.md | 6 +
 .../alice_engineering_comms/0044/index.md | 3 +
 .../0044/reply_0000.md | 207 +++
 .../alice_engineering_comms/0045/index.md | 28 +
 .../alice_engineering_comms/0046/index.md | 3 +
 .../0046/reply_0000.md | 28 +
 .../alice_engineering_comms/0047/index.md | 1 +
 .../0047/reply_0000.md | 31 +
 .../alice_engineering_comms/0048/index.md | 1 +
 .../0048/reply_0000.md | 38 +
 .../alice_engineering_comms/0049/index.md | 1 +
 .../0049/reply_0000.md | 30 +
 .../alice_engineering_comms/0050/index.md | 4 +
 .../0050/reply_0000.md | 49 +
 .../0050/reply_0001.md | 397 ++++++
 .../alice_engineering_comms/0051/index.md | 1 +
 .../0051/reply_0000.md | 153 +++
 .../0051/reply_0001.md | 37 +
 .../alice_engineering_comms/0052/index.md | 8 +
 .../0052/reply_0000.md | 27 +
 .../alice_engineering_comms/0053/index.md | 1 +
 .../0053/reply_0000.md | 96 ++
 .../0053/reply_0001.md | 13 +
 .../alice_engineering_comms/0054/index.md | 8 +
 .../0054/reply_0000.md | 270 ++++
 .../0054/reply_0001.md | 23 +
 .../0054/reply_0002.md | 18 +
 .../alice_engineering_comms/0055/index.md | 1 +
 .../0055/reply_0000.md | 14 +
 .../alice_engineering_comms/0056/index.md | 40 +
 .../0056/reply_0000.md | 30 +
 .../0056/reply_0001.md | 14 +
 .../alice_engineering_comms/0057/index.md | 1 +
 .../0057/reply_0000.md | 33 +
 .../alice_engineering_comms/0058/index.md | 20 +
 .../0058/reply_0000.md | 150 +++
 .../alice_engineering_comms/0059/index.md | 1 +
 .../0059/reply_0000.md | 437 +++++++
 .../alice_engineering_comms/0060/index.md | 1 +
 .../0060/reply_0000.md | 401 ++++++
 .../alice_engineering_comms/0061/index.md | 1 +
 .../0061/reply_0000.md | 17 +
 .../0061/reply_0001.md | 1 +
 .../alice_engineering_comms/0062/index.md | 1 +
 .../0062/reply_0000.md | 273 ++++
 .../alice_engineering_comms/0063/index.md | 1 +
 .../0063/reply_0000.md | 3 +
 .../alice_engineering_comms/0064/index.md | 1 +
 .../0064/reply_0000.md | 5 +
 .../alice_engineering_comms/0065/index.md | 1 +
 .../0065/reply_0000.md | 41 +
 .../0065/reply_0001.md | 259 ++++
 .../alice_engineering_comms/0066/index.md | 135 ++
 .../0066/reply_0000.md | 17 +
 .../0066/reply_0001.md | 24 +
 .../alice_engineering_comms/0067/index.md | 9 +
 .../0067/reply_0000.md | 28 +
 .../0067/reply_0001.md | 16 +
 .../alice_engineering_comms/0068/index.md | 38 +
 .../0068/reply_0000.md | 18 +
 .../0068/reply_0001.md | 60 +
 .../alice_engineering_comms/0069/index.md | 1 +
 .../0069/reply_0000.md | 15 +
 .../alice_engineering_comms/0070/index.md | 1 +
 .../0070/reply_0000.md | 12 +
 .../alice_engineering_comms/0071/index.md | 1 +
 .../alice_engineering_comms/0072/index.md | 1 +
 .../0072/reply_0000.md | 8 +
 .../alice_engineering_comms/0073/index.md | 1 +
 .../0073/reply_0000.md | 32 +
 .../alice_engineering_comms/0074/index.md | 1 +
 .../0074/reply_0000.md | 51 +
 .../0074/reply_0001.md | 41 +
 .../alice_engineering_comms/0075/index.md | 1 +
 .../0075/reply_0000.md | 208 +++
 .../alice_engineering_comms/0076/index.md | 1 +
 .../0076/reply_0000.md | 162 +++
 .../alice_engineering_comms/0077/index.md | 1 +
 .../0077/reply_0000.md | 4 +
 .../alice_engineering_comms/0078/index.md | 1 +
 .../0078/reply_0000.md | 41 +
 .../alice_engineering_comms/0079/index.md | 25 +
 .../0079/reply_0000.md | 69 +
 .../alice_engineering_comms/0080/index.md | 1 +
 .../0080/reply_0000.md | 87 ++
 .../0080/reply_0001.md | 128 ++
 .../alice_engineering_comms/0081/index.md | 38 +
 .../0081/reply_0000.md | 44 +
 .../alice_engineering_comms/0082/index.md | 6 +
 .../0082/reply_0000.md | 64 +
 .../0082/reply_0001.md | 13 +
 .../0082/reply_0002.md | 87 ++
 .../alice_engineering_comms/0083/index.md | 1 +
 .../0083/reply_0000.md | 121 ++
 .../alice_engineering_comms/0084/index.md | 1 +
 .../0084/reply_0000.md | 39 +
 .../0084/reply_0001.md | 20 +
 .../alice_engineering_comms/0085/index.md | 1 +
 .../0085/reply_0000.md | 20 +
 .../alice_engineering_comms/0086/index.md | 1 +
 .../0086/reply_0000.md | 340 +++++
 .../0086/reply_0001.md | 148 +++
 .../0086/reply_0002.md | 1130 +++++++++++++++++
 .../alice_engineering_comms/0087/index.md | 4 +
 .../0087/reply_0000.md | 254 ++++
 .../alice_engineering_comms/0088/index.md | 1 +
 .../0088/reply_0000.md | 136 ++
 .../0088/reply_0001.md | 22 +
 .../alice_engineering_comms/0089/index.md | 1 +
 .../0089/reply_0000.md | 384 ++++++
 .../alice_engineering_comms/0090/index.md | 1 +
 .../0090/reply_0000.md | 274 ++++
 .../0090/reply_0001.md | 19 +
 .../alice_engineering_comms/0091/index.md | 1 +
 .../0091/reply_0000.md | 5 +
 .../alice_engineering_comms/0092/index.md | 1 +
 .../alice_engineering_comms/0093/index.md | 1 +
 .../0093/reply_0000.md | 93 ++
 .../alice_engineering_comms/0094/index.md | 1 +
 .../0094/reply_0000.md | 91 ++
 .../alice_engineering_comms/0095/index.md | 1 +
 .../0095/reply_0000.md | 94 ++
 .../alice_engineering_comms/0096/index.md | 1 +
 .../alice_engineering_comms/0097/index.md | 1 +
 .../0097/reply_0000.md | 7 +
 .../alice_engineering_comms/0098/index.md | 1 +
 .../alice_engineering_comms/0099/index.md | 1 +
 .../0099/reply_0000.md | 24 +
 .../0099/reply_0001.md | 23 +
 .../alice_engineering_comms/0100/index.md | 5 +
 .../0100/reply_0000.md | 92 ++
 .../alice_engineering_comms/0101/index.md | 1 +
 .../0101/reply_0000.md | 176 +++
 .../alice_engineering_comms/0102/index.md | 1 +
 .../0102/reply_0000.md | 225 ++++
 .../0102/reply_0001.md | 1 +
 .../alice_engineering_comms/0103/index.md | 1 +
 .../0103/reply_0000.md | 6 +
 .../alice_engineering_comms/0104/index.md | 1 +
 .../0104/reply_0000.md | 16 +
 .../alice_engineering_comms/0105/index.md | 1 +
 .../0105/reply_0000.md | 7 +
 .../alice_engineering_comms/0106/index.md | 1 +
 .../alice_engineering_comms/0107/index.md | 1 +
 .../0107/reply_0000.md | 90 ++
 .../alice_engineering_comms/0108/index.md | 1 +
 .../0108/reply_0000.md | 21 +
 .../alice_engineering_comms/0109/index.md | 1 +
 .../0109/reply_0000.md | 55 +
 .../alice_engineering_comms/0110/index.md | 5 +
 .../0110/reply_0000.md | 39 +
 .../alice_engineering_comms/0111/index.md | 1 +
 .../0111/reply_0000.md | 43 +
 .../alice_engineering_comms/0112/index.md | 1 +
 .../0112/reply_0000.md | 23 +
 .../alice_engineering_comms/0113/index.md | 1 +
 .../0113/reply_0000.md | 2 +
 .../alice_engineering_comms/0114/index.md | 1 +
 .../alice_engineering_comms/0115/index.md | 1 +
 .../alice_engineering_comms/0116/index.md | 1 +
 .../alice_engineering_comms/0117/index.md | 1 +
 .../0117/reply_0000.md | 7 +
 .../alice_engineering_comms/0118/index.md | 1 +
 .../0118/reply_0000.md | 5 +
 .../alice_engineering_comms/0119/index.md | 1 +
 .../0119/reply_0000.md | 18 +
 .../alice_engineering_comms/0120/index.md | 1 +
 .../0120/reply_0000.md | 22 +
 .../alice_engineering_comms/0121/index.md | 1 +
 .../0121/reply_0000.md | 42 +
 .../alice_engineering_comms/0122/index.md | 1 +
 .../0122/reply_0000.md | 11 +
 .../alice_engineering_comms/0123/index.md | 7 +
 .../0123/reply_0000.md | 26 +
 .../alice_engineering_comms/0124/index.md | 1 +
 .../0124/reply_0000.md | 6 +
 .../alice_engineering_comms/0125/index.md | 1 +
 .../0125/reply_0000.md | 2 +
 .../alice_engineering_comms/0126/index.md | 1 +
 .../alice_engineering_comms/0127/index.md | 1 +
 .../0127/reply_0000.md | 17 +
 .../alice_engineering_comms/0128/index.md | 1 +
 .../0128/reply_0000.md | 4 +
 .../alice_engineering_comms/0129/index.md | 1 +
 .../0129/reply_0000.md | 3 +
 .../alice_engineering_comms/0130/index.md | 1 +
 .../0130/reply_0000.md | 2 +
 .../alice_engineering_comms/0131/index.md | 1 +
 .../0131/reply_0000.md | 4 +
 .../alice_engineering_comms/0132/index.md | 1 +
 .../alice_engineering_comms/0133/index.md | 1 +
 .../0133/reply_0000.md | 2 +
 .../alice_engineering_comms/0134/index.md | 1 +
 .../0134/reply_0000.md | 4 +
 .../alice_engineering_comms/0135/index.md | 1 +
 .../alice_engineering_comms/0136/index.md | 1 +
 .../0136/reply_0000.md | 22 +
 .../alice_engineering_comms/0137/index.md | 1 +
 .../0137/reply_0000.md | 28 +
 .../alice_engineering_comms/0138/index.md | 1 +
 .../0138/reply_0000.md | 22 +
 .../0138/reply_0001.md | 20 +
 .../0138/reply_0002.md | 13 +
 .../alice_engineering_comms/0139/index.md | 1 +
 .../0139/reply_0000.md | 17 +
 .../alice_engineering_comms/0140/index.md | 1 +
 .../0140/reply_0000.md | 29 +
 .../alice_engineering_comms/0141/index.md | 1 +
 .../0141/reply_0000.md | 37 +
 .../alice_engineering_comms/0142/index.md | 1 +
 .../0142/reply_0000.md | 7 +
 .../alice_engineering_comms/0143/index.md | 3 +
 .../0143/reply_0000.md | 21 +
 .../alice_engineering_comms/0144/index.md | 4 +
 .../0144/reply_0000.md | 16 +
 .../0144/reply_0001.md | 1 +
 .../alice_engineering_comms/0145/index.md | 1 +
 .../0145/reply_0000.md | 75 ++
 .../alice_engineering_comms/0146/index.md | 12 +
 .../0146/reply_0000.md | 29 +
 .../alice_engineering_comms/0147/index.md | 1 +
 .../0147/reply_0000.md | 21 +
 .../alice_engineering_comms/0148/index.md | 1 +
 .../0148/reply_0000.md | 2 +
 .../alice_engineering_comms/0149/index.md | 1 +
 .../0149/reply_0000.md | 3 +
 .../alice_engineering_comms/0150/index.md | 4 +
 .../0150/reply_0000.md | 49 +
 .../alice_engineering_comms/0151/index.md | 1 +
 .../0151/reply_0000.md | 180 +++
 .../alice_engineering_comms/0152/index.md | 1 +
 .../0152/reply_0000.md | 155 +++
 .../alice_engineering_comms/0153/index.md | 3 +
 .../0153/reply_0000.md | 51 +
 .../alice_engineering_comms/0154/index.md | 1 +
 .../0154/reply_0000.md | 68 +
 .../alice_engineering_comms/0155/index.md | 1 +
 .../0155/reply_0000.md | 48 +
 .../alice_engineering_comms/0156/index.md | 3 +
 .../0156/reply_0000.md | 60 +
 .../0156/reply_0001.md | 21 +
 .../alice_engineering_comms/0157/index.md | 1 +
 .../0157/reply_0000.md | 58 +
 .../alice_engineering_comms/0158/index.md | 1 +
 .../0158/reply_0000.md | 23 +
 .../0158/reply_0001.md | 23 +
 .../alice_engineering_comms/0159/index.md | 1 +
 .../0159/reply_0000.md | 528 ++++++++
 .../alice_engineering_comms/0160/index.md | 1 +
 .../0160/reply_0000.md | 803 ++++++++++++
 .../alice_engineering_comms/0161/index.md | 3 +
 .../0161/reply_0000.md | 17 +
 .../alice_engineering_comms/0162/index.md | 1 +
 .../0162/reply_0000.md | 10 +
 .../alice_engineering_comms/0163/index.md | 1 +
 .../0163/reply_0000.md | 540 ++++++++
 .../0163/reply_0001.md | 7 +
 .../alice_engineering_comms/0164/index.md | 5 +
 .../0164/reply_0000.md | 44 +
 .../alice_engineering_comms/0165/index.md | 1 +
 .../0165/reply_0000.md | 31 +
 .../alice_engineering_comms/0166/index.md | 1 +
 .../0166/reply_0000.md | 31 +
 .../0166/reply_0001.md | 58 +
 .../alice_engineering_comms/0167/index.md | 1 +
 .../0167/reply_0000.md | 594 +++++++++
 .../alice_engineering_comms/0168/index.md | 1 +
 .../0168/reply_0000.md | 21 +
 .../alice_engineering_comms/0169/index.md | 1 +
 .../alice_engineering_comms/0170/index.md | 1 +
 .../0170/reply_0000.md | 89 ++
 .../0170/reply_0001.md | 3 +
 .../0170/reply_0002.md | 51 +
 .../alice_engineering_comms/0171/index.md | 1 +
 .../0171/reply_0000.md | 17 +
 .../alice_engineering_comms/0172/index.md | 6 +
 .../0172/reply_0000.md | 46 +
 .../alice_engineering_comms/0173/index.md | 1 +
 .../0173/reply_0000.md | 305 +++++
 .../alice_engineering_comms/0174/index.md | 1 +
 .../0174/reply_0000.md | 67 +
 .../0174/reply_0001.md | 1 +
 .../alice_engineering_comms/0175/index.md | 1 +
 .../alice_engineering_comms/0176/index.md | 1 +
 .../alice_engineering_comms/0177/index.md | 6 +
 .../0177/reply_0000.md | 286 +++++
 .../0177/reply_0001.md | 43 +
 .../alice_engineering_comms/0178/index.md | 1 +
 .../0178/reply_0000.md | 2 +
 .../alice_engineering_comms/0179/index.md | 1 +
 .../0179/reply_0000.md | 114 ++
 .../0179/reply_0001.md | 54 +
 .../alice_engineering_comms/0180/index.md | 1 +
 .../0180/reply_0000.md | 5 +
 .../alice_engineering_comms/0181/index.md | 1 +
 .../0181/reply_0000.md | 86 ++
 .../alice_engineering_comms/0182/index.md | 1 +
 .../0182/reply_0000.md | 10 +
 .../alice_engineering_comms/0183/index.md | 1 +
 .../0183/reply_0000.md | 1 +
 .../alice_engineering_comms/0184/index.md | 1 +
 .../alice_engineering_comms/0185/index.md | 1 +
 .../alice_engineering_comms/0186/index.md | 5 +
 .../0186/reply_0000.md | 24 +
 .../0186/reply_0001.md | 39 +
 .../alice_engineering_comms/0187/index.md | 1 +
 .../0187/reply_0000.md | 37 +
 .../alice_engineering_comms/0188/index.md | 1 +
 .../0188/reply_0000.md | 62 +
 .../alice_engineering_comms/0189/index.md | 1 +
 .../0189/reply_0000.md | 226 ++++
 .../alice_engineering_comms/0190/index.md | 1 +
 .../0190/reply_0000.md | 8 +
 .../alice_engineering_comms/0191/index.md | 1 +
 .../0191/reply_0000.md | 73 ++
 .../0191/reply_0001.md | 45 +
 .../alice_engineering_comms/0192/index.md | 1 +
 .../0192/reply_0000.md | 57 +
 .../0192/reply_0001.md | 55 +
 .../alice_engineering_comms/0193/index.md | 10 +
 .../0193/reply_0000.md | 29 +
 .../0193/reply_0001.md | 27 +
 .../alice_engineering_comms/0194/index.md | 11 +
 .../0194/reply_0000.md | 311 +++++
 .../0194/reply_0001.md | 54 +
 .../alice_engineering_comms/0195/index.md | 1 +
 .../0195/reply_0000.md | 139 ++
 .../alice_engineering_comms/0196/index.md | 1 +
 .../alice_engineering_comms/0197/index.md | 1 +
 .../0197/reply_0000.md | 116 ++
 .../alice_engineering_comms/0198/index.md | 1 +
 .../0198/reply_0000.md | 63 +
 .../alice_engineering_comms/0199/index.md | 1 +
 .../0199/reply_0000.md | 172 +++
 .../alice_engineering_comms/0200/index.md | 1 +
 .../0200/reply_0000.md | 7 +
 .../alice_engineering_comms/0201/index.md | 1 +
 .../0201/reply_0000.md | 12 +
 .../alice_engineering_comms/0202/index.md | 1 +
 .../0202/reply_0000.md | 18 +
 .../alice_engineering_comms/0203/index.md | 1 +
 .../0203/reply_0000.md | 6 +
 .../alice_engineering_comms/0204/index.md | 1 +
 .../alice_engineering_comms/0205/index.md | 1 +
 .../0205/reply_0000.md | 1 +
 .../alice_engineering_comms/0206/index.md | 1 +
 .../0206/reply_0000.md | 2 +
 .../alice_engineering_comms/0207/index.md | 25 +
 .../0207/reply_0000.md | 29 +
 .../alice_engineering_comms/0208/index.md | 1 +
 .../0208/reply_0000.md | 40 +
 .../alice_engineering_comms/0209/index.md | 1 +
 .../0209/reply_0000.md | 26 +
 .../0209/reply_0001.md | 7 +
 .../alice_engineering_comms/0210/index.md | 1 +
 .../0210/reply_0000.md | 3 +
 .../alice_engineering_comms/0211/index.md | 1 +
 .../0211/reply_0000.md | 2 +
 .../alice_engineering_comms/0212/index.md | 1 +
 .../alice_engineering_comms/0213/index.md | 1 +
 .../0213/reply_0000.md | 52 +
 .../alice_engineering_comms/0214/index.md | 9 +
 .../0214/reply_0000.md | 32 +
 .../0214/reply_0001.md | 90 ++
 .../0214/reply_0002.md | 62 +
 .../alice_engineering_comms/0215/index.md | 3 +
 .../0215/reply_0000.md | 52 +
 .../0215/reply_0001.md | 4 +
 .../0215/reply_0002.md | 78 ++
 .../alice_engineering_comms/0216/index.md | 1 +
 .../0216/reply_0000.md | 27 +
 .../alice_engineering_comms/0217/index.md | 1 +
 .../0217/reply_0000.md | 470 +++++++
 .../alice_engineering_comms/0218/index.md | 1 +
 .../0218/reply_0000.md | 93 ++
 .../alice_engineering_comms/0219/index.md | 1 +
 .../0219/reply_0000.md | 21 +
 .../alice_engineering_comms/0220/index.md | 1 +
 .../0220/reply_0000.md | 304 +++++
 .../0220/reply_0001.md | 7 +
 .../alice_engineering_comms/0221/index.md | 1 +
 .../0221/reply_0000.md | 48 +
 .../alice_engineering_comms/0222/index.md | 1 +
 .../0222/reply_0000.md | 372 ++++++
 .../alice_engineering_comms/0223/index.md | 1 +
 .../0223/reply_0000.md | 388 ++++++
 .../alice_engineering_comms/0224/index.md | 1 +
 .../0224/reply_0000.md | 217 ++++
 .../alice_engineering_comms/0225/index.md | 1 +
 .../0225/reply_0000.md | 76 ++
 .../alice_engineering_comms/0226/index.md | 1 +
 .../0226/reply_0000.md | 10 +
 .../alice_engineering_comms/0227/index.md | 1 +
 .../0227/reply_0000.md | 32 +
 .../alice_engineering_comms/0228/index.md | 1 +
 .../0228/reply_0000.md | 6 +
 .../alice_engineering_comms/0229/index.md | 1 +
 .../0229/reply_0000.md | 2 +
 .../alice_engineering_comms/0230/index.md | 1 +
 .../0230/reply_0000.md | 7 +
 .../alice_engineering_comms/0231/index.md | 3 +
 .../0231/reply_0000.md | 18 +
 .../alice_engineering_comms/0232/index.md | 1 +
 .../alice_engineering_comms/0233/index.md | 1 +
 .../alice_engineering_comms/0234/index.md | 1 +
 .../alice_engineering_comms/0235/index.md | 1 +
 .../alice_engineering_comms/0236/index.md | 1 +
 .../0236/reply_0000.md | 5 +
 .../alice_engineering_comms/0237/index.md | 1 +
 .../alice_engineering_comms/0238/index.md | 1 +
 .../0238/reply_0000.md | 1 +
 .../alice_engineering_comms/0239/index.md | 1 +
 .../0239/reply_0000.md | 2 +
 .../alice_engineering_comms/0240/index.md | 1 +
 .../0240/reply_0000.md | 2 +
 .../alice_engineering_comms/0241/index.md | 1 +
 .../0241/reply_0000.md | 1 +
 .../alice_engineering_comms/0242/index.md | 1 +
 .../0242/reply_0000.md | 21 +
 .../alice_engineering_comms/0243/index.md | 1 +
 .../0243/reply_0000.md | 247 ++++
 .../alice_engineering_comms/0244/index.md | 1 +
 .../0244/reply_0000.md | 26 +
 .../alice_engineering_comms/0245/index.md | 1 +
 .../0245/reply_0000.md | 28 +
 .../alice_engineering_comms/0246/index.md | 1 +
 .../0246/reply_0000.md | 4 +
 .../alice_engineering_comms/0247/index.md | 1 +
 .../0247/reply_0000.md | 10 +
 .../alice_engineering_comms/0248/index.md | 1 +
 .../0248/reply_0000.md | 3 +
 .../alice_engineering_comms/0249/index.md | 1 +
 .../0249/reply_0000.md | 6 +
 .../alice_engineering_comms/0250/index.md | 1 +
 .../0250/reply_0000.md | 42 +
 .../alice_engineering_comms/0251/index.md | 1 +
 .../0251/reply_0000.md | 19 +
 .../alice_engineering_comms/0252/index.md | 1 +
 .../0252/reply_0000.md | 86 ++
 .../alice_engineering_comms/0253/index.md | 1 +
 .../alice_engineering_comms/0254/index.md | 1 +
 .../alice_engineering_comms/0255/index.md | 1 +
 .../0255/reply_0000.md | 20 +
 .../alice_engineering_comms/0256/index.md | 1 +
 .../0256/reply_0000.md | 27 +
 .../alice_engineering_comms/0257/index.md | 1 +
 .../0257/reply_0000.md | 15 +
 .../alice_engineering_comms/0258/index.md | 1 +
 .../0258/reply_0000.md | 27 +
 .../alice_engineering_comms/0259/index.md | 1 +
 .../0259/reply_0000.md | 15 +
 .../alice_engineering_comms/0260/index.md | 1 +
 .../alice_engineering_comms/0261/index.md | 1 +
 .../alice_engineering_comms/0262/index.md | 1 +
 .../0262/reply_0000.md | 3 +
 .../alice_engineering_comms/0263/index.md | 1 +
 .../0263/reply_0000.md | 37 +
 .../alice_engineering_comms/0264/index.md | 1 +
 .../0264/reply_0000.md | 2 +
 .../alice_engineering_comms/0265/index.md | 1 +
 .../0265/reply_0000.md | 2 +
 .../alice_engineering_comms/0266/index.md | 1 +
 .../0266/reply_0000.md | 1 +
 .../alice_engineering_comms/0267/index.md | 1 +
 .../alice_engineering_comms/0268/index.md | 3 +
 .../0268/reply_0000.md | 3 +
 .../alice_engineering_comms/0269/index.md | 1 +
 .../0269/reply_0000.md | 9 +
 .../alice_engineering_comms/0270/index.md | 1 +
 .../0270/reply_0000.md | 57 +
 .../0270/reply_0001.md | 10 +
 .../0270/reply_0002.md | 46 +
 .../alice_engineering_comms/0271/index.md | 1 +
 .../0271/reply_0000.md | 3 +
 .../alice_engineering_comms/0272/index.md | 1 +
 .../0272/reply_0000.md | 9 +
 .../alice_engineering_comms/0273/index.md | 1 +
 .../alice_engineering_comms/0274/index.md | 1 +
 .../alice_engineering_comms/0275/index.md | 1 +
 .../alice_engineering_comms/0276/index.md | 1 +
 .../alice_engineering_comms/0277/index.md | 1 +
 .../alice_engineering_comms/0278/index.md | 1 +
 .../alice_engineering_comms/0279/index.md | 1 +
 .../0279/reply_0000.md | 31 +
 .../alice_engineering_comms/0280/index.md | 1 +
 .../0280/reply_0000.md | 13 +
 .../alice_engineering_comms/0281/index.md | 1 +
 .../alice_engineering_comms/0282/index.md | 1 +
 .../alice_engineering_comms/0283/index.md | 1 +
 .../0283/reply_0000.md | 2 +
 .../alice_engineering_comms/0284/index.md | 1 +
 .../0284/reply_0000.md | 12 +
 .../alice_engineering_comms/0285/index.md | 1 +
 .../alice_engineering_comms/0286/index.md | 1 +
 .../alice_engineering_comms/0287/index.md | 1 +
 .../alice_engineering_comms/0288/index.md | 1 +
 .../alice_engineering_comms/0289/index.md | 1 +
 .../0289/reply_0000.md | 11 +
 .../alice_engineering_comms/0290/index.md | 1 +
 .../0290/reply_0000.md | 16 +
 .../alice_engineering_comms/0291/index.md | 1 +
 .../0291/reply_0000.md | 6 +
 .../alice_engineering_comms/0292/index.md | 1 +
 .../0292/reply_0000.md | 4 +
 .../alice_engineering_comms/0293/index.md | 1 +
 .../alice_engineering_comms/0294/index.md | 1 +
 .../alice_engineering_comms/0295/index.md | 1 +
 .../alice_engineering_comms/0296/index.md | 1 +
 .../alice_engineering_comms/0297/index.md | 1 +
 .../0297/reply_0000.md | 13 +
 .../alice_engineering_comms/0298/index.md | 1 +
 .../0298/reply_0000.md | 12 +
 .../alice_engineering_comms/0299/index.md | 1 +
 .../0299/reply_0000.md | 38 +
 .../0299/reply_0001.md | 6 +
 .../alice_engineering_comms/0300/index.md | 1 +
 .../0300/reply_0000.md | 13 +
 .../alice_engineering_comms/0301/index.md | 1 +
 .../alice_engineering_comms/0302/index.md | 1 +
 .../alice_engineering_comms/0303/index.md | 1 +
 .../alice_engineering_comms/0304/index.md | 1 +
 .../alice_engineering_comms/0305/index.md | 1 +
 .../alice_engineering_comms/0306/index.md | 1 +
 .../alice_engineering_comms/0307/index.md | 1 +
 .../alice_engineering_comms/0308/index.md | 1 +
 .../alice_engineering_comms/0309/index.md | 1 +
 .../alice_engineering_comms/0310/index.md | 1 +
 .../alice_engineering_comms/0311/index.md | 1 +
 .../alice_engineering_comms/0312/index.md | 1 +
 .../alice_engineering_comms/0313/index.md | 1 +
 .../0313/reply_0000.md | 3 +
 .../alice_engineering_comms/0314/index.md | 1 +
 .../0314/reply_0000.md | 73 ++
 .../alice_engineering_comms/0315/index.md | 1 +
 .../alice_engineering_comms/0316/index.md | 1 +
 .../alice_engineering_comms/0317/index.md | 1 +
 .../alice_engineering_comms/0318/index.md | 1 +
 .../alice_engineering_comms/0319/index.md | 1 +
 .../0319/reply_0000.md | 5 +
 .../alice_engineering_comms/0320/index.md | 1 +
 .../alice_engineering_comms/0321/index.md | 1 +
 .../alice_engineering_comms/0322/index.md | 1 +
 .../0322/reply_0000.md | 2 +
 .../alice_engineering_comms/0323/index.md | 1 +
 .../alice_engineering_comms/0324/index.md | 1 +
 .../alice_engineering_comms/0325/index.md | 1 +
 .../alice_engineering_comms/0326/index.md | 1 +
 .../alice_engineering_comms/0327/index.md | 1 +
 .../alice_engineering_comms/0328/index.md | 1 +
 .../0328/reply_0000.md | 2 +
 .../alice_engineering_comms/0329/index.md | 1 +
 .../alice_engineering_comms/0330/index.md | 1 +
 .../alice_engineering_comms/0331/index.md | 1 +
 .../alice_engineering_comms/0332/index.md | 1 +
 .../alice_engineering_comms/0333/index.md | 1 +
 .../alice_engineering_comms/0334/index.md | 1 +
 .../alice_engineering_comms/0335/index.md | 1 +
 .../alice_engineering_comms/0336/index.md | 1 +
 .../alice_engineering_comms/0337/index.md | 1 +
 .../alice_engineering_comms/0338/index.md | 1 +
 .../alice_engineering_comms/0339/index.md | 1 +
 .../alice_engineering_comms/0340/index.md | 1 +
 .../alice_engineering_comms/0341/index.md | 1 +
 .../0341/reply_0000.md | 1 +
 .../alice_engineering_comms/0342/index.md | 1 +
 .../alice_engineering_comms/0343/index.md | 1 +
 .../alice_engineering_comms/0344/index.md | 1 +
 .../alice_engineering_comms/0345/index.md | 1 +
 .../alice_engineering_comms/0346/index.md | 1 +
 .../0346/reply_0000.md | 1 +
 .../alice_engineering_comms/0347/index.md | 1 +
 .../0347/reply_0000.md | 127 ++
 .../alice_engineering_comms/0348/index.md | 1 +
 .../0348/reply_0000.md | 60 +
 .../alice_engineering_comms/0349/index.md | 1 +
 .../0349/reply_0000.md | 5 +
 .../alice_engineering_comms/0350/index.md | 1 +
 .../alice_engineering_comms/0351/index.md | 1 +
 .../0351/reply_0000.md | 1 +
 .../alice_engineering_comms/0352/index.md | 1 +
 .../alice_engineering_comms/0353/index.md | 1 +
 .../alice_engineering_comms/0354/index.md | 1 +
 .../alice_engineering_comms/0355/index.md | 1 +
 .../alice_engineering_comms/0356/index.md | 1 +
 .../alice_engineering_comms/0357/index.md | 1 +
 .../alice_engineering_comms/0358/index.md | 1 +
 .../alice_engineering_comms/0359/index.md | 1 +
 .../alice_engineering_comms/0360/index.md | 1 +
 .../alice_engineering_comms/0361/index.md | 1 +
 .../alice_engineering_comms/0362/index.md | 1 +
 .../alice_engineering_comms/0363/index.md | 1 +
 .../alice_engineering_comms/0364/index.md | 1 +
 .../alice_engineering_comms/0365/index.md | 1 +
 .../alice_engineering_comms/0366/index.md | 1 +
 .../alice_engineering_comms/0367/index.md | 1 +
 .../alice_engineering_comms/0368/index.md | 1 +
 .../alice_engineering_comms/0369/index.md | 1 +
 .../alice_engineering_comms/0370/index.md | 1 +
 .../0370/reply_0000.md | 4 +
 .../alice_engineering_comms/0371/index.md | 1 +
 .../alice_engineering_comms/0372/index.md | 1 +
 .../alice_engineering_comms/0373/index.md | 1 +
 .../alice_engineering_comms/0374/index.md | 1 +
 .../0374/reply_0000.md | 1 +
 .../alice_engineering_comms/0375/index.md | 1 +
 .../alice_engineering_comms/0376/index.md | 1 +
 .../alice_engineering_comms/0377/index.md | 1 +
 .../alice_engineering_comms/0378/index.md | 1 +
 .../alice_engineering_comms/0379/index.md | 1 +
 .../alice_engineering_comms/0380/index.md | 1 +
 .../alice_engineering_comms/0381/index.md | 1 +
 .../0381/reply_0000.md | 128 ++
 .../alice_engineering_comms/index.md | 104 ++
 720 files changed, 26165 insertions(+)

 create mode 100644 docs/discussions/alice_engineering_comms/0000/index.md
 create mode 100644 docs/discussions/alice_engineering_comms/0000/reply_0000.md
 create mode 100644 docs/discussions/alice_engineering_comms/0001/index.md
 create mode 100644 docs/discussions/alice_engineering_comms/0001/reply_0000.md
 create mode 100644 docs/discussions/alice_engineering_comms/0002/index.md
 create mode 100644 docs/discussions/alice_engineering_comms/0002/reply_0000.md
 create mode 100644 docs/discussions/alice_engineering_comms/0002/reply_0001.md
 create mode 100644 docs/discussions/alice_engineering_comms/0003/index.md
 create mode 100644 docs/discussions/alice_engineering_comms/0004/index.md
 create mode 100644 docs/discussions/alice_engineering_comms/0005/index.md
 create mode 100644 docs/discussions/alice_engineering_comms/0006/index.md
 create mode 100644 docs/discussions/alice_engineering_comms/0006/reply_0000.md
 create mode 100644 docs/discussions/alice_engineering_comms/0006/reply_0001.md
 create mode 100644 docs/discussions/alice_engineering_comms/0007/index.md
 create mode 100644 docs/discussions/alice_engineering_comms/0007/reply_0000.md
 create mode 100644 docs/discussions/alice_engineering_comms/0008/index.md
 create mode 100644 docs/discussions/alice_engineering_comms/0008/reply_0000.md
 create mode 100644 docs/discussions/alice_engineering_comms/0009/index.md
 create mode 100644 docs/discussions/alice_engineering_comms/0009/reply_0000.md
 create mode 100644 docs/discussions/alice_engineering_comms/0010/index.md
 create mode 100644 docs/discussions/alice_engineering_comms/0010/reply_0000.md
 create mode 100644 docs/discussions/alice_engineering_comms/0011/index.md
 create mode 100644 docs/discussions/alice_engineering_comms/0011/reply_0000.md
 create mode 100644 docs/discussions/alice_engineering_comms/0012/index.md
 create mode 100644 docs/discussions/alice_engineering_comms/0012/reply_0000.md
 create mode 100644 docs/discussions/alice_engineering_comms/0012/reply_0001.md
 create mode 100644 docs/discussions/alice_engineering_comms/0013/index.md
 create mode 100644 docs/discussions/alice_engineering_comms/0013/reply_0000.md
 create mode 100644 docs/discussions/alice_engineering_comms/0014/index.md
 create mode 100644 docs/discussions/alice_engineering_comms/0014/reply_0000.md
 create mode 100644 docs/discussions/alice_engineering_comms/0014/reply_0001.md
 create mode 100644 docs/discussions/alice_engineering_comms/0014/reply_0002.md
 create mode 100644 docs/discussions/alice_engineering_comms/0015/index.md
 create mode 100644 docs/discussions/alice_engineering_comms/0015/reply_0000.md
 create mode 100644 docs/discussions/alice_engineering_comms/0016/index.md
 create mode 100644 docs/discussions/alice_engineering_comms/0016/reply_0000.md
 create mode 100644 docs/discussions/alice_engineering_comms/0016/reply_0001.md
 create mode 100644 docs/discussions/alice_engineering_comms/0017/index.md
 create mode 100644 docs/discussions/alice_engineering_comms/0017/reply_0000.md
 create mode 100644 docs/discussions/alice_engineering_comms/0017/reply_0001.md
 create mode 100644 docs/discussions/alice_engineering_comms/0018/index.md
 create mode 100644 docs/discussions/alice_engineering_comms/0019/index.md
 create mode 100644 docs/discussions/alice_engineering_comms/0019/reply_0000.md
 create mode 100644 docs/discussions/alice_engineering_comms/0019/reply_0001.md
 create mode 100644 docs/discussions/alice_engineering_comms/0019/reply_0002.md
 create mode 100644 docs/discussions/alice_engineering_comms/0020/index.md
 create mode 100644 docs/discussions/alice_engineering_comms/0020/reply_0000.md
 create mode 100644 docs/discussions/alice_engineering_comms/0020/reply_0001.md
 create mode 100644 docs/discussions/alice_engineering_comms/0021/index.md
 create mode 100644 docs/discussions/alice_engineering_comms/0021/reply_0000.md
 create mode 100644 docs/discussions/alice_engineering_comms/0022/index.md
 create mode 100644 docs/discussions/alice_engineering_comms/0022/reply_0000.md
 create mode 100644 docs/discussions/alice_engineering_comms/0022/reply_0001.md
 create mode 100644 docs/discussions/alice_engineering_comms/0022/reply_0002.md
 create mode 100644 docs/discussions/alice_engineering_comms/0023/index.md
 create mode 100644 docs/discussions/alice_engineering_comms/0023/reply_0000.md
 create mode 100644 docs/discussions/alice_engineering_comms/0024/index.md
 create mode 100644 docs/discussions/alice_engineering_comms/0024/reply_0000.md
 create mode 100644 docs/discussions/alice_engineering_comms/0025/index.md
 create mode 100644 docs/discussions/alice_engineering_comms/0025/reply_0000.md
 create mode 100644 docs/discussions/alice_engineering_comms/0026/index.md
 create mode 100644 docs/discussions/alice_engineering_comms/0026/reply_0000.md
 create mode 100644 docs/discussions/alice_engineering_comms/0026/reply_0001.md
 create mode 100644 docs/discussions/alice_engineering_comms/0027/index.md
 create mode 100644 docs/discussions/alice_engineering_comms/0027/reply_0000.md
 create mode 100644 docs/discussions/alice_engineering_comms/0027/reply_0001.md
 create mode 100644 docs/discussions/alice_engineering_comms/0028/index.md
 create mode 100644 docs/discussions/alice_engineering_comms/0029/index.md
 create mode 100644 docs/discussions/alice_engineering_comms/0029/reply_0000.md
 create mode 100644 docs/discussions/alice_engineering_comms/0030/index.md
 create mode 100644 docs/discussions/alice_engineering_comms/0030/reply_0000.md
 create mode 100644 docs/discussions/alice_engineering_comms/0031/index.md
 create mode 100644 docs/discussions/alice_engineering_comms/0031/reply_0000.md
 create mode 100644 docs/discussions/alice_engineering_comms/0031/reply_0001.md
 create mode 100644 docs/discussions/alice_engineering_comms/0032/index.md
 create mode 100644 docs/discussions/alice_engineering_comms/0032/reply_0000.md
 create mode 100644 docs/discussions/alice_engineering_comms/0033/index.md
 create mode 100644 docs/discussions/alice_engineering_comms/0033/reply_0000.md
 create mode 100644 docs/discussions/alice_engineering_comms/0034/index.md
 create mode 100644 docs/discussions/alice_engineering_comms/0034/reply_0000.md
 create mode 100644 docs/discussions/alice_engineering_comms/0034/reply_0001.md
 create mode 100644 docs/discussions/alice_engineering_comms/0034/reply_0002.md
 create mode 100644 docs/discussions/alice_engineering_comms/0034/reply_0003.md
 create mode 100644 docs/discussions/alice_engineering_comms/0034/reply_0004.md
 create mode 100644 docs/discussions/alice_engineering_comms/0035/index.md
 create mode 100644 docs/discussions/alice_engineering_comms/0035/reply_0000.md
 create mode 100644 docs/discussions/alice_engineering_comms/0035/reply_0001.md
 create mode 100644 docs/discussions/alice_engineering_comms/0036/index.md
 create mode 100644 docs/discussions/alice_engineering_comms/0036/reply_0000.md
 create mode 100644 docs/discussions/alice_engineering_comms/0037/index.md
 create mode 100644 docs/discussions/alice_engineering_comms/0037/reply_0000.md
 create mode 100644 docs/discussions/alice_engineering_comms/0038/index.md
 create mode 100644 docs/discussions/alice_engineering_comms/0038/reply_0000.md
 create mode 100644 docs/discussions/alice_engineering_comms/0038/reply_0001.md
 create mode 100644 docs/discussions/alice_engineering_comms/0039/index.md
 create mode 100644 docs/discussions/alice_engineering_comms/0039/reply_0000.md
 create mode 100644 docs/discussions/alice_engineering_comms/0039/reply_0001.md
 create mode 100644 docs/discussions/alice_engineering_comms/0040/index.md
 create mode 100644 docs/discussions/alice_engineering_comms/0040/reply_0000.md
 create mode 100644 docs/discussions/alice_engineering_comms/0040/reply_0001.md
 create mode 100644 docs/discussions/alice_engineering_comms/0041/index.md
 create mode 100644 docs/discussions/alice_engineering_comms/0041/reply_0000.md
 create mode 100644 docs/discussions/alice_engineering_comms/0041/reply_0001.md
 create mode 100644 docs/discussions/alice_engineering_comms/0042/index.md
 create mode 100644 docs/discussions/alice_engineering_comms/0042/reply_0000.md
 create mode 100644 docs/discussions/alice_engineering_comms/0042/reply_0001.md
 create mode 100644 docs/discussions/alice_engineering_comms/0043/index.md
 create mode 100644 docs/discussions/alice_engineering_comms/0043/reply_0000.md
 create mode 100644 docs/discussions/alice_engineering_comms/0044/index.md
 create mode 100644 docs/discussions/alice_engineering_comms/0044/reply_0000.md
 create mode 100644 docs/discussions/alice_engineering_comms/0045/index.md
 create mode 100644 docs/discussions/alice_engineering_comms/0046/index.md
 create mode 100644 docs/discussions/alice_engineering_comms/0046/reply_0000.md
 create mode 100644 
docs/discussions/alice_engineering_comms/0047/index.md create mode 100644 docs/discussions/alice_engineering_comms/0047/reply_0000.md create mode 100644 docs/discussions/alice_engineering_comms/0048/index.md create mode 100644 docs/discussions/alice_engineering_comms/0048/reply_0000.md create mode 100644 docs/discussions/alice_engineering_comms/0049/index.md create mode 100644 docs/discussions/alice_engineering_comms/0049/reply_0000.md create mode 100644 docs/discussions/alice_engineering_comms/0050/index.md create mode 100644 docs/discussions/alice_engineering_comms/0050/reply_0000.md create mode 100644 docs/discussions/alice_engineering_comms/0050/reply_0001.md create mode 100644 docs/discussions/alice_engineering_comms/0051/index.md create mode 100644 docs/discussions/alice_engineering_comms/0051/reply_0000.md create mode 100644 docs/discussions/alice_engineering_comms/0051/reply_0001.md create mode 100644 docs/discussions/alice_engineering_comms/0052/index.md create mode 100644 docs/discussions/alice_engineering_comms/0052/reply_0000.md create mode 100644 docs/discussions/alice_engineering_comms/0053/index.md create mode 100644 docs/discussions/alice_engineering_comms/0053/reply_0000.md create mode 100644 docs/discussions/alice_engineering_comms/0053/reply_0001.md create mode 100644 docs/discussions/alice_engineering_comms/0054/index.md create mode 100644 docs/discussions/alice_engineering_comms/0054/reply_0000.md create mode 100644 docs/discussions/alice_engineering_comms/0054/reply_0001.md create mode 100644 docs/discussions/alice_engineering_comms/0054/reply_0002.md create mode 100644 docs/discussions/alice_engineering_comms/0055/index.md create mode 100644 docs/discussions/alice_engineering_comms/0055/reply_0000.md create mode 100644 docs/discussions/alice_engineering_comms/0056/index.md create mode 100644 docs/discussions/alice_engineering_comms/0056/reply_0000.md create mode 100644 docs/discussions/alice_engineering_comms/0056/reply_0001.md create mode 
100644 docs/discussions/alice_engineering_comms/0057/index.md create mode 100644 docs/discussions/alice_engineering_comms/0057/reply_0000.md create mode 100644 docs/discussions/alice_engineering_comms/0058/index.md create mode 100644 docs/discussions/alice_engineering_comms/0058/reply_0000.md create mode 100644 docs/discussions/alice_engineering_comms/0059/index.md create mode 100644 docs/discussions/alice_engineering_comms/0059/reply_0000.md create mode 100644 docs/discussions/alice_engineering_comms/0060/index.md create mode 100644 docs/discussions/alice_engineering_comms/0060/reply_0000.md create mode 100644 docs/discussions/alice_engineering_comms/0061/index.md create mode 100644 docs/discussions/alice_engineering_comms/0061/reply_0000.md create mode 100644 docs/discussions/alice_engineering_comms/0061/reply_0001.md create mode 100644 docs/discussions/alice_engineering_comms/0062/index.md create mode 100644 docs/discussions/alice_engineering_comms/0062/reply_0000.md create mode 100644 docs/discussions/alice_engineering_comms/0063/index.md create mode 100644 docs/discussions/alice_engineering_comms/0063/reply_0000.md create mode 100644 docs/discussions/alice_engineering_comms/0064/index.md create mode 100644 docs/discussions/alice_engineering_comms/0064/reply_0000.md create mode 100644 docs/discussions/alice_engineering_comms/0065/index.md create mode 100644 docs/discussions/alice_engineering_comms/0065/reply_0000.md create mode 100644 docs/discussions/alice_engineering_comms/0065/reply_0001.md create mode 100644 docs/discussions/alice_engineering_comms/0066/index.md create mode 100644 docs/discussions/alice_engineering_comms/0066/reply_0000.md create mode 100644 docs/discussions/alice_engineering_comms/0066/reply_0001.md create mode 100644 docs/discussions/alice_engineering_comms/0067/index.md create mode 100644 docs/discussions/alice_engineering_comms/0067/reply_0000.md create mode 100644 docs/discussions/alice_engineering_comms/0067/reply_0001.md create mode 
100644 docs/discussions/alice_engineering_comms/0068/index.md create mode 100644 docs/discussions/alice_engineering_comms/0068/reply_0000.md create mode 100644 docs/discussions/alice_engineering_comms/0068/reply_0001.md create mode 100644 docs/discussions/alice_engineering_comms/0069/index.md create mode 100644 docs/discussions/alice_engineering_comms/0069/reply_0000.md create mode 100644 docs/discussions/alice_engineering_comms/0070/index.md create mode 100644 docs/discussions/alice_engineering_comms/0070/reply_0000.md create mode 100644 docs/discussions/alice_engineering_comms/0071/index.md create mode 100644 docs/discussions/alice_engineering_comms/0072/index.md create mode 100644 docs/discussions/alice_engineering_comms/0072/reply_0000.md create mode 100644 docs/discussions/alice_engineering_comms/0073/index.md create mode 100644 docs/discussions/alice_engineering_comms/0073/reply_0000.md create mode 100644 docs/discussions/alice_engineering_comms/0074/index.md create mode 100644 docs/discussions/alice_engineering_comms/0074/reply_0000.md create mode 100644 docs/discussions/alice_engineering_comms/0074/reply_0001.md create mode 100644 docs/discussions/alice_engineering_comms/0075/index.md create mode 100644 docs/discussions/alice_engineering_comms/0075/reply_0000.md create mode 100644 docs/discussions/alice_engineering_comms/0076/index.md create mode 100644 docs/discussions/alice_engineering_comms/0076/reply_0000.md create mode 100644 docs/discussions/alice_engineering_comms/0077/index.md create mode 100644 docs/discussions/alice_engineering_comms/0077/reply_0000.md create mode 100644 docs/discussions/alice_engineering_comms/0078/index.md create mode 100644 docs/discussions/alice_engineering_comms/0078/reply_0000.md create mode 100644 docs/discussions/alice_engineering_comms/0079/index.md create mode 100644 docs/discussions/alice_engineering_comms/0079/reply_0000.md create mode 100644 docs/discussions/alice_engineering_comms/0080/index.md create mode 100644 
docs/discussions/alice_engineering_comms/0080/reply_0000.md create mode 100644 docs/discussions/alice_engineering_comms/0080/reply_0001.md create mode 100644 docs/discussions/alice_engineering_comms/0081/index.md create mode 100644 docs/discussions/alice_engineering_comms/0081/reply_0000.md create mode 100644 docs/discussions/alice_engineering_comms/0082/index.md create mode 100644 docs/discussions/alice_engineering_comms/0082/reply_0000.md create mode 100644 docs/discussions/alice_engineering_comms/0082/reply_0001.md create mode 100644 docs/discussions/alice_engineering_comms/0082/reply_0002.md create mode 100644 docs/discussions/alice_engineering_comms/0083/index.md create mode 100644 docs/discussions/alice_engineering_comms/0083/reply_0000.md create mode 100644 docs/discussions/alice_engineering_comms/0084/index.md create mode 100644 docs/discussions/alice_engineering_comms/0084/reply_0000.md create mode 100644 docs/discussions/alice_engineering_comms/0084/reply_0001.md create mode 100644 docs/discussions/alice_engineering_comms/0085/index.md create mode 100644 docs/discussions/alice_engineering_comms/0085/reply_0000.md create mode 100644 docs/discussions/alice_engineering_comms/0086/index.md create mode 100644 docs/discussions/alice_engineering_comms/0086/reply_0000.md create mode 100644 docs/discussions/alice_engineering_comms/0086/reply_0001.md create mode 100644 docs/discussions/alice_engineering_comms/0086/reply_0002.md create mode 100644 docs/discussions/alice_engineering_comms/0087/index.md create mode 100644 docs/discussions/alice_engineering_comms/0087/reply_0000.md create mode 100644 docs/discussions/alice_engineering_comms/0088/index.md create mode 100644 docs/discussions/alice_engineering_comms/0088/reply_0000.md create mode 100644 docs/discussions/alice_engineering_comms/0088/reply_0001.md create mode 100644 docs/discussions/alice_engineering_comms/0089/index.md create mode 100644 docs/discussions/alice_engineering_comms/0089/reply_0000.md create 
mode 100644 docs/discussions/alice_engineering_comms/0090/index.md create mode 100644 docs/discussions/alice_engineering_comms/0090/reply_0000.md create mode 100644 docs/discussions/alice_engineering_comms/0090/reply_0001.md create mode 100644 docs/discussions/alice_engineering_comms/0091/index.md create mode 100644 docs/discussions/alice_engineering_comms/0091/reply_0000.md create mode 100644 docs/discussions/alice_engineering_comms/0092/index.md create mode 100644 docs/discussions/alice_engineering_comms/0093/index.md create mode 100644 docs/discussions/alice_engineering_comms/0093/reply_0000.md create mode 100644 docs/discussions/alice_engineering_comms/0094/index.md create mode 100644 docs/discussions/alice_engineering_comms/0094/reply_0000.md create mode 100644 docs/discussions/alice_engineering_comms/0095/index.md create mode 100644 docs/discussions/alice_engineering_comms/0095/reply_0000.md create mode 100644 docs/discussions/alice_engineering_comms/0096/index.md create mode 100644 docs/discussions/alice_engineering_comms/0097/index.md create mode 100644 docs/discussions/alice_engineering_comms/0097/reply_0000.md create mode 100644 docs/discussions/alice_engineering_comms/0098/index.md create mode 100644 docs/discussions/alice_engineering_comms/0099/index.md create mode 100644 docs/discussions/alice_engineering_comms/0099/reply_0000.md create mode 100644 docs/discussions/alice_engineering_comms/0099/reply_0001.md create mode 100644 docs/discussions/alice_engineering_comms/0100/index.md create mode 100644 docs/discussions/alice_engineering_comms/0100/reply_0000.md create mode 100644 docs/discussions/alice_engineering_comms/0101/index.md create mode 100644 docs/discussions/alice_engineering_comms/0101/reply_0000.md create mode 100644 docs/discussions/alice_engineering_comms/0102/index.md create mode 100644 docs/discussions/alice_engineering_comms/0102/reply_0000.md create mode 100644 docs/discussions/alice_engineering_comms/0102/reply_0001.md create mode 
100644 docs/discussions/alice_engineering_comms/0103/index.md create mode 100644 docs/discussions/alice_engineering_comms/0103/reply_0000.md create mode 100644 docs/discussions/alice_engineering_comms/0104/index.md create mode 100644 docs/discussions/alice_engineering_comms/0104/reply_0000.md create mode 100644 docs/discussions/alice_engineering_comms/0105/index.md create mode 100644 docs/discussions/alice_engineering_comms/0105/reply_0000.md create mode 100644 docs/discussions/alice_engineering_comms/0106/index.md create mode 100644 docs/discussions/alice_engineering_comms/0107/index.md create mode 100644 docs/discussions/alice_engineering_comms/0107/reply_0000.md create mode 100644 docs/discussions/alice_engineering_comms/0108/index.md create mode 100644 docs/discussions/alice_engineering_comms/0108/reply_0000.md create mode 100644 docs/discussions/alice_engineering_comms/0109/index.md create mode 100644 docs/discussions/alice_engineering_comms/0109/reply_0000.md create mode 100644 docs/discussions/alice_engineering_comms/0110/index.md create mode 100644 docs/discussions/alice_engineering_comms/0110/reply_0000.md create mode 100644 docs/discussions/alice_engineering_comms/0111/index.md create mode 100644 docs/discussions/alice_engineering_comms/0111/reply_0000.md create mode 100644 docs/discussions/alice_engineering_comms/0112/index.md create mode 100644 docs/discussions/alice_engineering_comms/0112/reply_0000.md create mode 100644 docs/discussions/alice_engineering_comms/0113/index.md create mode 100644 docs/discussions/alice_engineering_comms/0113/reply_0000.md create mode 100644 docs/discussions/alice_engineering_comms/0114/index.md create mode 100644 docs/discussions/alice_engineering_comms/0115/index.md create mode 100644 docs/discussions/alice_engineering_comms/0116/index.md create mode 100644 docs/discussions/alice_engineering_comms/0117/index.md create mode 100644 docs/discussions/alice_engineering_comms/0117/reply_0000.md create mode 100644 
docs/discussions/alice_engineering_comms/0118/index.md create mode 100644 docs/discussions/alice_engineering_comms/0118/reply_0000.md create mode 100644 docs/discussions/alice_engineering_comms/0119/index.md create mode 100644 docs/discussions/alice_engineering_comms/0119/reply_0000.md create mode 100644 docs/discussions/alice_engineering_comms/0120/index.md create mode 100644 docs/discussions/alice_engineering_comms/0120/reply_0000.md create mode 100644 docs/discussions/alice_engineering_comms/0121/index.md create mode 100644 docs/discussions/alice_engineering_comms/0121/reply_0000.md create mode 100644 docs/discussions/alice_engineering_comms/0122/index.md create mode 100644 docs/discussions/alice_engineering_comms/0122/reply_0000.md create mode 100644 docs/discussions/alice_engineering_comms/0123/index.md create mode 100644 docs/discussions/alice_engineering_comms/0123/reply_0000.md create mode 100644 docs/discussions/alice_engineering_comms/0124/index.md create mode 100644 docs/discussions/alice_engineering_comms/0124/reply_0000.md create mode 100644 docs/discussions/alice_engineering_comms/0125/index.md create mode 100644 docs/discussions/alice_engineering_comms/0125/reply_0000.md create mode 100644 docs/discussions/alice_engineering_comms/0126/index.md create mode 100644 docs/discussions/alice_engineering_comms/0127/index.md create mode 100644 docs/discussions/alice_engineering_comms/0127/reply_0000.md create mode 100644 docs/discussions/alice_engineering_comms/0128/index.md create mode 100644 docs/discussions/alice_engineering_comms/0128/reply_0000.md create mode 100644 docs/discussions/alice_engineering_comms/0129/index.md create mode 100644 docs/discussions/alice_engineering_comms/0129/reply_0000.md create mode 100644 docs/discussions/alice_engineering_comms/0130/index.md create mode 100644 docs/discussions/alice_engineering_comms/0130/reply_0000.md create mode 100644 docs/discussions/alice_engineering_comms/0131/index.md create mode 100644 
docs/discussions/alice_engineering_comms/0131/reply_0000.md create mode 100644 docs/discussions/alice_engineering_comms/0132/index.md create mode 100644 docs/discussions/alice_engineering_comms/0133/index.md create mode 100644 docs/discussions/alice_engineering_comms/0133/reply_0000.md create mode 100644 docs/discussions/alice_engineering_comms/0134/index.md create mode 100644 docs/discussions/alice_engineering_comms/0134/reply_0000.md create mode 100644 docs/discussions/alice_engineering_comms/0135/index.md create mode 100644 docs/discussions/alice_engineering_comms/0136/index.md create mode 100644 docs/discussions/alice_engineering_comms/0136/reply_0000.md create mode 100644 docs/discussions/alice_engineering_comms/0137/index.md create mode 100644 docs/discussions/alice_engineering_comms/0137/reply_0000.md create mode 100644 docs/discussions/alice_engineering_comms/0138/index.md create mode 100644 docs/discussions/alice_engineering_comms/0138/reply_0000.md create mode 100644 docs/discussions/alice_engineering_comms/0138/reply_0001.md create mode 100644 docs/discussions/alice_engineering_comms/0138/reply_0002.md create mode 100644 docs/discussions/alice_engineering_comms/0139/index.md create mode 100644 docs/discussions/alice_engineering_comms/0139/reply_0000.md create mode 100644 docs/discussions/alice_engineering_comms/0140/index.md create mode 100644 docs/discussions/alice_engineering_comms/0140/reply_0000.md create mode 100644 docs/discussions/alice_engineering_comms/0141/index.md create mode 100644 docs/discussions/alice_engineering_comms/0141/reply_0000.md create mode 100644 docs/discussions/alice_engineering_comms/0142/index.md create mode 100644 docs/discussions/alice_engineering_comms/0142/reply_0000.md create mode 100644 docs/discussions/alice_engineering_comms/0143/index.md create mode 100644 docs/discussions/alice_engineering_comms/0143/reply_0000.md create mode 100644 docs/discussions/alice_engineering_comms/0144/index.md create mode 100644 
docs/discussions/alice_engineering_comms/0144/reply_0000.md create mode 100644 docs/discussions/alice_engineering_comms/0144/reply_0001.md create mode 100644 docs/discussions/alice_engineering_comms/0145/index.md create mode 100644 docs/discussions/alice_engineering_comms/0145/reply_0000.md create mode 100644 docs/discussions/alice_engineering_comms/0146/index.md create mode 100644 docs/discussions/alice_engineering_comms/0146/reply_0000.md create mode 100644 docs/discussions/alice_engineering_comms/0147/index.md create mode 100644 docs/discussions/alice_engineering_comms/0147/reply_0000.md create mode 100644 docs/discussions/alice_engineering_comms/0148/index.md create mode 100644 docs/discussions/alice_engineering_comms/0148/reply_0000.md create mode 100644 docs/discussions/alice_engineering_comms/0149/index.md create mode 100644 docs/discussions/alice_engineering_comms/0149/reply_0000.md create mode 100644 docs/discussions/alice_engineering_comms/0150/index.md create mode 100644 docs/discussions/alice_engineering_comms/0150/reply_0000.md create mode 100644 docs/discussions/alice_engineering_comms/0151/index.md create mode 100644 docs/discussions/alice_engineering_comms/0151/reply_0000.md create mode 100644 docs/discussions/alice_engineering_comms/0152/index.md create mode 100644 docs/discussions/alice_engineering_comms/0152/reply_0000.md create mode 100644 docs/discussions/alice_engineering_comms/0153/index.md create mode 100644 docs/discussions/alice_engineering_comms/0153/reply_0000.md create mode 100644 docs/discussions/alice_engineering_comms/0154/index.md create mode 100644 docs/discussions/alice_engineering_comms/0154/reply_0000.md create mode 100644 docs/discussions/alice_engineering_comms/0155/index.md create mode 100644 docs/discussions/alice_engineering_comms/0155/reply_0000.md create mode 100644 docs/discussions/alice_engineering_comms/0156/index.md create mode 100644 docs/discussions/alice_engineering_comms/0156/reply_0000.md create mode 100644 
docs/discussions/alice_engineering_comms/0156/reply_0001.md create mode 100644 docs/discussions/alice_engineering_comms/0157/index.md create mode 100644 docs/discussions/alice_engineering_comms/0157/reply_0000.md create mode 100644 docs/discussions/alice_engineering_comms/0158/index.md create mode 100644 docs/discussions/alice_engineering_comms/0158/reply_0000.md create mode 100644 docs/discussions/alice_engineering_comms/0158/reply_0001.md create mode 100644 docs/discussions/alice_engineering_comms/0159/index.md create mode 100644 docs/discussions/alice_engineering_comms/0159/reply_0000.md create mode 100644 docs/discussions/alice_engineering_comms/0160/index.md create mode 100644 docs/discussions/alice_engineering_comms/0160/reply_0000.md create mode 100644 docs/discussions/alice_engineering_comms/0161/index.md create mode 100644 docs/discussions/alice_engineering_comms/0161/reply_0000.md create mode 100644 docs/discussions/alice_engineering_comms/0162/index.md create mode 100644 docs/discussions/alice_engineering_comms/0162/reply_0000.md create mode 100644 docs/discussions/alice_engineering_comms/0163/index.md create mode 100644 docs/discussions/alice_engineering_comms/0163/reply_0000.md create mode 100644 docs/discussions/alice_engineering_comms/0163/reply_0001.md create mode 100644 docs/discussions/alice_engineering_comms/0164/index.md create mode 100644 docs/discussions/alice_engineering_comms/0164/reply_0000.md create mode 100644 docs/discussions/alice_engineering_comms/0165/index.md create mode 100644 docs/discussions/alice_engineering_comms/0165/reply_0000.md create mode 100644 docs/discussions/alice_engineering_comms/0166/index.md create mode 100644 docs/discussions/alice_engineering_comms/0166/reply_0000.md create mode 100644 docs/discussions/alice_engineering_comms/0166/reply_0001.md create mode 100644 docs/discussions/alice_engineering_comms/0167/index.md create mode 100644 docs/discussions/alice_engineering_comms/0167/reply_0000.md create mode 100644 
docs/discussions/alice_engineering_comms/0168/index.md create mode 100644 docs/discussions/alice_engineering_comms/0168/reply_0000.md create mode 100644 docs/discussions/alice_engineering_comms/0169/index.md create mode 100644 docs/discussions/alice_engineering_comms/0170/index.md create mode 100644 docs/discussions/alice_engineering_comms/0170/reply_0000.md create mode 100644 docs/discussions/alice_engineering_comms/0170/reply_0001.md create mode 100644 docs/discussions/alice_engineering_comms/0170/reply_0002.md create mode 100644 docs/discussions/alice_engineering_comms/0171/index.md create mode 100644 docs/discussions/alice_engineering_comms/0171/reply_0000.md create mode 100644 docs/discussions/alice_engineering_comms/0172/index.md create mode 100644 docs/discussions/alice_engineering_comms/0172/reply_0000.md create mode 100644 docs/discussions/alice_engineering_comms/0173/index.md create mode 100644 docs/discussions/alice_engineering_comms/0173/reply_0000.md create mode 100644 docs/discussions/alice_engineering_comms/0174/index.md create mode 100644 docs/discussions/alice_engineering_comms/0174/reply_0000.md create mode 100644 docs/discussions/alice_engineering_comms/0174/reply_0001.md create mode 100644 docs/discussions/alice_engineering_comms/0175/index.md create mode 100644 docs/discussions/alice_engineering_comms/0176/index.md create mode 100644 docs/discussions/alice_engineering_comms/0177/index.md create mode 100644 docs/discussions/alice_engineering_comms/0177/reply_0000.md create mode 100644 docs/discussions/alice_engineering_comms/0177/reply_0001.md create mode 100644 docs/discussions/alice_engineering_comms/0178/index.md create mode 100644 docs/discussions/alice_engineering_comms/0178/reply_0000.md create mode 100644 docs/discussions/alice_engineering_comms/0179/index.md create mode 100644 docs/discussions/alice_engineering_comms/0179/reply_0000.md create mode 100644 docs/discussions/alice_engineering_comms/0179/reply_0001.md create mode 100644 
docs/discussions/alice_engineering_comms/0180/index.md create mode 100644 docs/discussions/alice_engineering_comms/0180/reply_0000.md create mode 100644 docs/discussions/alice_engineering_comms/0181/index.md create mode 100644 docs/discussions/alice_engineering_comms/0181/reply_0000.md create mode 100644 docs/discussions/alice_engineering_comms/0182/index.md create mode 100644 docs/discussions/alice_engineering_comms/0182/reply_0000.md create mode 100644 docs/discussions/alice_engineering_comms/0183/index.md create mode 100644 docs/discussions/alice_engineering_comms/0183/reply_0000.md create mode 100644 docs/discussions/alice_engineering_comms/0184/index.md create mode 100644 docs/discussions/alice_engineering_comms/0185/index.md create mode 100644 docs/discussions/alice_engineering_comms/0186/index.md create mode 100644 docs/discussions/alice_engineering_comms/0186/reply_0000.md create mode 100644 docs/discussions/alice_engineering_comms/0186/reply_0001.md create mode 100644 docs/discussions/alice_engineering_comms/0187/index.md create mode 100644 docs/discussions/alice_engineering_comms/0187/reply_0000.md create mode 100644 docs/discussions/alice_engineering_comms/0188/index.md create mode 100644 docs/discussions/alice_engineering_comms/0188/reply_0000.md create mode 100644 docs/discussions/alice_engineering_comms/0189/index.md create mode 100644 docs/discussions/alice_engineering_comms/0189/reply_0000.md create mode 100644 docs/discussions/alice_engineering_comms/0190/index.md create mode 100644 docs/discussions/alice_engineering_comms/0190/reply_0000.md create mode 100644 docs/discussions/alice_engineering_comms/0191/index.md create mode 100644 docs/discussions/alice_engineering_comms/0191/reply_0000.md create mode 100644 docs/discussions/alice_engineering_comms/0191/reply_0001.md create mode 100644 docs/discussions/alice_engineering_comms/0192/index.md create mode 100644 docs/discussions/alice_engineering_comms/0192/reply_0000.md create mode 100644 
docs/discussions/alice_engineering_comms/0192/reply_0001.md create mode 100644 docs/discussions/alice_engineering_comms/0193/index.md create mode 100644 docs/discussions/alice_engineering_comms/0193/reply_0000.md create mode 100644 docs/discussions/alice_engineering_comms/0193/reply_0001.md create mode 100644 docs/discussions/alice_engineering_comms/0194/index.md create mode 100644 docs/discussions/alice_engineering_comms/0194/reply_0000.md create mode 100644 docs/discussions/alice_engineering_comms/0194/reply_0001.md create mode 100644 docs/discussions/alice_engineering_comms/0195/index.md create mode 100644 docs/discussions/alice_engineering_comms/0195/reply_0000.md create mode 100644 docs/discussions/alice_engineering_comms/0196/index.md create mode 100644 docs/discussions/alice_engineering_comms/0197/index.md create mode 100644 docs/discussions/alice_engineering_comms/0197/reply_0000.md create mode 100644 docs/discussions/alice_engineering_comms/0198/index.md create mode 100644 docs/discussions/alice_engineering_comms/0198/reply_0000.md create mode 100644 docs/discussions/alice_engineering_comms/0199/index.md create mode 100644 docs/discussions/alice_engineering_comms/0199/reply_0000.md create mode 100644 docs/discussions/alice_engineering_comms/0200/index.md create mode 100644 docs/discussions/alice_engineering_comms/0200/reply_0000.md create mode 100644 docs/discussions/alice_engineering_comms/0201/index.md create mode 100644 docs/discussions/alice_engineering_comms/0201/reply_0000.md create mode 100644 docs/discussions/alice_engineering_comms/0202/index.md create mode 100644 docs/discussions/alice_engineering_comms/0202/reply_0000.md create mode 100644 docs/discussions/alice_engineering_comms/0203/index.md create mode 100644 docs/discussions/alice_engineering_comms/0203/reply_0000.md create mode 100644 docs/discussions/alice_engineering_comms/0204/index.md create mode 100644 docs/discussions/alice_engineering_comms/0205/index.md create mode 100644 
docs/discussions/alice_engineering_comms/0205/reply_0000.md create mode 100644 docs/discussions/alice_engineering_comms/0206/index.md create mode 100644 docs/discussions/alice_engineering_comms/0206/reply_0000.md create mode 100644 docs/discussions/alice_engineering_comms/0207/index.md create mode 100644 docs/discussions/alice_engineering_comms/0207/reply_0000.md create mode 100644 docs/discussions/alice_engineering_comms/0208/index.md create mode 100644 docs/discussions/alice_engineering_comms/0208/reply_0000.md create mode 100644 docs/discussions/alice_engineering_comms/0209/index.md create mode 100644 docs/discussions/alice_engineering_comms/0209/reply_0000.md create mode 100644 docs/discussions/alice_engineering_comms/0209/reply_0001.md create mode 100644 docs/discussions/alice_engineering_comms/0210/index.md create mode 100644 docs/discussions/alice_engineering_comms/0210/reply_0000.md create mode 100644 docs/discussions/alice_engineering_comms/0211/index.md create mode 100644 docs/discussions/alice_engineering_comms/0211/reply_0000.md create mode 100644 docs/discussions/alice_engineering_comms/0212/index.md create mode 100644 docs/discussions/alice_engineering_comms/0213/index.md create mode 100644 docs/discussions/alice_engineering_comms/0213/reply_0000.md create mode 100644 docs/discussions/alice_engineering_comms/0214/index.md create mode 100644 docs/discussions/alice_engineering_comms/0214/reply_0000.md create mode 100644 docs/discussions/alice_engineering_comms/0214/reply_0001.md create mode 100644 docs/discussions/alice_engineering_comms/0214/reply_0002.md create mode 100644 docs/discussions/alice_engineering_comms/0215/index.md create mode 100644 docs/discussions/alice_engineering_comms/0215/reply_0000.md create mode 100644 docs/discussions/alice_engineering_comms/0215/reply_0001.md create mode 100644 docs/discussions/alice_engineering_comms/0215/reply_0002.md create mode 100644 docs/discussions/alice_engineering_comms/0216/index.md create mode 100644 
docs/discussions/alice_engineering_comms/0216/reply_0000.md create mode 100644 docs/discussions/alice_engineering_comms/0217/index.md create mode 100644 docs/discussions/alice_engineering_comms/0217/reply_0000.md create mode 100644 docs/discussions/alice_engineering_comms/0218/index.md create mode 100644 docs/discussions/alice_engineering_comms/0218/reply_0000.md create mode 100644 docs/discussions/alice_engineering_comms/0219/index.md create mode 100644 docs/discussions/alice_engineering_comms/0219/reply_0000.md create mode 100644 docs/discussions/alice_engineering_comms/0220/index.md create mode 100644 docs/discussions/alice_engineering_comms/0220/reply_0000.md create mode 100644 docs/discussions/alice_engineering_comms/0220/reply_0001.md create mode 100644 docs/discussions/alice_engineering_comms/0221/index.md create mode 100644 docs/discussions/alice_engineering_comms/0221/reply_0000.md create mode 100644 docs/discussions/alice_engineering_comms/0222/index.md create mode 100644 docs/discussions/alice_engineering_comms/0222/reply_0000.md create mode 100644 docs/discussions/alice_engineering_comms/0223/index.md create mode 100644 docs/discussions/alice_engineering_comms/0223/reply_0000.md create mode 100644 docs/discussions/alice_engineering_comms/0224/index.md create mode 100644 docs/discussions/alice_engineering_comms/0224/reply_0000.md create mode 100644 docs/discussions/alice_engineering_comms/0225/index.md create mode 100644 docs/discussions/alice_engineering_comms/0225/reply_0000.md create mode 100644 docs/discussions/alice_engineering_comms/0226/index.md create mode 100644 docs/discussions/alice_engineering_comms/0226/reply_0000.md create mode 100644 docs/discussions/alice_engineering_comms/0227/index.md create mode 100644 docs/discussions/alice_engineering_comms/0227/reply_0000.md create mode 100644 docs/discussions/alice_engineering_comms/0228/index.md create mode 100644 docs/discussions/alice_engineering_comms/0228/reply_0000.md create mode 100644 
docs/discussions/alice_engineering_comms/0229/index.md create mode 100644 docs/discussions/alice_engineering_comms/0229/reply_0000.md create mode 100644 docs/discussions/alice_engineering_comms/0230/index.md create mode 100644 docs/discussions/alice_engineering_comms/0230/reply_0000.md create mode 100644 docs/discussions/alice_engineering_comms/0231/index.md create mode 100644 docs/discussions/alice_engineering_comms/0231/reply_0000.md create mode 100644 docs/discussions/alice_engineering_comms/0232/index.md create mode 100644 docs/discussions/alice_engineering_comms/0233/index.md create mode 100644 docs/discussions/alice_engineering_comms/0234/index.md create mode 100644 docs/discussions/alice_engineering_comms/0235/index.md create mode 100644 docs/discussions/alice_engineering_comms/0236/index.md create mode 100644 docs/discussions/alice_engineering_comms/0236/reply_0000.md create mode 100644 docs/discussions/alice_engineering_comms/0237/index.md create mode 100644 docs/discussions/alice_engineering_comms/0238/index.md create mode 100644 docs/discussions/alice_engineering_comms/0238/reply_0000.md create mode 100644 docs/discussions/alice_engineering_comms/0239/index.md create mode 100644 docs/discussions/alice_engineering_comms/0239/reply_0000.md create mode 100644 docs/discussions/alice_engineering_comms/0240/index.md create mode 100644 docs/discussions/alice_engineering_comms/0240/reply_0000.md create mode 100644 docs/discussions/alice_engineering_comms/0241/index.md create mode 100644 docs/discussions/alice_engineering_comms/0241/reply_0000.md create mode 100644 docs/discussions/alice_engineering_comms/0242/index.md create mode 100644 docs/discussions/alice_engineering_comms/0242/reply_0000.md create mode 100644 docs/discussions/alice_engineering_comms/0243/index.md create mode 100644 docs/discussions/alice_engineering_comms/0243/reply_0000.md create mode 100644 docs/discussions/alice_engineering_comms/0244/index.md create mode 100644 
docs/discussions/alice_engineering_comms/0244/reply_0000.md create mode 100644 docs/discussions/alice_engineering_comms/0245/index.md create mode 100644 docs/discussions/alice_engineering_comms/0245/reply_0000.md create mode 100644 docs/discussions/alice_engineering_comms/0246/index.md create mode 100644 docs/discussions/alice_engineering_comms/0246/reply_0000.md create mode 100644 docs/discussions/alice_engineering_comms/0247/index.md create mode 100644 docs/discussions/alice_engineering_comms/0247/reply_0000.md create mode 100644 docs/discussions/alice_engineering_comms/0248/index.md create mode 100644 docs/discussions/alice_engineering_comms/0248/reply_0000.md create mode 100644 docs/discussions/alice_engineering_comms/0249/index.md create mode 100644 docs/discussions/alice_engineering_comms/0249/reply_0000.md create mode 100644 docs/discussions/alice_engineering_comms/0250/index.md create mode 100644 docs/discussions/alice_engineering_comms/0250/reply_0000.md create mode 100644 docs/discussions/alice_engineering_comms/0251/index.md create mode 100644 docs/discussions/alice_engineering_comms/0251/reply_0000.md create mode 100644 docs/discussions/alice_engineering_comms/0252/index.md create mode 100644 docs/discussions/alice_engineering_comms/0252/reply_0000.md create mode 100644 docs/discussions/alice_engineering_comms/0253/index.md create mode 100644 docs/discussions/alice_engineering_comms/0254/index.md create mode 100644 docs/discussions/alice_engineering_comms/0255/index.md create mode 100644 docs/discussions/alice_engineering_comms/0255/reply_0000.md create mode 100644 docs/discussions/alice_engineering_comms/0256/index.md create mode 100644 docs/discussions/alice_engineering_comms/0256/reply_0000.md create mode 100644 docs/discussions/alice_engineering_comms/0257/index.md create mode 100644 docs/discussions/alice_engineering_comms/0257/reply_0000.md create mode 100644 docs/discussions/alice_engineering_comms/0258/index.md create mode 100644 
docs/discussions/alice_engineering_comms/0258/reply_0000.md create mode 100644 docs/discussions/alice_engineering_comms/0259/index.md create mode 100644 docs/discussions/alice_engineering_comms/0259/reply_0000.md create mode 100644 docs/discussions/alice_engineering_comms/0260/index.md create mode 100644 docs/discussions/alice_engineering_comms/0261/index.md create mode 100644 docs/discussions/alice_engineering_comms/0262/index.md create mode 100644 docs/discussions/alice_engineering_comms/0262/reply_0000.md create mode 100644 docs/discussions/alice_engineering_comms/0263/index.md create mode 100644 docs/discussions/alice_engineering_comms/0263/reply_0000.md create mode 100644 docs/discussions/alice_engineering_comms/0264/index.md create mode 100644 docs/discussions/alice_engineering_comms/0264/reply_0000.md create mode 100644 docs/discussions/alice_engineering_comms/0265/index.md create mode 100644 docs/discussions/alice_engineering_comms/0265/reply_0000.md create mode 100644 docs/discussions/alice_engineering_comms/0266/index.md create mode 100644 docs/discussions/alice_engineering_comms/0266/reply_0000.md create mode 100644 docs/discussions/alice_engineering_comms/0267/index.md create mode 100644 docs/discussions/alice_engineering_comms/0268/index.md create mode 100644 docs/discussions/alice_engineering_comms/0268/reply_0000.md create mode 100644 docs/discussions/alice_engineering_comms/0269/index.md create mode 100644 docs/discussions/alice_engineering_comms/0269/reply_0000.md create mode 100644 docs/discussions/alice_engineering_comms/0270/index.md create mode 100644 docs/discussions/alice_engineering_comms/0270/reply_0000.md create mode 100644 docs/discussions/alice_engineering_comms/0270/reply_0001.md create mode 100644 docs/discussions/alice_engineering_comms/0270/reply_0002.md create mode 100644 docs/discussions/alice_engineering_comms/0271/index.md create mode 100644 docs/discussions/alice_engineering_comms/0271/reply_0000.md create mode 100644 
docs/discussions/alice_engineering_comms/0272/index.md create mode 100644 docs/discussions/alice_engineering_comms/0272/reply_0000.md create mode 100644 docs/discussions/alice_engineering_comms/0273/index.md create mode 100644 docs/discussions/alice_engineering_comms/0274/index.md create mode 100644 docs/discussions/alice_engineering_comms/0275/index.md create mode 100644 docs/discussions/alice_engineering_comms/0276/index.md create mode 100644 docs/discussions/alice_engineering_comms/0277/index.md create mode 100644 docs/discussions/alice_engineering_comms/0278/index.md create mode 100644 docs/discussions/alice_engineering_comms/0279/index.md create mode 100644 docs/discussions/alice_engineering_comms/0279/reply_0000.md create mode 100644 docs/discussions/alice_engineering_comms/0280/index.md create mode 100644 docs/discussions/alice_engineering_comms/0280/reply_0000.md create mode 100644 docs/discussions/alice_engineering_comms/0281/index.md create mode 100644 docs/discussions/alice_engineering_comms/0282/index.md create mode 100644 docs/discussions/alice_engineering_comms/0283/index.md create mode 100644 docs/discussions/alice_engineering_comms/0283/reply_0000.md create mode 100644 docs/discussions/alice_engineering_comms/0284/index.md create mode 100644 docs/discussions/alice_engineering_comms/0284/reply_0000.md create mode 100644 docs/discussions/alice_engineering_comms/0285/index.md create mode 100644 docs/discussions/alice_engineering_comms/0286/index.md create mode 100644 docs/discussions/alice_engineering_comms/0287/index.md create mode 100644 docs/discussions/alice_engineering_comms/0288/index.md create mode 100644 docs/discussions/alice_engineering_comms/0289/index.md create mode 100644 docs/discussions/alice_engineering_comms/0289/reply_0000.md create mode 100644 docs/discussions/alice_engineering_comms/0290/index.md create mode 100644 docs/discussions/alice_engineering_comms/0290/reply_0000.md create mode 100644 
docs/discussions/alice_engineering_comms/0291/index.md create mode 100644 docs/discussions/alice_engineering_comms/0291/reply_0000.md create mode 100644 docs/discussions/alice_engineering_comms/0292/index.md create mode 100644 docs/discussions/alice_engineering_comms/0292/reply_0000.md create mode 100644 docs/discussions/alice_engineering_comms/0293/index.md create mode 100644 docs/discussions/alice_engineering_comms/0294/index.md create mode 100644 docs/discussions/alice_engineering_comms/0295/index.md create mode 100644 docs/discussions/alice_engineering_comms/0296/index.md create mode 100644 docs/discussions/alice_engineering_comms/0297/index.md create mode 100644 docs/discussions/alice_engineering_comms/0297/reply_0000.md create mode 100644 docs/discussions/alice_engineering_comms/0298/index.md create mode 100644 docs/discussions/alice_engineering_comms/0298/reply_0000.md create mode 100644 docs/discussions/alice_engineering_comms/0299/index.md create mode 100644 docs/discussions/alice_engineering_comms/0299/reply_0000.md create mode 100644 docs/discussions/alice_engineering_comms/0299/reply_0001.md create mode 100644 docs/discussions/alice_engineering_comms/0300/index.md create mode 100644 docs/discussions/alice_engineering_comms/0300/reply_0000.md create mode 100644 docs/discussions/alice_engineering_comms/0301/index.md create mode 100644 docs/discussions/alice_engineering_comms/0302/index.md create mode 100644 docs/discussions/alice_engineering_comms/0303/index.md create mode 100644 docs/discussions/alice_engineering_comms/0304/index.md create mode 100644 docs/discussions/alice_engineering_comms/0305/index.md create mode 100644 docs/discussions/alice_engineering_comms/0306/index.md create mode 100644 docs/discussions/alice_engineering_comms/0307/index.md create mode 100644 docs/discussions/alice_engineering_comms/0308/index.md create mode 100644 docs/discussions/alice_engineering_comms/0309/index.md create mode 100644 
docs/discussions/alice_engineering_comms/0310/index.md create mode 100644 docs/discussions/alice_engineering_comms/0311/index.md create mode 100644 docs/discussions/alice_engineering_comms/0312/index.md create mode 100644 docs/discussions/alice_engineering_comms/0313/index.md create mode 100644 docs/discussions/alice_engineering_comms/0313/reply_0000.md create mode 100644 docs/discussions/alice_engineering_comms/0314/index.md create mode 100644 docs/discussions/alice_engineering_comms/0314/reply_0000.md create mode 100644 docs/discussions/alice_engineering_comms/0315/index.md create mode 100644 docs/discussions/alice_engineering_comms/0316/index.md create mode 100644 docs/discussions/alice_engineering_comms/0317/index.md create mode 100644 docs/discussions/alice_engineering_comms/0318/index.md create mode 100644 docs/discussions/alice_engineering_comms/0319/index.md create mode 100644 docs/discussions/alice_engineering_comms/0319/reply_0000.md create mode 100644 docs/discussions/alice_engineering_comms/0320/index.md create mode 100644 docs/discussions/alice_engineering_comms/0321/index.md create mode 100644 docs/discussions/alice_engineering_comms/0322/index.md create mode 100644 docs/discussions/alice_engineering_comms/0322/reply_0000.md create mode 100644 docs/discussions/alice_engineering_comms/0323/index.md create mode 100644 docs/discussions/alice_engineering_comms/0324/index.md create mode 100644 docs/discussions/alice_engineering_comms/0325/index.md create mode 100644 docs/discussions/alice_engineering_comms/0326/index.md create mode 100644 docs/discussions/alice_engineering_comms/0327/index.md create mode 100644 docs/discussions/alice_engineering_comms/0328/index.md create mode 100644 docs/discussions/alice_engineering_comms/0328/reply_0000.md create mode 100644 docs/discussions/alice_engineering_comms/0329/index.md create mode 100644 docs/discussions/alice_engineering_comms/0330/index.md create mode 100644 
docs/discussions/alice_engineering_comms/0331/index.md create mode 100644 docs/discussions/alice_engineering_comms/0332/index.md create mode 100644 docs/discussions/alice_engineering_comms/0333/index.md create mode 100644 docs/discussions/alice_engineering_comms/0334/index.md create mode 100644 docs/discussions/alice_engineering_comms/0335/index.md create mode 100644 docs/discussions/alice_engineering_comms/0336/index.md create mode 100644 docs/discussions/alice_engineering_comms/0337/index.md create mode 100644 docs/discussions/alice_engineering_comms/0338/index.md create mode 100644 docs/discussions/alice_engineering_comms/0339/index.md create mode 100644 docs/discussions/alice_engineering_comms/0340/index.md create mode 100644 docs/discussions/alice_engineering_comms/0341/index.md create mode 100644 docs/discussions/alice_engineering_comms/0341/reply_0000.md create mode 100644 docs/discussions/alice_engineering_comms/0342/index.md create mode 100644 docs/discussions/alice_engineering_comms/0343/index.md create mode 100644 docs/discussions/alice_engineering_comms/0344/index.md create mode 100644 docs/discussions/alice_engineering_comms/0345/index.md create mode 100644 docs/discussions/alice_engineering_comms/0346/index.md create mode 100644 docs/discussions/alice_engineering_comms/0346/reply_0000.md create mode 100644 docs/discussions/alice_engineering_comms/0347/index.md create mode 100644 docs/discussions/alice_engineering_comms/0347/reply_0000.md create mode 100644 docs/discussions/alice_engineering_comms/0348/index.md create mode 100644 docs/discussions/alice_engineering_comms/0348/reply_0000.md create mode 100644 docs/discussions/alice_engineering_comms/0349/index.md create mode 100644 docs/discussions/alice_engineering_comms/0349/reply_0000.md create mode 100644 docs/discussions/alice_engineering_comms/0350/index.md create mode 100644 docs/discussions/alice_engineering_comms/0351/index.md create mode 100644 
docs/discussions/alice_engineering_comms/0351/reply_0000.md create mode 100644 docs/discussions/alice_engineering_comms/0352/index.md create mode 100644 docs/discussions/alice_engineering_comms/0353/index.md create mode 100644 docs/discussions/alice_engineering_comms/0354/index.md create mode 100644 docs/discussions/alice_engineering_comms/0355/index.md create mode 100644 docs/discussions/alice_engineering_comms/0356/index.md create mode 100644 docs/discussions/alice_engineering_comms/0357/index.md create mode 100644 docs/discussions/alice_engineering_comms/0358/index.md create mode 100644 docs/discussions/alice_engineering_comms/0359/index.md create mode 100644 docs/discussions/alice_engineering_comms/0360/index.md create mode 100644 docs/discussions/alice_engineering_comms/0361/index.md create mode 100644 docs/discussions/alice_engineering_comms/0362/index.md create mode 100644 docs/discussions/alice_engineering_comms/0363/index.md create mode 100644 docs/discussions/alice_engineering_comms/0364/index.md create mode 100644 docs/discussions/alice_engineering_comms/0365/index.md create mode 100644 docs/discussions/alice_engineering_comms/0366/index.md create mode 100644 docs/discussions/alice_engineering_comms/0367/index.md create mode 100644 docs/discussions/alice_engineering_comms/0368/index.md create mode 100644 docs/discussions/alice_engineering_comms/0369/index.md create mode 100644 docs/discussions/alice_engineering_comms/0370/index.md create mode 100644 docs/discussions/alice_engineering_comms/0370/reply_0000.md create mode 100644 docs/discussions/alice_engineering_comms/0371/index.md create mode 100644 docs/discussions/alice_engineering_comms/0372/index.md create mode 100644 docs/discussions/alice_engineering_comms/0373/index.md create mode 100644 docs/discussions/alice_engineering_comms/0374/index.md create mode 100644 docs/discussions/alice_engineering_comms/0374/reply_0000.md create mode 100644 docs/discussions/alice_engineering_comms/0375/index.md 
create mode 100644 docs/discussions/alice_engineering_comms/0376/index.md create mode 100644 docs/discussions/alice_engineering_comms/0377/index.md create mode 100644 docs/discussions/alice_engineering_comms/0378/index.md create mode 100644 docs/discussions/alice_engineering_comms/0379/index.md create mode 100644 docs/discussions/alice_engineering_comms/0380/index.md create mode 100644 docs/discussions/alice_engineering_comms/0381/index.md create mode 100644 docs/discussions/alice_engineering_comms/0381/reply_0000.md create mode 100644 docs/discussions/alice_engineering_comms/index.md diff --git a/docs/discussions/alice_engineering_comms/0000/index.md b/docs/discussions/alice_engineering_comms/0000/index.md new file mode 100644 index 0000000000..6aeabe6c02 --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0000/index.md @@ -0,0 +1,7 @@ +# 2022-07-18 Engineering Logs + +- TODO + - [x] @aliceoa, @pdxjohnny: Kick off OSS scans + - Targeting collaboration with CRob on metrics insertion to OpenSSF DB + - [ ] @pdxjohnny: Finish Q3 plans (Gantt chart, meeting templates, etc.) + - Generate template for auto creation to fill every meeting / fillable pre-meeting \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0000/reply_0000.md b/docs/discussions/alice_engineering_comms/0000/reply_0000.md new file mode 100644 index 0000000000..6d2ddc6a56 --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0000/reply_0000.md @@ -0,0 +1,23 @@ +## 2022-07-18 @pdxjohnny Engineering Logs + +- TODO + - [x] Kick off OSS scans + - Targeting collaboration with CRob on metrics insertion to OpenSSF DB + - [ ] Finish Q3 plans (Gantt chart, meeting templates, etc.) + - Generate template for auto creation to fill every meeting / fillable pre-meeting +- Future + - Engage with Loihi community + - See what we can do here, not sure yet, play with system context / mitigation inference in devcloud? 
+ - https://www.intel.com/content/www/us/en/research/neuromorphic-community.html + - https://download.intel.com/newsroom/2021/new-technologies/neuromorphic-computing-loihi-2-brief.pdf + - https://www.intel.com/content/www/us/en/newsroom/news/intel-unveils-neuromorphic-loihi-2-lava-software.html + - +- References + - https://medium.com/51nodes/decentralized-schema-registry-aa662b8db12b + - https://www.microsoft.com/security/blog/2021/10/06/microsofts-5-guiding-principles-for-decentralized-identities/ + - https://ariadne.space/2022/07/17/how-efficient-can-cat1-be/ + - Usage of splice + - https://github.com/NVlabs/eg3d + - Seeing from different perspectives, inter domain conceptual mapping, encoded sysctxs alternate mitigations + - https://github.com/robmarkcole/satellite-image-deep-learning + - Knitting together system contexts (Alice could use for integration of various architectures) \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0001/index.md b/docs/discussions/alice_engineering_comms/0001/index.md new file mode 100644 index 0000000000..313788966c --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0001/index.md @@ -0,0 +1,7 @@ +# 2022-07-19 Engineering Logs + +- TODO + - [x] @aliceoa, @pdxjohnny: Kick off OSS scans + - Targeting collaboration with CRob on metrics insertion to OpenSSF DB + - [ ] @pdxjohnny: Finish Q3 plans (Gantt chart, meeting templates, etc.) 
+ - Generate template for auto creation to fill every meeting / fillable pre-meeting \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0001/reply_0000.md b/docs/discussions/alice_engineering_comms/0001/reply_0000.md new file mode 100644 index 0000000000..5268db9903 --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0001/reply_0000.md @@ -0,0 +1,121 @@ +## 2022-07-19 @pdxjohnny Engineering Logs + +- TODO + - [x] Kick off OSS scans + - Targeting collaboration with CRob on metrics insertion to OpenSSF DB + - [ ] Finish Q3 plans (Gantt chart, meeting templates, etc.) + - Generate template for auto creation to fill every meeting / fillable pre-meeting + - [ ] Follow up with OneAPI folks + - [ ] Overlay to `alice shouldi contribute` to create git repos when found from forks of PyPi packages + - [ ] Associated tutorial + - [ ] Linked from `README` + - [ ] Finish out `alice please contribute recommended community standards` + dynamic opimp for meta issue body creation + - [ ] Associated tutorial + - [ ] Linked from `README` and `CONTRIBUTING` +- Some good spdx DAG stuff on how we turn source into build SBOM wise + - https://lists.spdx.org/g/Spdx-tech/message/4659 +- References + - https://github.com/nsmith5/rekor-sidekick + - > Rekor transparency log monitoring and alerting + - Leverages Open Policy Agent + - Found while looking at Open Policy Agent to see if we can serialize to JSON. + - Possibly use to facilitate our downstream validation + - https://github.com/intel/dffml/issues/1315 + - https://mermaid-js.github.io/mermaid/#/c4c + - Mermaid is working on native https://c4model.com support! + - W3C approves DIDs! 
+ - https://blog.avast.com/dids-approved-w3c + - https://www.w3.org/blog/news/archives/9618 + - https://www.w3.org/2022/07/pressrelease-did-rec.html.en + - https://twitter.com/w3c/status/1549368259878723585/retweets/with_comments + +> "Intel Corporation congratulates the DID Working Group on Decentralized Identifier (DID) 1.0 reaching W3C Recommendation status. +> +> DID provides a framework to unify and consolidate multiple evolving identity systems. Consolidating identity systems within a single framework is useful for validating the authenticity of information and preserving its integrity as it is moved and processed among cloud, edge, and client systems. This potentially increases the capabilities of the Web to connect and unify multiple sources of information. +> +> The continuing evolution of this work will be key to the development of new technologies in the fields of supply chain management and Internet of Things (IoT) devices and services. For example, a Birds of a Feather (BOF) discussion group at IETF [Supply Chain Integrity, Transparency, and Trust (SCITT)](https://datatracker.ietf.org/doc/bofreq-birkholz-supply-chain-integrity-transparency-and-trust-scitt/) has already highlighted DID as a useful approach in providing much needed structure for exchanging information through the supply chain, and the Web of Things (WoT) WG is planning to support DID for identifying and discovering IoT devices and metadata. +> +> Intel Corporation supports this work and encourages the DID Working Group to continue working towards the convergence of widely implemented and adopted standardized best practices for identity in its next charter." 
> +> Eric Siow, Web Standards and Ecosystem Strategies Director, Intel Corporation + + + + +- https://blog.devgenius.io/top-10-architecture-characteristics-non-functional-requirements-with-cheatsheat-7ad14bbb0a9b + +> ![image](https://user-images.githubusercontent.com/5950433/179842612-5fb02fb5-1f26-4cb4-af0d-d375b1134ace.png) + +- For Vol 3, on mind control + - https://bigthink.com/the-present/sophists/ + +--- + +Unsent to Mike Scovetta: michael.scovetta (at) microsoft.com + +Hi Mike, + +Hope you’ve been well. It’s John from Intel. Thanks again to you and the team for welcoming me to the Identifying Security Threats working group meeting [2021-02-18](https://docs.google.com/document/d/1AfI0S6VjBCO0ZkULCYZGHuzzW8TPqO3zYxRjzmKvUB4/edit#heading=h.mfw2bj5svu9u) last year. We talked a bit about how Intel had a similar effort. I then changed roles hoping to get more involved with OpenSSF but then ended up getting told to be uninvolved. Now I switched roles again and involvement is in scope! Sorry for the lapse in communications. + +I periodically check the minutes so I joined today and asked about the "Alpha-Omega" project from last week’s minutes which I then did some research on. We just started what looks to me to be an aligned project, coincidentally named Alice Omega Alpha: https://github.com/intel/dffml/tree/alice/entities/alice + +It looks to me like Alice's mission to proactively enable developers and organizations to deliver organizationally context aware, adaptive secure by default best practices to teams aligns with project Alpha-Omega’s goals. + +Alice is the nickname for both the entity and the architecture, the Open Architecture, which is a methodology for interpretation of existing, well-established formats, protocols, and other domain specific representations of architecture.
What we end up with is some JSON, YAML, or other blob of structured data that we can use to build cross language tooling focused more on policy and intent, incorporating data from arbitrary sources to create a holistic picture of software across dependency boundaries by focusing on threat models. + +Alice will be doing scans of open source projects and we’d still love to collaborate to contribute metrics to the OpenSSF metrics database, we can easily have her shoot applicable metrics off to that DB. We’ve also been looking at fusing VEX and DIDs to facilitate distributed vulnerability disclosure and patch distribution. + +--- + +Unsent to Jun Takei: jun.takei (at) intel.com + +The W3C today issued the recommendation on DIDs. Jun, I saw from Eric's +comment on the press release that the SCITT working group has an SCITT +Architecture which DIDs might be suitable for. + +The DFFML community is working on a project called Alice +https://github.com/intel/dffml/tree/alice/docs/tutorials/rolling_alice +she is intended to be a developer helper. She's also the way we data mine +source repositories (etc.). + +She’s open source with a plugin system ("overlays") so we can write open source code +and then just add our internal integrations. This system relies on an abstraction of +architecture known as the Open Architecture. The Open Architecture, also known as +Alice, is a methodology for interpreting directed graphs of domain specific architectures. +Alice is the name we give both the entity and the architecture. We are hoping to +have Alice store and process information backed by directed graphs of DIDs, SBOMs, and +VEX info primarily. This sounds very similar to the SCITT Architecture. We would love to +collaborate with you both to help make SCITT a success. Alice is focused on analysis of +our software supply chain so as to ensure we conform to best practices.
We would like +the analysis to serialize directly to an industry best practice format for that as well, +which SCITT looks to be. + +To increase the level of trust in our supply chain we would like to ensure interoperability +up and down the stack. Ned is involved in the DICE space and communicated to me +that + +Please let us know where things are at with your involvement with DIDs and SCITT so we +can be in sync with Intel's involvement and direction in this space. Please also let us know +how we could best establish an ongoing line of communication so as to build off and +contribute where possible to the work you're involved in. + +References: +- https://datatracker.ietf.org/doc/html/draft-birkholz-scitt-architecture +- https://www.w3.org/2022/07/pressrelease-did-rec.html.en +- https://docs.microsoft.com/en-us/azure/confidential-ledger/architecture + +--- + +Unsent + +To: Jun and Mike and Dan lorenc.d (at) gmail.com + +I commented on the OpenSSF Stream 8 doc recommending that DIDs be looked at +as a way to exchange vulnerability information. + +We've been looking potentially at a hybrid DID plus rekor +architecture (DIDs eventually as a proxy to) + +References: +- https://github.com/sigstore/rekor diff --git a/docs/discussions/alice_engineering_comms/0002/index.md b/docs/discussions/alice_engineering_comms/0002/index.md new file mode 100644 index 0000000000..3fb0b53df1 --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0002/index.md @@ -0,0 +1,18 @@ +# 2022-07-20 Engineering Logs + +- TODO + - [x] @aliceoa, @pdxjohnny: Kick off OSS scans + - Targeting collaboration with CRob on metrics insertion to OpenSSF DB + - [ ] @pdxjohnny: Finish Q3 plans (Gantt chart, meeting templates, etc.)
+ - Generate template for auto creation to fill every meeting / fillable pre-meeting + - [ ] @dffml: Get involved in SCITT + - [ ] Meetings + - https://docs.google.com/document/d/1vf-EliXByhg5HZfgVbTqZhfaJFCmvMdQuZ4tC-Eq6wg/edit# + - Weekly Monday at 8 AM Pacific + - https://armltd.zoom.us/j/99133885299?pwd=b0w4aGorRkpjL3ZHa2NPSmRiNHpXUT09 + - [x] Mailing list + - https://www.ietf.org/mailman/listinfo/scitt + - https://mailarchive.ietf.org/arch/browse/scitt/ + - [ ] Slack + - https://mailarchive.ietf.org/arch/msg/scitt/PbvoKOX996cNHJEOrjReaNlum64/ + - Going to email Orie Steele orie (at) transmute.industries to ask for an invite. \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0002/reply_0000.md b/docs/discussions/alice_engineering_comms/0002/reply_0000.md new file mode 100644 index 0000000000..1dfd36255c --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0002/reply_0000.md @@ -0,0 +1,43 @@ +## 2022-07-20 @pdxjohnny Engineering Logs + +- TODO + - [x] Get involved in SCITT + - [ ] Meetings + - https://docs.google.com/document/d/1vf-EliXByhg5HZfgVbTqZhfaJFCmvMdQuZ4tC-Eq6wg/edit# + - Weekly Monday at 8 AM Pacific + - https://armltd.zoom.us/j/99133885299?pwd=b0w4aGorRkpjL3ZHa2NPSmRiNHpXUT09 + - [x] Mailing list + - https://www.ietf.org/mailman/listinfo/scitt + - https://mailarchive.ietf.org/arch/browse/scitt/ + - [ ] Slack + - https://mailarchive.ietf.org/arch/msg/scitt/PbvoKOX996cNHJEOrjReaNlum64/ + - Going to email Orie Steele orie (at) transmute.industries to ask for an invite. + - [x] Kick off OSS scans + - Targeting collaboration with CRob on metrics insertion to OpenSSF DB + - [ ] Finish Q3 plans (Gantt chart, meeting templates, etc.) 
+ - Generate template for auto creation to fill every meeting / fillable pre-meeting + - [ ] Overlay to `alice shouldi contribute` to create git repos when found from forks of PyPi packages + - [ ] Associated tutorial + - [ ] Linked from `README` + - [ ] Finish out `alice please contribute recommended community standards` + dynamic opimp for meta issue body creation + - [ ] Associated tutorial + - [ ] Linked from `README` and `CONTRIBUTING` +- References + - https://static.sched.com/hosted_files/ossna2022/9b/presentation.pdf + - > We're starting to put everything in registries, container images, signatures, SBOMs, attestations, cat pictures, we need to slow down. Our CI pipelines are designed to pass things as directories and files between stages, why aren't we doing this with our container images? OCI already defines an Image Layout Specification that defines how to structure the data on disk, and we should normalize how this is used in our tooling. This talk looks at the value of using the OCI Layout spec, what you can do today, what issues we're facing, and a call to action for more standardization between tooling in this space. + +--- + +Unsent + +To: Jun and Mike and Yan + +I commented on the OpenSSF Stream 8 doc recommending that DIDs be looked at +as a way to exchange vulnerability information.
+ +We've been looking potentially at a hybrid DID plus rekor +architecture (DIDs eventually as a proxy to) + +References: +- https://github.com/sigstore/rekor diff --git a/docs/discussions/alice_engineering_comms/0002/reply_0001.md b/docs/discussions/alice_engineering_comms/0002/reply_0001.md new file mode 100644 index 0000000000..4e8ee8b66c --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0002/reply_0001.md @@ -0,0 +1,121 @@ +# 2022-07-20 Identifying Security Threats WG + +- Mike leading +- Marta + - Office hours + - Place for Open Source maintainers to be able to ask community of security experts + - Idea is to run first two sessions in August and September + - Proposing two different timeslots to cover all geos + - Question is will be be well staffed for both of those + - She is collecting feedback right now on possibilities for those dates + - Pinging folks who have shown interest in the past + - What format? + - People just show up and ask + - Registration with topic they want to talk about + - Allows us to prepare, consensus is currently we like this + - Can grab right experts beforehand this way + - Reaching out to logistics team for how we can communicate + zoom links, etc. + - Will test registration beginning of August + - Will do doodle poll or something for slots + - Jen is the one for the Zoom setup + - Amir from ostif.org volunteering to answer questions + - May want to do a blog in addition to twitter + - Outreach maybe 4th or 5th, have the twitter points back + to the blog to capture that. +- Meeting time update + - We have been doing this at this time for about a year or so + - We previously alternated between two timeslots for Europe and Asia + - Should we keep this 10 AM Pacific timeslot? 
+ - Alternate between US and APAC friendly timezone + - Most other WGs are morning Pacific time +- Technical Advisory Committee (TAC) update + - They are tasked with making sure we are delivering on our + cohesive promise, part of that is visibility and transparency + into the work that we do. + - We now have a formal reporting process + - It's now a periodic report; we're all invited to show up to the TAC meeting, + one slide per project. + - What we're doing + - Why we're doing it + - It's meant as an FYI, we are not asking for approval, we're letting them + know what we're up to. + - Everyone who is driving a process or project or thing, please send Mike + a single slide, what is it, why are we doing it, what the status is, + what's coming next, and if you need anything + - Christine on metrics + - Luigi for `SECURITY-INSIGHTS.yml` + - Mike will send out a template + - Please fill and respond by Monday + - Mike says the metrics work should live under a working group, maybe this one, maybe best practices + - CRob might have an opinion here, as long as work gets done + - As an org OpenSSF would benefit by being less siloed + - Question on if we should align to streams? + - LFX specific definition of metrics in mobilization paper + - AR for Christine to sync with CRob and see what he thinks. + - Will raise with TAC next week. +- A few action items for metrics from Christine + - Working groups are adopting streams from the mobilization plans +- Mike: Alpha Omega + - A few people were on the public call earlier + - The recording will be on YouTube + - Mike will give the fast version of the presentation right now + - They are still hiring + - Exploring ways of allocating headcount other than direct hiring + - If you know anyone or are interested please apply or ping them!
+ - Alpha + - Announced Node, Python, Eclipse + - Omega + - Toolchain is pending + - Waiting for legal approval due to the way the license for CodeQL works + - Had a CVE in Node that got fixed earlier this month + - RCE in JSHint that was bitrotted (unused) we removed + - Two CVEs disclosed yesterday and two more in the works (couple weeks to release ETA) + - Found NodeJS vuln via system call tracing + - It tries to query `openssl.cnf` and dumps strace logs to a repo + - You then have a one stop shop of: show me, for every linked package, whether, when a binary starts, it does a DNS query + - John: Sounds aligned with Alice's goals + - https://sos.dev coming under Alpha-Omega + - Allows us to compensate devs directly + - How to participate + - Improve security tools + - https://sos.dev + - Join working groups + - Get on slack +- Amir: Security Reviews + - Repo is looking good + - Updating with four new audits that ostif.org published last week + - At almost 100 reviews from Mike (Omega work), ostif.org, and community + - We're gaining traction, getting good stuff in there all the time + - Might need some help with the automated testing that gets done + when we upload reviews. + - Feedback always welcome. +- John: Collection of metric / Alpha-Omega data into shared DB + - https://github.com/intel/dffml/tree/alice/docs/tutorials/rolling_alice + - https://datatracker.ietf.org/doc/html/draft-birkholz-scitt-architecture + - https://www.w3.org/2022/07/pressrelease-did-rec.html.en + - https://docs.microsoft.com/en-us/azure/confidential-ledger/architecture + - Mike + - Mike has been thinking about SCITT as a schema and rules on how one would assert facts; whether it's confidential compute or traditional permissions is an implementation detail.
+ - If metrics runs across your repo and you have 30 contributors, great + - As consumer, how can I discover that fact and trust that it's accurate + - Could imagine a world where things like Scorecard express the data as a SCITT assertion + - You go and query that store and you say tell me everything you know about foo and you get it all back + - Until we have an implementation with Web5 that's at least beta, we could explore what that looks like. + - John: We can do rekor for now, we'll bridge it all later, target 1-2 years out + - John: We have alignment. Time to execute. rekor + sigstore for metric data attestation signed with GitHub OIDC tokens. We care about data provenance. We will later bridge into web5 space used as central points of comms given DID as effectively the URL of the future. This is in relation to what we talked to Melvin about with data provenance. We need to start planning how we are going to build up this space now so we can have provenance on thoughts later. This provenance could be for example on inference derived from provenance from training data and model training env and config. This will allow us to ensure the prioritizer makes decisions based on Spirit of the law / aka intent based policy derived from Trinity of Static Analysis, Dynamic Analysis, and Human Intent. + - Living Threat Model threats, mitigations, trust boundaries as initial data set for cross domain conceptual mapping of the trinity to build pyramid of thought alignment to strategic principles. + - One of our strategic plans / principles says: "We must be able to trust the sources of all input data used for all model training was done from research studies with these ethical certifications" + - This allows us to write policies (Open Policy Agent to JSON to DID/VC/SCITT translation/application exploration still in progress) for the organizations we form and apply them as overlays to flows we execute where context appropriate.
These overlaid flows define the trusted parties within that context as applicable to the active organizational policies and to the top level system context. + - For the policy associated with the principle that consumes the overlaid trust attestations, we will implement an LTM auditor which checks the SCITT provenance information associated with the operation implementations and the operation implementation network, input network, etc. within the orchestrator's trust boundary (TODO: need to track usages / `reuse` of contexts `ictx`, `nctx`, etc. with something predeclared; aka at runtime, if your `Operation` data structure doesn't allowlist your usage of a context you can pass it to a subflow for reuse). This allows us to use the format within our orchestration and for static analysis, because we can use this same format to describe the trust boundary properties that other domain specific representations of architecture have. For instance, if we were doing an Open Architecture (OA) Intermediate Representation (IR) for an ELF file, we might note that the input network context is not reused from the top level system context. Whereas if we did an OA IR for Python code, we would say that the input network is reused from the top level system context (it has access to that memory region, whereas when you launch an ELF you lose access to the parent's memory region, typically).
+ - Christine + - Looking at trying to connect all the different data sources +- References + - [Meeting Notes](https://docs.google.com/document/d/1AfI0S6VjBCO0ZkULCYZGHuzzW8TPqO3zYxRjzmKvUB4/edit?usp=sharing) + - [GitHub Workgroup Page](https://github.com/ossf/wg-identifying-security-threats) + - [OpenSSF Slack](https://slack.openssf.org) + - [Metric Dashboard](https://metrics.openssf.org) +- TODO + - @pdxjohnny + - [ ] Reach out to Christine about metrics collaboration + - [ ] Respond with slides for Mike if he asks \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0003/index.md b/docs/discussions/alice_engineering_comms/0003/index.md new file mode 100644 index 0000000000..db8732d3a6 --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0003/index.md @@ -0,0 +1,5 @@ +# 2022-07-21 Engineering Logs + +- https://docs.rs/differential-dataflow/latest/differential_dataflow/ +- https://lists.spdx.org/g/Spdx-tech/message/4673 + - > It is not just a matter of your software, it is a fundamental design question whether to maintain separation between the logical model and its serializations. Maintaining separation shouldn't be a matter of personal preference, it's good software engineering. The OWL Web Ontology Language https://www.w3.org/TR/owl2-overview/ has an excellent diagram illustrating the separation between semantics and syntax. Several serializations are defined in OWL (Manchester Syntax, Functional Syntax, RDF/XML, OWL/XML, and Turtle), and more syntaxes have been added since (JSON-LD, RDF-star, ...). 
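The separation between logical model and serialization that the quote argues for can be sketched with a toy example: one model (a set of subject-predicate-object triples) rendered into two different syntaxes. The triple values and the Turtle-flavored output below are invented for illustration; this is not SPDX or OWL tooling.

```python
import json

# Logical model: subject-predicate-object triples, independent of any syntax.
triples = [
    ("pkg:example", "hasLicense", "MIT"),
    ("pkg:example", "hasVersion", "1.0.0"),
]

def to_json(triples):
    # One serialization: a JSON array of objects.
    return json.dumps(
        [{"s": s, "p": p, "o": o} for s, p, o in triples], indent=2
    )

def to_turtle_like(triples):
    # Another serialization: a Turtle-flavored line format.
    return "\n".join(f'<{s}> <{p}> "{o}" .' for s, p, o in triples)

# Same semantics, two syntaxes; a parser of either recovers the same model.
print(to_json(triples))
print(to_turtle_like(triples))
```

Keeping the model separate means new syntaxes (CBOR, YAML, ...) can be added later without touching the model, which is the point the SPDX thread is making.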
\ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0004/index.md b/docs/discussions/alice_engineering_comms/0004/index.md new file mode 100644 index 0000000000..a8436709bd --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0004/index.md @@ -0,0 +1,3 @@ +# 2022-07-23 + +- https://blog.ciaranmcnulty.com/2022-05-12-multiple-build-contexts \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0005/index.md b/docs/discussions/alice_engineering_comms/0005/index.md new file mode 100644 index 0000000000..74991f6647 --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0005/index.md @@ -0,0 +1,5 @@ +# 2022-07-28 Alice Intelligence/Open Architecture Working Group Initial Meeting + +- Meeting info + - 8-9 AM Pacific + - https://meet.google.com/kox-ssqn-kjd \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0006/index.md b/docs/discussions/alice_engineering_comms/0006/index.md new file mode 100644 index 0000000000..52ebe9d1d1 --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0006/index.md @@ -0,0 +1,18 @@ +# 2022-07-25 Engineering Logs + +- TODO + - [x] @aliceoa, @pdxjohnny: Kick off OSS scans + - Targeting collaboration with CRob on metrics insertion to OpenSSF DB + - [ ] @pdxjohnny: Finish Q3 plans (Gantt chart, meeting templates, etc.) 
+ - Generate template for auto creation to fill every meeting / fillable pre-meeting + - [ ] @dffml: Get involved in SCITT + - [x] Meetings + - https://docs.google.com/document/d/1vf-EliXByhg5HZfgVbTqZhfaJFCmvMdQuZ4tC-Eq6wg/edit# + - Weekly Monday at 8 AM Pacific + - https://armltd.zoom.us/j/99133885299?pwd=b0w4aGorRkpjL3ZHa2NPSmRiNHpXUT09 + - [x] Mailing list + - https://www.ietf.org/mailman/listinfo/scitt + - https://mailarchive.ietf.org/arch/browse/scitt/ + - [ ] Slack + - https://mailarchive.ietf.org/arch/msg/scitt/PbvoKOX996cNHJEOrjReaNlum64/ + - Going to email Orie Steele orie (at) transmute.industries to ask for an invite. \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0006/reply_0000.md b/docs/discussions/alice_engineering_comms/0006/reply_0000.md new file mode 100644 index 0000000000..602ce87309 --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0006/reply_0000.md @@ -0,0 +1,29 @@ +## 2022-07-25 @pdxjohnny Engineering Logs + +- TODO + - [ ] Get involved in SCITT + - [x] Meetings + - https://docs.google.com/document/d/1vf-EliXByhg5HZfgVbTqZhfaJFCmvMdQuZ4tC-Eq6wg/edit# + - Weekly Monday at 8 AM Pacific + - Joining today + - https://armltd.zoom.us/j/99133885299?pwd=b0w4aGorRkpjL3ZHa2NPSmRiNHpXUT09 + - [x] Mailing list + - https://www.ietf.org/mailman/listinfo/scitt + - https://mailarchive.ietf.org/arch/browse/scitt/ + - [ ] Slack + - https://mailarchive.ietf.org/arch/msg/scitt/PbvoKOX996cNHJEOrjReaNlum64/ + - Going to email Orie Steele orie (at) transmute.industries to ask for an invite. + - [x] Kick off OSS scans + - Targeting collaboration with CRob on metrics insertion to OpenSSF DB + - [ ] Finish Q3 plans (Gantt chart, meeting templates, etc.) 
+ - Generate template for auto creation to fill every meeting / fillable pre-meeting + - [ ] Follow up with OneAPI folks + - [ ] Overlay to `alice shouldi contribute` to create git repos when found from forks of PyPi packages + - [ ] Associated tutorial + - [ ] Linked from `README` + - [ ] Finish out `alice please contribute recommended community standards` + dynamic opimp for meta issue body creation + - [ ] Associated tutorial + - [ ] Linked from `README` and `CONTRIBUTING` +- References + - https://spdx.github.io/canonical-serialisation/ \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0006/reply_0001.md b/docs/discussions/alice_engineering_comms/0006/reply_0001.md new file mode 100644 index 0000000000..dbe6ae2589 --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0006/reply_0001.md @@ -0,0 +1,102 @@ +## 2022-07-25 Supply Chain Integrity, Transparency and Trust (SCITT) + +- TODO + - [ ] Get involved in SCITT + - [x] Meetings + - https://docs.google.com/document/d/1vf-EliXByhg5HZfgVbTqZhfaJFCmvMdQuZ4tC-Eq6wg/edit# + - Weekly Monday at 8 AM Pacific + - Joining today + - https://armltd.zoom.us/j/99133885299?pwd=b0w4aGorRkpjL3ZHa2NPSmRiNHpXUT09 + - [x] Mailing list + - https://www.ietf.org/mailman/listinfo/scitt + - https://mailarchive.ietf.org/arch/browse/scitt/ + - [ ] Slack + - https://mailarchive.ietf.org/arch/msg/scitt/PbvoKOX996cNHJEOrjReaNlum64/ + - Going to email Orie Steele orie (at) transmute.industries to ask for an invite. +- Links + - https://github.com/intel/dffml/discussions/1406#discussioncomment-3181956 + - https://github.com/intel/dffml/discussions/1406#discussioncomment-3223247 + - https://github.com/transmute-industries/openssl-did-web-tutorial +- Folks will be in Philly on Thursday for meeting + - There is a remote link (see mailing list?) 
for the Thursday meeting +- Typical global warming centric chit chat +- Others at RATs meeting or busy with other IETF activities +- Introductions + - Kelvin Cusack + - Filling in for John from his org + - John Andersen + - Connecting dots between this and OpenSSF + - Yogesh was excited to see someone from Intel + - Intel is involved in RATs but not as much here + - Kiran Karunakaran + - Microsoft + - On Kay Williams's team + - Will likely lead this meeting in the future +- Upcoming Birds of a Feather (BoF) + - You need to register here: https://docs.google.com/document/d/1vf-EliXByhg5HZfgVbTqZhfaJFCmvMdQuZ4tC-Eq6wg/edit?pli=1#heading=h.214jg0n2xjhp + - There is registration for remote + - Problem Statement currently scoped around software + - We went back to a more scoped problem statement; we want to form a formal working group in the IETF for SCITT. + - In order to form it we have to have sufficient people show interest in the problem space + - Need problem space and charter so that it's scoped enough that the leadership is confident that the group can make progress + - The early design proposal for the SCITT transparency service is that the service is content agnostic + - any kind of metadata could be provided and retrieved + - SBOMs, Software Test Cases, Hardware BOMs, Test results on other types of products + - Electronic ballots (Ray, see mailing list), oil, gas, physical goods (U.S. govt.)
+ - In order to gain confidence from the leadership at IETF to form the WG we felt it was critical to narrow the scope for now to software + - Leadership thought scope was too big at first +- Thoughts around scope + - Charter is focused on software, an attainable goal + - Once we have a WG we can later broaden the scope via re-chartering +- We will design something that works for hardware and for software + - We are hanging software window curtains but we are looking at everything +- Software systems interact with everything else + - Dick Brooks (REA), any manifest could be signed and processed with SCITT + - It's just metadata, what was issued, how it was issued + - @dffml: We are encoding system contexts, Alice, into the chain, one place where she will live. +- Opens + - Open Policy Agent (mentioned in meeting minutes doc future topics) + - What are your plans / thoughts around Open Policy Agent and encoding policies into SCITT? + - Policy can be used in two places + - Policy for what can be put onto the register + - Some registries might constrain themselves for what types of data they allow + - Policy for someone evaluating the contents of the registry to make decisions about fitness for use + - Rego also considered as a policy language + - Perhaps decide on multi + - This policy discussion will happen in this WG for now, then maybe a sub working group + - Dick Brooks: Mentions HR4081 + - On topic for what we are + - Talked about attestations for devices that contain a camera and microphone and are connected to the internet + - There will need to be an attestation from the device + - Dick submitted to local rep to include software attestations as well + - https://www.congress.gov/bill/117th-congress/house-bill/4081 +https://www.congress.gov/bill/117th-congress/house-bill/4081/text +- Producers and other parties provide content into the system, attestations, claims, recorded into SCITT ledger +- Dick contacted his congress person to ask to add an amendment to HR4081 + - 
Amendment for smartphone apps to provide a trust score + - Tie in with OpenSSF metrics database to grab the security of repos involved + - Dick + - Proposed amendment I mentioned for HR 4081: + - "Require smart phone app stores to include a software supply chain trust score for each app". This gives consumers the ability to check trustworthiness before installing an app. +- Ray: thinks attestations are different than the transparency ledger + - Thinks it's a lot to bite off to do both + - Are there IoT folks that might have more attestation experience we could tap into? + - Is there a sub-working group focused on device attestations (in response to HR4081)? + - Device attestations could be recorded in the transparency ledger + - TCG DICE WG is a target point of engagement (UCAN, CBOR, DID). + - https://trustedcomputinggroup.org/work-groups/dice-architectures/ + - https://www.trustedcomputinggroup.org/wp-content/uploads/Device-Identifier-Composition-Engine-Rev69_Public-Review.pdf + - Looking at hardware actively attesting + - Microsoft has an Open Source implementation on GitHub + - This attestation stuff starts to look at real life commerce, Ray thinks it's important to +- Joshua Lock + - On software attestations, I have been working on a page for the SLSA website to describe the model we're working with. I can share to the SCITT list once the change is merged.
+ - IIRC the SCITT draft standards refer to a "software attestation" as a "claim", to disambiguate from RATS & TCG attestations +- Remote Attestation and Device Attestation + - Embraced COSE and CBOR + - Also in SCITT + - Hopefully we converge on underlying formats for both in-toto style and remote attestation style attestations +- There are also NIST attestations +- Vuln information mentioned by Kay as possible content inserted into SCITT + - This is a goal of ours with our CVE Binary Tool engagement + - We also could encode SBOMs from the systems that built them, we could patch sigstore to insert into a SCITT ledger \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0007/index.md b/docs/discussions/alice_engineering_comms/0007/index.md new file mode 100644 index 0000000000..8f1f421f47 --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0007/index.md @@ -0,0 +1,18 @@ +# 2022-07-26 Engineering Logs + +- TODO + - [x] @aliceoa, @pdxjohnny: Kick off OSS scans + - Targeting collaboration with CRob on metrics insertion to OpenSSF DB + - [ ] @pdxjohnny: Finish Q3 plans (Gantt chart, meeting templates, etc.) + - Generate template for auto creation to fill every meeting / fillable pre-meeting + - [ ] @dffml: Get involved in SCITT + - [x] Meetings + - https://docs.google.com/document/d/1vf-EliXByhg5HZfgVbTqZhfaJFCmvMdQuZ4tC-Eq6wg/edit# + - Weekly Monday at 8 AM Pacific + - https://armltd.zoom.us/j/99133885299?pwd=b0w4aGorRkpjL3ZHa2NPSmRiNHpXUT09 + - [x] Mailing list + - https://www.ietf.org/mailman/listinfo/scitt + - https://mailarchive.ietf.org/arch/browse/scitt/ + - [ ] Slack + - https://mailarchive.ietf.org/arch/msg/scitt/PbvoKOX996cNHJEOrjReaNlum64/ + - Going to email Orie Steele orie (at) transmute.industries to ask for an invite.
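The SCITT discussion above splits policy into two places: policy for what may be registered, and policy a consumer applies when evaluating registered claims. That split can be sketched as a toy append-only log with hash chaining. Everything here is invented for illustration; it is not SCITT, COSE, or rekor.

```python
import hashlib
import json

class ToyLedger:
    """Toy append-only 'transparency ledger' with a registration policy."""

    def __init__(self, insert_policy):
        self.entries = []          # append-only list of claims
        self.insert_policy = insert_policy
        self.head = "0" * 64       # hash head of an empty ledger

    def register(self, claim: dict) -> str:
        # Insert-time policy: the registry decides what it will accept.
        if not self.insert_policy(claim):
            raise ValueError("claim rejected by registration policy")
        payload = json.dumps(claim, sort_keys=True)
        # Chain each entry's hash to the previous head, so history
        # cannot be rewritten without changing every later head.
        self.head = hashlib.sha256(
            (self.head + payload).encode()
        ).hexdigest()
        self.entries.append(claim)
        return self.head  # a stand-in for a receipt

# Insert-time policy: this registry only accepts claims about software.
ledger = ToyLedger(lambda claim: claim.get("type") == "software")
receipt = ledger.register({"type": "software", "sbom": "..."})

# Evaluation-time policy: a consumer scans entries for fitness of use.
trusted = [e for e in ledger.entries if "sbom" in e]
```

A real service would use signed COSE envelopes and a Merkle-tree log rather than a flat hash chain, but the two policy hooks sit in the same places.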
\ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0007/reply_0000.md b/docs/discussions/alice_engineering_comms/0007/reply_0000.md new file mode 100644 index 0000000000..9f2e5dcd54 --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0007/reply_0000.md @@ -0,0 +1,63 @@ +## 2022-07-26 @pdxjohnny Engineering Logs + +- TODO + - [ ] Get involved in SCITT + - [x] Meetings + - https://docs.google.com/document/d/1vf-EliXByhg5HZfgVbTqZhfaJFCmvMdQuZ4tC-Eq6wg/edit# + - Weekly Monday at 8 AM Pacific + - Joining today + - https://armltd.zoom.us/j/99133885299?pwd=b0w4aGorRkpjL3ZHa2NPSmRiNHpXUT09 + - [x] Mailing list + - https://www.ietf.org/mailman/listinfo/scitt + - https://mailarchive.ietf.org/arch/browse/scitt/ + - [ ] Slack + - https://mailarchive.ietf.org/arch/msg/scitt/PbvoKOX996cNHJEOrjReaNlum64/ + - Going to email Orie Steele orie (at) transmute.industries to ask for an invite. + - [x] Kick off OSS scans + - Targeting collaboration with CRob on metrics insertion to OpenSSF DB + - [ ] Finish Q3 plans (Gantt chart, meeting templates, etc.) + - Generate template for auto creation to fill every meeting / fillable pre-meeting + - [ ] Follow up with OneAPI folks + - [ ] Overlay to `alice shouldi contribute` to create git repos when found from forks of PyPi packages + - [ ] Associated tutorial + - [ ] Linked from `README` + - [ ] Finish out `alice please contribute recommended community standards` + dynamic opimp for meta issue body creation + - [ ] Associated tutorial + - [ ] Linked from `README` and `CONTRIBUTING` + - [ ] Software Analysis Trinity diagram showing Human Intent, Static Analysis, and Dynamic Analysis to represent the soul of the software / entity and the process taken to improve it. 
+ - [SoftwareAnalysisTrinity.drawio.xml](https://github.com/intel/dffml/files/9190063/SoftwareAnalysisTrinity.drawio.xml.txt) + +![Software Analysis Trinity drawio](https://user-images.githubusercontent.com/5950433/181014158-4187950e-d0a4-4d7d-973b-dc414320e64f.svg) + +- Update current overlays to have lock taken on `AliceGitRepo` and then subflows with `ReadmeGitRepo` and `ContributingGitRepo`. + - This way the parent flow locks and they don't have to worry about losing the lock between operations. + +```console +$ git grep -C 22 run_custom +alice/please/contribute/recommended_community_standards/cli.py- async def cli_run_on_repo(self, repo: "CLIRunOnRepo"): +alice/please/contribute/recommended_community_standards/cli.py- # TODO Similar to Expand being an alias of Union +alice/please/contribute/recommended_community_standards/cli.py- # +alice/please/contribute/recommended_community_standards/cli.py- # async def cli_run_on_repo(self, repo: 'CLIRunOnRepo') -> SystemContext[StringInputSetContext[AliceGitRepo]]: +alice/please/contribute/recommended_community_standards/cli.py- # return repo +alice/please/contribute/recommended_community_standards/cli.py- # +alice/please/contribute/recommended_community_standards/cli.py- # Or ideally at class scope +alice/please/contribute/recommended_community_standards/cli.py- # +alice/please/contribute/recommended_community_standards/cli.py- # 'CLIRunOnRepo' -> SystemContext[StringInputSetContext[AliceGitRepo]] +alice/please/contribute/recommended_community_standards/cli.py- async with self.parent.__class__(self.parent.config) as custom_run_dataflow: +alice/please/contribute/recommended_community_standards/cli.py- async with custom_run_dataflow( +alice/please/contribute/recommended_community_standards/cli.py- self.ctx, self.octx +alice/please/contribute/recommended_community_standards/cli.py- ) as custom_run_dataflow_ctx: +alice/please/contribute/recommended_community_standards/cli.py- # This is the type cast
+alice/please/contribute/recommended_community_standards/cli.py- custom_run_dataflow.op = self.parent.op._replace( +alice/please/contribute/recommended_community_standards/cli.py- inputs={ +alice/please/contribute/recommended_community_standards/cli.py- "repo": AlicePleaseContributeRecommendedCommunityStandards.RepoString +alice/please/contribute/recommended_community_standards/cli.py- } +alice/please/contribute/recommended_community_standards/cli.py- ) +alice/please/contribute/recommended_community_standards/cli.py- # Set the dataflow to be the same flow +alice/please/contribute/recommended_community_standards/cli.py- # TODO Reuse ictx? Is that applicable? +alice/please/contribute/recommended_community_standards/cli.py- custom_run_dataflow.config.dataflow = self.octx.config.dataflow +alice/please/contribute/recommended_community_standards/cli.py: await dffml.run_dataflow.run_custom( +alice/please/contribute/recommended_community_standards/cli.py- custom_run_dataflow_ctx, {"repo": repo}, +alice/please/contribute/recommended_community_standards/cli.py- ) +``` \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0008/index.md b/docs/discussions/alice_engineering_comms/0008/index.md new file mode 100644 index 0000000000..6e546482f8 --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0008/index.md @@ -0,0 +1,10 @@ +# 2022-07-27 Engineering Logs + +- References + - kaniko coder k3d digitalocean + - The following were issues with kind which might also affect us + - https://github.com/GoogleContainerTools/kaniko/issues/2164 + - https://github.com/tektoncd/pipeline/commit/6542823c8330581fcfe6ba5a8ea7682a06510bcb + - It doesn't look like kaniko currently supports multi context builds + - Great example of communication and meeting procedures link to code + - https://lists.spdx.org/g/Spdx-tech/message/4699 \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0008/reply_0000.md
b/docs/discussions/alice_engineering_comms/0008/reply_0000.md new file mode 100644 index 0000000000..1865375f40 --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0008/reply_0000.md @@ -0,0 +1,351 @@ +## 2022-07-27 @pdxjohnny Engineering Logs + +- TODO + - [ ] Get involved in SCITT + - [x] Meetings + - https://docs.google.com/document/d/1vf-EliXByhg5HZfgVbTqZhfaJFCmvMdQuZ4tC-Eq6wg/edit# + - Weekly Monday at 8 AM Pacific + - Joining today + - https://armltd.zoom.us/j/99133885299?pwd=b0w4aGorRkpjL3ZHa2NPSmRiNHpXUT09 + - [x] Mailing list + - https://www.ietf.org/mailman/listinfo/scitt + - https://mailarchive.ietf.org/arch/browse/scitt/ + - [ ] Slack + - https://mailarchive.ietf.org/arch/msg/scitt/PbvoKOX996cNHJEOrjReaNlum64/ + - Going to email Orie Steele orie (at) transmute.industries to ask for an invite. + - [x] Kick off OSS scans + - Targeting collaboration with CRob on metrics insertion to OpenSSF DB + - [ ] Finish Q3 plans (Gantt chart, meeting templates, etc.) + - Generate template for auto creation to fill every meeting / fillable pre-meeting + - [ ] Follow up with OneAPI folks + - [ ] Overlay to `alice shouldi contribute` to create git repos when found from forks of PyPi packages + - [ ] Associated tutorial + - [ ] Linked from `README` + - [ ] Finish out `alice please contribute recommended community standards` + dynamic opimp for meta issue body creation + - [ ] Associated tutorial + - [ ] Linked from `README` and `CONTRIBUTING` + - [ ] Software Analysis Trinity diagram showing Human Intent, Static Analysis, and Dynamic Analysis to represent the soul of the software / entity and the process taken to improve it. 
+ - [SoftwareAnalysisTrinity.drawio.xml](https://github.com/intel/dffml/files/9190063/SoftwareAnalysisTrinity.drawio.xml.txt) + +### Refactoring and Thinking About Locking of Repos for Contributions + +- Metadata + - Date: 2022-07-27 20:30 UTC -7 +- Saving this diff which was some work on dynamic application of overlay + so as to support fixup of the OpImp for `meta_issue_body()`'s inputs. + - We are going to table this for now for time reasons, but if someone + wants to pick it up before @pdxjohnny is back in September, please + give it a go (create an issue). +- Noticed that we have an issue with adding new files and locking. The current + lock is on the `git_repository/GitRepoSpec`. + - We then convert to `AliceGitRepo`, at which point anything take `AliceGitRepo` +- `alice` + - Goal: Display Alice and software analysis trinity + - https://free-images.com/search/?q=alice%27s+adventures+in+wonderland&cat=st + - https://free-images.com/display/de_alices_abenteuer_im_43.html + - https://github.com/KhorSL/ASCII-ART + - Completed in d067273f8571b6a56733336663aaebc3acb3a701 + +![alice looking up](https://user-images.githubusercontent.com/5950433/181431145-18cfc8a7-28c8-486f-80f9-8b250e0b0943.png) + +```console +$ python ascii_art.py /mnt/c/Users/Johnny/Downloads/alice-looking-up-white-background.png +``` + +```console +$ alice +usage: alice [-h] [-log LOG] {please,shouldi,threats} ... + + .,*&&888@@#&:, + .:&::,...,:&#@@@#:. + .o,. ..:8@@#@@+ + .8o+,+o*+*+,+:&#@@#8@@. + &8&###@#&..*:8#@@#@#@@&+. + ,@:#@##@@8,:&#@@@###@88@@. + ,#@8&#@@@#o:#@@@@#8#@#8+&#. + +8####@@@@###@@@888#@@@#oo#. + .*8@###@@@@@@@@@#o*#@@#@@#8o@, + +###@#o8&#@@##8::##@@@&&#@8#&+ + o@8&#&##::.,o&+88#&8##8*@@#@#, + .##888&&oo#&o8###8&o##8##&####8, + .&#@8&:+o+&@@@#8#&8:8@@@@@#8@@@oo+ + ,&&#@##oo+*:@###X,@@@@#@o&##&8#@o,. + ,#&###@@8:*,#o&@@@@##:&#@###*.&o++o#@@#&+ + o8&8o8@#8+,,#.88#@#&@&&#@##++*&#o&&&#@@@@. + *88:,#8&#,o+:+@&8#:8@8&8#@@&o++,*++*+:#@@*. 
+ .+#:o###@8o&8*@o&o8@o888@@@o+:o*&&,@#:&@@@, + *+&@8&#@o#8+8*#+8#+88@@@@@@&@###8##@8:*, + +o.@##@@@&88@*8@:8@@@@@@:.. ,8@:++. + +&++8@@@@##@@@@@@@@@@@+ 88 + &. *@8@:+##o&888#@@@, .#+ + &. ,@+o,.::+*+*:&#&, ,@. + &. .@8*,. ,*+++.+* :8+ + :+ .#@::. .8:.:** .8@@o, + .o. #@+ :@,.&* .:@@@@@@8**. + +&. :@o,+.*o,*, .*@@@@@@@@@@#o + .*:&o. 8@o:,*:, .o@@#8&&@@@@#@@@* + ,*:+:::o.*&8+,++ ,&@@#: * :@@88@@@#:. + ,::**:o:.,&*+*8: *8@@##o *,.8@@#8#@#@#+ + *:+*&o8:. ,o,o:8@+o@@88:*@+ +: +#@#####8##&. + ,:&::88&, .&:#o#@@@#,+&&*#&. .:,.&#@#88#####&, + +::o+&8:. :##88@@@@:.:8o+&8&. .. +8###&8&##&88* + .:*+*.8#: ,o*.+&@@#@8,,o8*+8##+ .+#8##8&⊸:. + ,:o., . .:8*. .o, &#,*:8:+,&*:, .8@@#o&&##8:. + .*o.*,+o8#* +8&, .::. .88.+:8o: ,+:, ,o#@#8&o8##+ + +o, .+,,o#8+,8@o**.,o*, :8o +*8#* +&, ,*o@@#@&8&oo8&:, + oo*+,,,*8@#..&@8:**:oo+. +8#* *+#@:...oo+ .**:8@@@ooo&:&o##+ + ::+..,++#@,.:##o&o**,....oo#++#8#@:.,:8&:.....*&@@#:oo*&oo&#@* + .+**:*8@o,+##&o:+,,,+,,o*8#,,8@#@:,,+*o*++,,,,+&#@8*8o88&::*. .,,,,,++, + ..8@++#@#88:,,,.,,,:+#&,,#@@#:,,.,&o*,.+++*:#@8+:*+. ......,:+*&,,..... + +:&8#@@##8&+,,,***@&,.8@@@*,,,.:o8&o&*o&o&o. .,.****::*:o*:o*o+,. + ...,*:*o&&o*8@@&o8@@@8+,,+:&&:+,... ,++*&oo&8&&&oo#@##8#&8:. + o@#@@@@#@@@@@@@,..... ..,,.+*::o#@##@##@#@#########@@@8:,. + ,@##@@88#@@@@@8 .:***oo*#8###8#@#@#@#@####@#@###@@#8&#: + 8+.,8+..,*o#@+ ,o+o88&88###@8#######@8#8#88#8#88##88#& + *o *+ #8 . ,*o&#@##@@@@@@@@@######8#888&&oo:8: + 8, ,& +@* .ooo&#@@@@@#@@@@@@####@##8#8##oo:o&:, + +& &, .@#. .:8#@@@@@@@@@@##8#####8#o&*:8&&8: + o* ,o o@& +o#@@@@@@@@#o&o88:&+ooo&:*::o:o&**o.:*+ + .8. 8.,o#8 .+&#@@@@@@@@&o+,::*+*:+:, ,. ,.. .,. ,. + 8. 8.,.&@:*:&@@@@@@@@8o+, ,. + :@o:#,,o8&:o&@@@@#&:+. + .@@@@@@@@@@@#8&o+, + ,*:&#@#&o*,.. 
+ + /\ + / \ + Intent + / \ + / \ + / \ + / \ + / \ + / Alice is Here \ + / \ + / \ + /______________________\ + + Dynamic Analysis Static Analysis + + Alice's source code: https://github.com/intel/dffml/tree/alice/entities/alice + How we built Alice: https://github.com/intel/dffml/tree/alice/docs/tutorials/rolling_alice + How to extend Alice: https://github.com/intel/dffml/blob/alice/entities/alice/CONTRIBUTING.rst + Comment to get involved: https://github.com/intel/dffml/discussions/1406 + + +positional arguments: + {please,shouldi,threats} + +optional arguments: + -h, --help show this help message and exit + -log LOG Logging Level +``` + +- TODO + - [ ] Auto fork repo before push + - [ ] Update origin to push to + - [ ] Create PR + - [ ] Update README to fix demos + - [ ] Update CONTRIBUTING with tutorial on adding + `CONTRIBUTING.md` check and contribution + +**entities/alice/alice/timelines.py** + +```python +""" +Helpers for the timelines we support +""" + +# Trinity Day 0 +ALICE_DAY_0_GREGORIAN = datetime.datetime(2022, 4, 16) + +def date_alice_from_gregorian(date: str) -> int: + # TODO + return ALICE_DAY_0_GREGORIAN +``` + +```diff +diff --git a/dffml/base.py b/dffml/base.py +index fea0ef7220..9d6cd886fa 100644 +--- a/dffml/base.py ++++ b/dffml/base.py +@@ -237,6 +237,7 @@ def convert_value(arg, value, *, dataclass=None): + # before checking if the value is an instance of that + # type. Since it doesn't make sense to check if the + # value is an instance of something that's not a type. 
++ print(possible_type, value) + if isinstance(possible_type, type) and isinstance( + value, possible_type + ): +diff --git a/dffml/df/system_context/system_context.py b/dffml/df/system_context/system_context.py +index e055a343f1..063547ad0c 100644 +--- a/dffml/df/system_context/system_context.py ++++ b/dffml/df/system_context/system_context.py +@@ -90,11 +90,11 @@ class SystemContextConfig: + # links: 'SystemContextConfig' + overlay: Union["SystemContextConfig", DataFlow] = field( + "The overlay we will apply with any overlays to merge within it (see default overlay usage docs)", +- default=APPLY_INSTALLED_OVERLAYS, ++ default=None, + ) + orchestrator: Union["SystemContextConfig", BaseOrchestrator] = field( + "The system context who's default flow will be used to produce an orchestrator which will be used to execute this system context including application of overlays", +- default_factory=lambda: MemoryOrchestrator, ++ default=None, + ) + + +@@ -131,6 +131,7 @@ class SystemContext(BaseDataFlowFacilitatorObject): + ) + # TODO(alice) Apply overlay + if self.config.overlay not in (None, APPLY_INSTALLED_OVERLAYS): ++ print(self.config.overlay) + breakpoint() + raise NotImplementedError( + "Application of overlays within SystemContext class entry not yet supported" +diff --git a/dffml/high_level/dataflow.py b/dffml/high_level/dataflow.py +index d180b5c302..d595ae1cb4 100644 +--- a/dffml/high_level/dataflow.py ++++ b/dffml/high_level/dataflow.py +@@ -206,12 +206,25 @@ async def run( + # the of the one that got passed in and the overlay. + if inspect.isclass(overlay): + overlay = overlay() ++ # TODO Move this into Overlay.load. Create a system context to ++ # execute the overlay if it is not already. 
++ known_overlay_types = (DataFlow, SystemContext) ++ if not isinstance(overlay, known_overlay_types): ++ raise NotImplementedError(f"{overlay} is not a known type {known_overlay_types}") ++ if isinstance(overlay, DataFlow): ++ overlay = SystemContext( ++ upstream=overlay, ++ ) + # TODO(alice) overlay.deployment("native.python.overlay.apply") + apply_overlay = overlay.deployment() + async for _ctx, result in apply_overlay( + dataflow=dataflow, + ): ++ print("FEEDFACE", _ctx, result) ++ breakpoint() ++ return + continue ++ + # TODO + resultant_system_context = SystemContext( + upstream=result["overlays_merged"], overlay=None, +diff --git a/dffml/overlay/overlay.py b/dffml/overlay/overlay.py +index 13a50d9c10..0a01d38de9 100644 +--- a/dffml/overlay/overlay.py ++++ b/dffml/overlay/overlay.py +@@ -124,7 +124,7 @@ DFFML_MAIN_PACKAGE_OVERLAY = DataFlow( + stage=Stage.OUTPUT, + inputs={ + "merged": DataFlowAfterOverlaysMerged, +- "dataflow_we_are_applying_overlays_to_by_running_overlay_dataflow_and_passing_as_an_input": DataFlowWeAreApplyingOverlaysToByRunningOverlayDataflowAndPassingAsAnInput, ++ "upstream": DataFlowWeAreApplyingOverlaysToByRunningOverlayDataflowAndPassingAsAnInput, + }, + outputs={"overlayed": DataFlowAfterOverlaysApplied,}, + multi_output=False, +@@ -208,15 +208,12 @@ merge_implementations( + DFFML_OVERLAYS_INSTALLED.update(auto_flow=True) + + # Create Class for calling operations within the System Context as methods +-DFFMLOverlaysInstalled = SystemContext.subclass( +- "DFFMLOverlaysInstalled", +- { +- "upstream": {"default_factory": lambda: DFFML_OVERLAYS_INSTALLED}, +- # TODO(alice) We'll need to make sure we have code to instantiate and +- # instance of a class if only a class is given an not an instance. 
+- "overlay": {"default_factory": lambda: None}, +- "orchestrator": {"default_factory": lambda: MemoryOrchestrator()}, +- }, ++DFFMLOverlaysInstalled = SystemContext( ++ upstream=DFFML_OVERLAYS_INSTALLED, ++ # TODO(alice) We'll need to make sure we have code to instantiate and ++ # instance of a class if only a class is given an not an instance. ++ overlay=None, ++ orchestrator=MemoryOrchestrator(), + ) + + # Callee +diff --git a/entities/alice/alice/please/contribute/recommended_community_standards/alice/operations/github/issue.py b/entities/alice/alice/please/contribute/recommended_community_standards/alice/operations/github/issue.py +index 46d20c8c85..fff5d4928b 100644 +--- a/entities/alice/alice/please/contribute/recommended_community_standards/alice/operations/github/issue.py ++++ b/entities/alice/alice/please/contribute/recommended_community_standards/alice/operations/github/issue.py +@@ -18,6 +18,14 @@ from ....recommended_community_standards import AliceGitRepo, AlicePleaseContrib + from ....dffml.operations.git.contribute import AlicePleaseContributeRecommendedCommunityStandardsOverlayGit + + ++GitHubIssue = NewType("GitHubIssue", str) ++ ++ ++@dataclasses.dataclass ++class RecommendedCommunityStandardContribution: ++ path: pathlib.Path ++ issue: GitHubIssue ++ + + class AlicePleaseContributeRecommendedCommunityStandardsOverlayGitHubIssue: + """ +@@ -39,6 +47,7 @@ class AlicePleaseContributeRecommendedCommunityStandardsOverlayGitHubIssue: + MetaIssueTitle = NewType("MetaIssueTitle", str) + MetaIssueBody = NewType("MetaIssueBody", str) + ++ # TODO This should only be run if there is a need for a README + # body: Optional['ContributingIssueBody'] = "References:\n- https://docs.github.com/articles/setting-guidelines-for-repository-contributors/", + async def readme_issue( + self, +@@ -79,13 +88,40 @@ class AlicePleaseContributeRecommendedCommunityStandardsOverlayGitHubIssue: + """ + ).lstrip() + +- # TODO(alice) There is a bug with Optional which can be 
revield by use here ++ + @staticmethod ++ async def readme_contribution( ++ issue: "ReadmeIssue", ++ path: AlicePleaseContributeRecommendedCommunityStandards.ReadmePath, ++ ) -> RecommendedCommunityStandardContribution: ++ return RecommendedCommunityStandardContribution( ++ path=path, ++ issue=issue, ++ ) ++ ++ ++ """ ++ @dffml.op( ++ stage=dffml.Stage.OUTPUT, ++ ) ++ async def collect_recommended_community_standard_contributions( ++ self, ++ ) -> List[RecommendedCommunityStandardContribution]: ++ async with self.octx.ictx.definitions(self.ctx) as od: ++ return [item async for item in od.inputs(RecommendedCommunityStandardContribution)] ++ """ ++ ++ ++ # TODO(alice) There is a bug with Optional which can be revield by use here + def meta_issue_body( + repo: AliceGitRepo, + base: AlicePleaseContributeRecommendedCommunityStandardsOverlayGit.BaseBranch, +- readme_path: AlicePleaseContributeRecommendedCommunityStandards.ReadmePath, +- readme_issue: ReadmeIssue, ++ # recommended_community_standard_contributions: List[RecommendedCommunityStandardContribution], ++ # TODO On @op inspect paramter if Collect is found on an input, wrap the ++ # operation in a subflow and add a generic version of ++ # collect_recommended_community_standard_contributions to the flow as an ++ # autostart or triggered via auto start operation. 
++ # recommended_community_standard_contributions: Collect[List[RecommendedCommunityStandardContribution]], + ) -> "MetaIssueBody": + """ + >>> AlicePleaseContributeRecommendedCommunityStandardsGitHubIssueOverlay.meta_issue_body( +@@ -98,6 +134,7 @@ class AlicePleaseContributeRecommendedCommunityStandardsOverlayGitHubIssue: + - [] [License](https://github.com/intel/dffml/blob/main/LICENSE) + - [] Security + """ ++ readme_issue, readme_path = recommended_community_standard_contributions[0] + return "\n".join( + [ + "- [" +``` \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0009/index.md b/docs/discussions/alice_engineering_comms/0009/index.md new file mode 100644 index 0000000000..5b091290ab --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0009/index.md @@ -0,0 +1 @@ +# 2022-07-28 Engineering Logs \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0009/reply_0000.md b/docs/discussions/alice_engineering_comms/0009/reply_0000.md new file mode 100644 index 0000000000..e00ad1f60d --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0009/reply_0000.md @@ -0,0 +1,519 @@ +## 2022-07-28 @pdxjohnny Engineering Logs + +- TODO + - [ ] Get involved in SCITT + - [x] Meetings + - https://docs.google.com/document/d/1vf-EliXByhg5HZfgVbTqZhfaJFCmvMdQuZ4tC-Eq6wg/edit# + - Weekly Monday at 8 AM Pacific + - Joining today + - https://armltd.zoom.us/j/99133885299?pwd=b0w4aGorRkpjL3ZHa2NPSmRiNHpXUT09 + - [x] Mailing list + - https://www.ietf.org/mailman/listinfo/scitt + - https://mailarchive.ietf.org/arch/browse/scitt/ + - [ ] Slack + - https://mailarchive.ietf.org/arch/msg/scitt/PbvoKOX996cNHJEOrjReaNlum64/ + - Going to email Orie Steele orie (at) transmute.industries to ask for an invite. + - [x] Kick off OSS scans + - Targeting collaboration with CRob on metrics insertion to OpenSSF DB + - [ ] Finish Q3 plans (Gantt chart, meeting templates, etc.) 
+    - Generate template for auto creation to fill every meeting / fillable pre-meeting
+  - [ ] Overlay to `alice shouldi contribute` to create git repos when found from forks of PyPi packages
+    - [ ] Associated tutorial
+      - [ ] Linked from `README`
+  - [ ] Finish out `alice please contribute recommended community standards`
+        dynamic opimp for meta issue body creation
+    - [ ] Associated tutorial
+      - [ ] Linked from `README` and `CONTRIBUTING`
+  - [ ] Software Analysis Trinity diagram showing Human Intent, Static Analysis, and Dynamic Analysis to represent the soul of the software / entity and the process taken to improve it.
+    - [SoftwareAnalysisTrinity.drawio.xml](https://github.com/intel/dffml/files/9190063/SoftwareAnalysisTrinity.drawio.xml.txt)
+- Noticed that we have an issue with adding new files and locking. The current
+  lock is on the `git_repository/GitRepoSpec`.
+  - We then convert to `AliceGitRepo`, at which point anything taking `AliceGitRepo` is no longer covered by that lock.
+  - alice: please: contribute: recommended community standards: Refactoring into overlays associated with each file contributed
+    - Completed in 1a71dbe3ab3743430ce2783f4210a6cd807c36a1
+
+### 43
+
+```
+(Pdb) custom_run_dataflow_ctx.config.dataflow.seed.append(dffml.Input(value=repo, definition=definition, origin=('alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:alice_contribute_readme', 'result')))
+(Pdb) custom_run_dataflow_ctx.config.dataflow.seed
+[Input(value=origin, definition=writable.github.remote.origin), Input(value=master, definition=repo.git.base.branch), Input(value=GitRepoSpec(directory='/tmp/dffml-feature-git-hxnacg5_', URL='https://github.com/pdxjohnny/testaaaa'), definition=ReadmeGitRepo)]
+```
+
+- Attempting to figure out why an operation is not being called
+  - `contribute_readme_md` should be getting `base`, but is not.
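The matching rule at play here can be sketched in a few lines (illustrative only, not DFFML's actual `gather_inputs` from `memory.py`): an Input only satisfies an operation parameter when both its definition matches and its origin is one the input flow allows for that parameter. A matching definition alone is not enough, which is how `base` can gather empty even though a `repo.git.base.branch` value is sitting in the network.

```python
from dataclasses import dataclass
from typing import Any, Dict, List


@dataclass
class Input:
    value: Any
    definition: str
    origin: str  # "seed" or an operation-output style origin


def sketch_gather_inputs(
    operation_inputs: Dict[str, str],
    input_flow: Dict[str, List[str]],
    available: List[Input],
) -> Dict[str, List[Input]]:
    # For each parameter, keep only Inputs whose definition AND origin both
    # match what the input flow expects for that parameter.
    return {
        name: [
            item
            for item in available
            if item.definition == definition
            and item.origin in input_flow[name]
        ]
        for name, definition in operation_inputs.items()
    }
```

For example, with `base` expecting origin `"seed"` while the only branch value was produced by `determin_base_branch`, the gather for `base` comes back empty, exactly as in the pdb session above.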
+ +``` +{'_': {ReadmeGitRepo: [Input(value=GitRepoSpec(directory='/tmp/dffml-feature-git-68ghk7vd', URL='https://github.com/pdxjohnny/testaaaa'), definition=ReadmeGitRepo)], + ReadmeIssue: [Input(value=https://github.com/pdxjohnny/testaaaa/issues/108, definition=ReadmeIssue)], + ReadmePath: [Input(value=/tmp/dffml-feature-git-68ghk7vd/README.md, definition=ReadmePath)], + github.pr.body: [Input(value=Closes: https://github.com/pdxjohnny/testaaaa/issues/108, definition=github.pr.body)], + repo.git.base.branch: [Input(value=master, definition=repo.git.base.branch)], + repo.readme.git.commit.message: [Input(value=Recommended Community Standard: README + +Closes: https://github.com/pdxjohnny/testaaaa/issues/108 +, definition=repo.readme.git.commit.message)], + writable.github.remote.origin: [Input(value=origin, definition=writable.github.remote.origin)]}, + 'alternate_definitions': [], + 'by_origin': {('alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayGit:determin_base_branch', 'result'): [Input(value=master, definition=repo.git.base.branch)], + ('alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayGitHub:github_owns_remote', 'result'): [Input(value=origin, definition=writable.github.remote.origin)], + ('alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:alice_contribute_readme', 'result'): [Input(value=GitRepoSpec(directory='/tmp/dffml-feature-git-68ghk7vd', URL='https://github.com/pdxjohnny/testaaaa'), definition=ReadmeGitRepo)], + ('alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:create_readme_file_if_not_exists', 'result'): [Input(value=/tmp/dffml-feature-git-68ghk7vd/README.md, definition=ReadmePath)], + ('alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:readme_commit_message', 'result'): [Input(value=Recommended 
Community Standard: README + +Closes: https://github.com/pdxjohnny/testaaaa/issues/108 +, definition=repo.readme.git.commit.message)], + ('alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:readme_issue', 'result'): [Input(value=https://github.com/pdxjohnny/testaaaa/issues/108, definition=ReadmeIssue)], + ('alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:readme_pr_body', 'result'): [Input(value=Closes: https://github.com/pdxjohnny/testaaaa/issues/108, definition=github.pr.body)]}, + 'check_for_default_value': [repo.git.base.branch], + 'contexts': [MemoryInputNetworkContextEntry(ctx=Input(value=GitRepoSpec(directory='/tmp/dffml-feature-git-68ghk7vd', URL='https://github.com/pdxjohnny/testaaaa'), definition=ReadmeGitRepo), definitions={ReadmeGitRepo: [Input(value=GitRepoSpec(directory='/tmp/dffml-feature-git-68ghk7vd', URL='https://github.com/pdxjohnny/testaaaa'), definition=ReadmeGitRepo)], writable.github.remote.origin: [Input(value=origin, definition=writable.github.remote.origin)], repo.git.base.branch: [Input(value=master, definition=repo.git.base.branch)], ReadmePath: [Input(value=/tmp/dffml-feature-git-68ghk7vd/README.md, definition=ReadmePath)], ReadmeIssue: [Input(value=https://github.com/pdxjohnny/testaaaa/issues/108, definition=ReadmeIssue)], repo.readme.git.commit.message: [Input(value=Recommended Community Standard: README + +Closes: https://github.com/pdxjohnny/testaaaa/issues/108 +, definition=repo.readme.git.commit.message)], github.pr.body: [Input(value=Closes: https://github.com/pdxjohnny/testaaaa/issues/108, definition=github.pr.body)]}, by_origin={('alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:alice_contribute_readme', 'result'): [Input(value=GitRepoSpec(directory='/tmp/dffml-feature-git-68ghk7vd', URL='https://github.com/pdxjohnny/testaaaa'), definition=ReadmeGitRepo)], 
('alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayGitHub:github_owns_remote', 'result'): [Input(value=origin, definition=writable.github.remote.origin)], ('alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayGit:determin_base_branch', 'result'): [Input(value=master, definition=repo.git.base.branch)], ('alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:create_readme_file_if_not_exists', 'result'): [Input(value=/tmp/dffml-feature-git-68ghk7vd/README.md, definition=ReadmePath)], ('alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:readme_issue', 'result'): [Input(value=https://github.com/pdxjohnny/testaaaa/issues/108, definition=ReadmeIssue)], ('alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:readme_commit_message', 'result'): [Input(value=Recommended Community Standard: README + +Closes: https://github.com/pdxjohnny/testaaaa/issues/108 +, definition=repo.readme.git.commit.message)], ('alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:readme_pr_body', 'result'): [Input(value=Closes: https://github.com/pdxjohnny/testaaaa/issues/108, definition=github.pr.body)]})], + 'ctx': Input(value=GitRepoSpec(directory='/tmp/dffml-feature-git-68ghk7vd', URL='https://github.com/pdxjohnny/testaaaa'), definition=ReadmeGitRepo), + 'dataflow': , + 'definition': repo.git.base.branch, + 'gather': {'base': [], + 'repo': [Input(value=GitRepoSpec(directory='/tmp/dffml-feature-git-68ghk7vd', URL='https://github.com/pdxjohnny/testaaaa'), definition=ReadmeGitRepo)]}, + 'handle_string': "Input(value=GitRepoSpec(directory='/tmp/dffml-feature-git-68ghk7vd', " + "URL='https://github.com/pdxjohnny/testaaaa'), " + 'definition=ReadmeGitRepo)', + 'input_flow': InputFlow(inputs={'repo': 
[{'alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:alice_contribute_readme': 'result'}], 'base': ['seed'], 'commit_message': [{'alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:readme_commit_message': 'result'}]}, conditions=[]), + 'input_name': 'base', + 'input_source': 'seed', + 'input_sources': ['seed'], + 'item': Input(value=GitRepoSpec(directory='/tmp/dffml-feature-git-68ghk7vd', URL='https://github.com/pdxjohnny/testaaaa'), definition=ReadmeGitRepo), + 'operation': Operation(name='alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:contribute_readme_md', inputs={'repo': ReadmeGitRepo, 'base': repo.git.base.branch, 'commit_message': repo.readme.git.commit.message}, outputs={'result': repo.readme.git.branch}, stage=, conditions=[], expand=[], instance_name='alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:contribute_readme_md', validator=False, retry=0), + 'origin': 'seed', + 'origins': ['seed'], + 'pprint': , + 'rctx': , + 'self': } +> /home/pdxjohnny/Documents/python/dffml/dffml/df/memory.py(788)gather_inputs() +-> return +(Pdb) gather +{'repo': [Input(value=GitRepoSpec(directory='/tmp/dffml-feature-git-68ghk7vd', URL='https://github.com/pdxjohnny/testaaaa'), definition=ReadmeGitRepo)], 'base': []} +(Pdb) operation.inputs +{'repo': ReadmeGitRepo, 'base': repo.git.base.branch, 'commit_message': repo.readme.git.commit.message} +(Pdb) self.ctxhd.keys() +dict_keys(["Input(value=GitRepoSpec(directory='/tmp/dffml-feature-git-68ghk7vd', URL='https://github.com/pdxjohnny/testaaaa'), definition=ReadmeGitRepo)"]) +(Pdb) from pprint import pprint +(Pdb) pprint(inputs.definitions) +{ReadmeGitRepo: [Input(value=GitRepoSpec(directory='/tmp/dffml-feature-git-68ghk7vd', URL='https://github.com/pdxjohnny/testaaaa'), definition=ReadmeGitRepo)], + ReadmeIssue: 
[Input(value=https://github.com/pdxjohnny/testaaaa/issues/108, definition=ReadmeIssue)],
+ ReadmePath: [Input(value=/tmp/dffml-feature-git-68ghk7vd/README.md, definition=ReadmePath)],
+ github.pr.body: [Input(value=Closes: https://github.com/pdxjohnny/testaaaa/issues/108, definition=github.pr.body)],
+ repo.git.base.branch: [Input(value=master, definition=repo.git.base.branch)],
+ repo.readme.git.commit.message: [Input(value=Recommended Community Standard: README
+
+Closes: https://github.com/pdxjohnny/testaaaa/issues/108
+, definition=repo.readme.git.commit.message)],
+ writable.github.remote.origin: [Input(value=origin, definition=writable.github.remote.origin)]}
+(Pdb) gather
+{'repo': [Input(value=GitRepoSpec(directory='/tmp/dffml-feature-git-68ghk7vd', URL='https://github.com/pdxjohnny/testaaaa'), definition=ReadmeGitRepo)], 'base': []}
+(Pdb) operation.inputs
+{'repo': ReadmeGitRepo, 'base': repo.git.base.branch, 'commit_message': repo.readme.git.commit.message}
+```
+
+- Suspect the input is being discarded because of a mismatched origin; if not that, will check the definition
+  - Found out that it was a seed vs. output origin mismatch
+  - Found out that BaseBranch comes from OverlayGit
+  - Registered OverlayGit as an overlay of OverlayReadme so that its definitions get loaded
+    - This way `auto_flow` will make the expected origin the output from OverlayGit operations
+      rather than seed (the default when no matching outputs are seen on DataFlow init).
+  - We found it created an infinite loop
+  - Will try reusing the redundancy checker, that seems to be doing well
+- https://github.com/intel/dffml/issues/1408
+- Now debugging why `readme_pr` is not called. OverlayGit definitions were seen earlier
+  on subflow start to be present, so it must be something else.
+  - The logs tell us that alice_contribute_readme is returning `None`, which means
+    that the downstream operation is not called, since None means no return value
+    in this case.
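That return-value convention can be sketched minimally (illustrative helper, not DFFML's actual scheduler): when an operation returns `None`, no outputs are added to the input network, so any operation whose input flow points at those outputs never becomes runnable.

```python
def propagate_outputs(operation_name, result, input_network):
    # Returning None from an operation means "no outputs": nothing is added
    # to the input network, so downstream operations waiting on this
    # operation's outputs never fire.
    if result is None:
        return
    for output_name, value in result.items():
        input_network.append(((operation_name, output_name), value))


network = []
# Mirrors the observed bug: alice_contribute_readme returning None means
# readme_pr never receives its 'repo' input.
propagate_outputs("OverlayREADME:alice_contribute_readme", None, network)
assert network == []
```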
+
+```
+DEBUG:dffml.MemoryOperationImplementationNetworkContext:alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:alice_contribute_readme Outputs: None
+```
+
+- Future
+  - `run_custom` Optionally support forward subflow
+- TODO
+  - [ ] Set definition property `AliceGitRepo.lock` to `True`
+
+
+### 44
+
+- Found out that util: subprocess: run command events: Do not return after yield of stdout/err
+  - Fixed in b6eea6ed4549f9e7a89aab6306a51213b2bf36c9
+
+```console
+$ (for i in $(echo determin_base_branch readme_pr_body contribute_readme_md github_owns_remote alice_contribute_readme); do grep -rn "${i} Outputs" .output/2022-07-28-14-11.txt; done) | sort | uniq | sort
+354:DEBUG:dffml.MemoryOperationImplementationNetworkContext:alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayGitHub:github_owns_remote Outputs: {'result': 'origin'}
+361:DEBUG:dffml.MemoryOperationImplementationNetworkContext:alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayGit:determin_base_branch Outputs: {'result': 'master'}
+450:DEBUG:dffml.MemoryOperationImplementationNetworkContext:alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:readme_pr_body Outputs: {'result': 'Closes: https://github.com/pdxjohnny/testaaaa/issues/188'}
+472:DEBUG:dffml.MemoryOperationImplementationNetworkContext:alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:contribute_readme_md Outputs: {'result': 'alice-contribute-recommended-community-standards-readme'}
+479:DEBUG:dffml.MemoryOperationImplementationNetworkContext:alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:alice_contribute_readme Outputs: None
+```
+
+```
+(Pdb)
+
+DEBUG:dffml.MemoryOperationImplementationNetworkContext:alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:contribute_readme_md Outputs: {'result': 'alice-contribute-recommended-community-standards-readme'}
+(Pdb) pprint(readme_dataflow.flow['alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:readme_pr'].inputs)
+{'base': [{'alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayGit:determin_base_branch': 'result'}],
+ 'body': [{'alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:readme_pr_body': 'result'}],
+ 'head': [{'alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:contribute_readme_md': 'result'}],
+ 'origin': ['seed'],
+ 'repo': [{'alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:alice_contribute_readme': 'result'}],
+ 'title': [{'alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:readme_pr_title': 'result'}]}
+```
+
+- origin is set to seed
+  - `'origin': ['seed']` was there because `OverlayGitHub.github_owns_remote` is not in the flow
+  - We forgot to add it to `entry_points.txt`, added
+
+```console
+$ dffml service dev export alice.cli:AlicePleaseContributeCLIDataFlow | tee alice.please.contribute.recommended_community_standards.json
+$ (echo -e 'HTTP/1.0 200 OK\n' && dffml dataflow diagram -shortname alice.please.contribute.recommended_community_standards.json) | nc -Nlp 9999;
+```
+
+- Opens
+  - `guessed_repo_string_means_no_git_branch_given` is feeding `git_repo_default_branch` but `dffml dataflow diagram` just has a bug because it's not showing the connection.
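Conceptually, every edge in a diagram like the one below comes from an input flow mapping: one `producer --> consumer` edge per operation-output source, with `'seed'` sources producing no edge. A rough sketch of that derivation (hypothetical helper, not the actual `dffml dataflow diagram` implementation), which also shows how a mapping that never makes it into the flow simply drops an edge from the rendered diagram:

```python
def flow_to_mermaid_edges(flow):
    # flow maps operation instance name -> {parameter: [sources]}, where each
    # source is either the string 'seed' or {producer_instance_name: output_name}.
    edges = []
    for consumer, parameters in flow.items():
        for sources in parameters.values():
            for source in sources:
                if source == "seed":
                    continue  # seed inputs have no producing operation
                for producer in source:
                    edges.append(f"{producer} --> {consumer}")
    return edges


# Minimal example using names from the flow above
flow = {
    "readme_pr": {
        "base": [{"OverlayGit:determin_base_branch": "result"}],
        "origin": ["seed"],
    },
}
print("\n".join(flow_to_mermaid_edges(flow)))
```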
+ +```mermaid +graph TD +subgraph a759a07029077edc5c37fea0326fa281[Processing Stage] +style a759a07029077edc5c37fea0326fa281 fill:#afd388b5,stroke:#a4ca7a +subgraph 8cfb8cd5b8620de4a7ebe0dfec00771a[cli_has_repos] +style 8cfb8cd5b8620de4a7ebe0dfec00771a fill:#fff4de,stroke:#cece71 +d493c90433d19f11f33c2d72cd144940[cli_has_repos] +e07552ee3b6b7696cb3ddd786222eaad(cmd) +e07552ee3b6b7696cb3ddd786222eaad --> d493c90433d19f11f33c2d72cd144940 +cee6b5fdd0b6fbd0539cdcdc7f5a3324(wanted) +cee6b5fdd0b6fbd0539cdcdc7f5a3324 --> d493c90433d19f11f33c2d72cd144940 +79e1ea6822bff603a835fb8ee80c7ff3(result) +d493c90433d19f11f33c2d72cd144940 --> 79e1ea6822bff603a835fb8ee80c7ff3 +end +subgraph 0c2b64320fb5666a034794bb2195ecf0[cli_is_asking_for_recommended_community_standards] +style 0c2b64320fb5666a034794bb2195ecf0 fill:#fff4de,stroke:#cece71 +222ee6c0209f1f1b7a782bc1276868c7[cli_is_asking_for_recommended_community_standards] +330f463830aa97e88917d5a9d1c21500(cmd) +330f463830aa97e88917d5a9d1c21500 --> 222ee6c0209f1f1b7a782bc1276868c7 +ba29b52e9c5aa88ea1caeeff29bfd491(result) +222ee6c0209f1f1b7a782bc1276868c7 --> ba29b52e9c5aa88ea1caeeff29bfd491 +end +subgraph eac58e8db2b55cb9cc5474aaa402c93e[cli_is_meant_on_this_repo] +style eac58e8db2b55cb9cc5474aaa402c93e fill:#fff4de,stroke:#cece71 +6c819ad0228b0e7094b33e0634da9a38[cli_is_meant_on_this_repo] +dc7c5f0836f7d2564c402bf956722672(cmd) +dc7c5f0836f7d2564c402bf956722672 --> 6c819ad0228b0e7094b33e0634da9a38 +58d8518cb0d6ef6ad35dc242486f1beb(wanted) +58d8518cb0d6ef6ad35dc242486f1beb --> 6c819ad0228b0e7094b33e0634da9a38 +135ee61e3402d6fcbd7a219b0b4ccd73(result) +6c819ad0228b0e7094b33e0634da9a38 --> 135ee61e3402d6fcbd7a219b0b4ccd73 +end +subgraph 37887bf260c5c8e9bd18038401008bbc[cli_run_on_repo] +style 37887bf260c5c8e9bd18038401008bbc fill:#fff4de,stroke:#cece71 +9d1042f33352800e54d98c9c5a4223df[cli_run_on_repo] +e824ae072860bc545fc7d55aa0bca479(repo) +e824ae072860bc545fc7d55aa0bca479 --> 9d1042f33352800e54d98c9c5a4223df 
+40109d487bb9f08608d8c5f6e747042f(result) +9d1042f33352800e54d98c9c5a4223df --> 40109d487bb9f08608d8c5f6e747042f +end +subgraph 66ecd0c1f2e08941c443ec9cd89ec589[guess_repo_string_is_directory] +style 66ecd0c1f2e08941c443ec9cd89ec589 fill:#fff4de,stroke:#cece71 +737d719a0c348ff65456024ddbc530fe[guess_repo_string_is_directory] +33d806f9b732bfd6b96ae2e9e4243a68(repo_string) +33d806f9b732bfd6b96ae2e9e4243a68 --> 737d719a0c348ff65456024ddbc530fe +dd5aab190ce844673819298c5b8fde76(result) +737d719a0c348ff65456024ddbc530fe --> dd5aab190ce844673819298c5b8fde76 +end +subgraph 4ea6696419c4a0862a4f63ea1f60c751[create_branch_if_none_exists] +style 4ea6696419c4a0862a4f63ea1f60c751 fill:#fff4de,stroke:#cece71 +502369b37882b300d6620d5b4020f5b2[create_branch_if_none_exists] +fdcb9b6113856222e30e093f7c38065e(name) +fdcb9b6113856222e30e093f7c38065e --> 502369b37882b300d6620d5b4020f5b2 +bdcf4b078985f4a390e4ed4beacffa65(repo) +bdcf4b078985f4a390e4ed4beacffa65 --> 502369b37882b300d6620d5b4020f5b2 +5a5493ab86ab4053f1d44302e7bdddd6(result) +502369b37882b300d6620d5b4020f5b2 --> 5a5493ab86ab4053f1d44302e7bdddd6 +end +subgraph b1d510183f6a4c3fde207a4656c72cb4[determin_base_branch] +style b1d510183f6a4c3fde207a4656c72cb4 fill:#fff4de,stroke:#cece71 +476aecd4d4d712cda1879feba46ea109[determin_base_branch] +ff47cf65b58262acec28507f4427de45(default_branch) +ff47cf65b58262acec28507f4427de45 --> 476aecd4d4d712cda1879feba46ea109 +150204cd2d5a921deb53c312418379a1(result) +476aecd4d4d712cda1879feba46ea109 --> 150204cd2d5a921deb53c312418379a1 +end +subgraph 2a08ff341f159c170b7fe017eaad2f18[git_repo_to_alice_git_repo] +style 2a08ff341f159c170b7fe017eaad2f18 fill:#fff4de,stroke:#cece71 +7f74112f6d30c6289caa0a000e87edab[git_repo_to_alice_git_repo] +e58180baf478fe910359358a3fa02234(repo) +e58180baf478fe910359358a3fa02234 --> 7f74112f6d30c6289caa0a000e87edab +9b92d5a346885079a2821c4d27cb5174(result) +7f74112f6d30c6289caa0a000e87edab --> 9b92d5a346885079a2821c4d27cb5174 +end +subgraph 
b5d35aa8a8dcd28d22d47caad02676b0[guess_repo_string_is_url] +style b5d35aa8a8dcd28d22d47caad02676b0 fill:#fff4de,stroke:#cece71 +0de074e71a32e30889b8bb400cf8db9f[guess_repo_string_is_url] +c3bfe79b396a98ce2d9bfe772c9c20af(repo_string) +c3bfe79b396a98ce2d9bfe772c9c20af --> 0de074e71a32e30889b8bb400cf8db9f +2a1c620b0d510c3d8ed35deda41851c5(result) +0de074e71a32e30889b8bb400cf8db9f --> 2a1c620b0d510c3d8ed35deda41851c5 +end +subgraph 60791520c6d124c0bf15e599132b0caf[guessed_repo_string_is_operations_git_url] +style 60791520c6d124c0bf15e599132b0caf fill:#fff4de,stroke:#cece71 +102f173505d7b546236cdeff191369d4[guessed_repo_string_is_operations_git_url] +4934c6211334318c63a5e91530171c9b(repo_url) +4934c6211334318c63a5e91530171c9b --> 102f173505d7b546236cdeff191369d4 +8d0adc31da1a0919724baf73d047743c(result) +102f173505d7b546236cdeff191369d4 --> 8d0adc31da1a0919724baf73d047743c +end +subgraph f2c7b93622447999daab403713239ada[guessed_repo_string_means_no_git_branch_given] +style f2c7b93622447999daab403713239ada fill:#fff4de,stroke:#cece71 +c8294a87e7aae8f7f9cb7f53e054fed5[guessed_repo_string_means_no_git_branch_given] +5567dd8a6d7ae4fe86252db32e189a4d(repo_url) +5567dd8a6d7ae4fe86252db32e189a4d --> c8294a87e7aae8f7f9cb7f53e054fed5 +d888e6b64b5e3496056088f14dab9894(result) +c8294a87e7aae8f7f9cb7f53e054fed5 --> d888e6b64b5e3496056088f14dab9894 +end +subgraph 113addf4beee5305fdc79d2363608f9d[github_owns_remote] +style 113addf4beee5305fdc79d2363608f9d fill:#fff4de,stroke:#cece71 +049b72b81b976fbb43607bfeeb0464c5[github_owns_remote] +6c2b36393ffff6be0b4ad333df2d9419(remote) +6c2b36393ffff6be0b4ad333df2d9419 --> 049b72b81b976fbb43607bfeeb0464c5 +19a9ee483c1743e6ecf0a2dc3b6f8c7a(repo) +19a9ee483c1743e6ecf0a2dc3b6f8c7a --> 049b72b81b976fbb43607bfeeb0464c5 +b4cff8d194413f436d94f9d84ece0262(result) +049b72b81b976fbb43607bfeeb0464c5 --> b4cff8d194413f436d94f9d84ece0262 +end +subgraph 43a22312a3d4f5c995c54c5196acc50a[create_meta_issue] +style 43a22312a3d4f5c995c54c5196acc50a 
fill:#fff4de,stroke:#cece71 +d2345f23e5ef9f54c591c4a687c24575[create_meta_issue] +1d79010ee1550f057c531130814c40b9(body) +1d79010ee1550f057c531130814c40b9 --> d2345f23e5ef9f54c591c4a687c24575 +712d4318e59bd2dc629f0ddebb257ca3(repo) +712d4318e59bd2dc629f0ddebb257ca3 --> d2345f23e5ef9f54c591c4a687c24575 +38a94f1c2162803f571489d707d61021(title) +38a94f1c2162803f571489d707d61021 --> d2345f23e5ef9f54c591c4a687c24575 +2b22b4998ac3e6a64d82e0147e71ee1b(result) +d2345f23e5ef9f54c591c4a687c24575 --> 2b22b4998ac3e6a64d82e0147e71ee1b +end +subgraph f77af509c413b86b6cd7e107cc623c73[meta_issue_body] +style f77af509c413b86b6cd7e107cc623c73 fill:#fff4de,stroke:#cece71 +69a9852570720a3d35cb9dd52a281f71[meta_issue_body] +480d1cc478d23858e92d61225349b674(base) +480d1cc478d23858e92d61225349b674 --> 69a9852570720a3d35cb9dd52a281f71 +37035ea5a06a282bdc1e1de24090a36d(readme_issue) +37035ea5a06a282bdc1e1de24090a36d --> 69a9852570720a3d35cb9dd52a281f71 +fdf0dbb8ca47ee9022b3daeb8c7df9c0(readme_path) +fdf0dbb8ca47ee9022b3daeb8c7df9c0 --> 69a9852570720a3d35cb9dd52a281f71 +428ca84f627c695362652cc7531fc27b(repo) +428ca84f627c695362652cc7531fc27b --> 69a9852570720a3d35cb9dd52a281f71 +0cd9eb1ffb3c56d2b0a4359f800b1f20(result) +69a9852570720a3d35cb9dd52a281f71 --> 0cd9eb1ffb3c56d2b0a4359f800b1f20 +end +subgraph 8506cba6514466fb6d65f33ace4b0eac[alice_contribute_readme] +style 8506cba6514466fb6d65f33ace4b0eac fill:#fff4de,stroke:#cece71 +d4507d3d1c3fbf3e7e373eae24797667[alice_contribute_readme] +68cf7d6869d027ca46a5fb4dbf7001d1(repo) +68cf7d6869d027ca46a5fb4dbf7001d1 --> d4507d3d1c3fbf3e7e373eae24797667 +2f9316539862f119f7c525bf9061e974(result) +d4507d3d1c3fbf3e7e373eae24797667 --> 2f9316539862f119f7c525bf9061e974 +end +subgraph 4233e6dc67cba131d4ef005af9c02959[contribute_readme_md] +style 4233e6dc67cba131d4ef005af9c02959 fill:#fff4de,stroke:#cece71 +3db0ee5d6ab83886bded5afd86f3f88f[contribute_readme_md] +37044e4d8610abe13849bc71a5cb7591(base) +37044e4d8610abe13849bc71a5cb7591 --> 
3db0ee5d6ab83886bded5afd86f3f88f +631c051fe6050ae8f8fc3321ed00802d(commit_message) +631c051fe6050ae8f8fc3321ed00802d --> 3db0ee5d6ab83886bded5afd86f3f88f +182194bab776fc9bc406ed573d621b68(repo) +182194bab776fc9bc406ed573d621b68 --> 3db0ee5d6ab83886bded5afd86f3f88f +0ee9f524d2db12be854fe611fa8126dd(result) +3db0ee5d6ab83886bded5afd86f3f88f --> 0ee9f524d2db12be854fe611fa8126dd +end +subgraph a6080d9c45eb5f806a47152a18bf7830[create_readme_file_if_not_exists] +style a6080d9c45eb5f806a47152a18bf7830 fill:#fff4de,stroke:#cece71 +67e388f508dd96084c37d236a2c67e67[create_readme_file_if_not_exists] +54faf20bfdca0e63d07efb3e5a984cf1(readme_contents) +54faf20bfdca0e63d07efb3e5a984cf1 --> 67e388f508dd96084c37d236a2c67e67 +8c089c362960ccf181742334a3dccaea(repo) +8c089c362960ccf181742334a3dccaea --> 67e388f508dd96084c37d236a2c67e67 +5cc65e17d40e6a7223c1504f1c4b0d2a(result) +67e388f508dd96084c37d236a2c67e67 --> 5cc65e17d40e6a7223c1504f1c4b0d2a +end +subgraph e7757158127e9845b2915c16a7fa80c5[readme_commit_message] +style e7757158127e9845b2915c16a7fa80c5 fill:#fff4de,stroke:#cece71 +562bdc535c7cebfc66dba920b1a17540[readme_commit_message] +0af5cbea9050874a0a3cba73bb61f892(issue_url) +0af5cbea9050874a0a3cba73bb61f892 --> 562bdc535c7cebfc66dba920b1a17540 +2641f3b67327fb7518ee34a3a40b0755(result) +562bdc535c7cebfc66dba920b1a17540 --> 2641f3b67327fb7518ee34a3a40b0755 +end +subgraph cf99ff6fad80e9c21266b43fd67b2f7b[readme_issue] +style cf99ff6fad80e9c21266b43fd67b2f7b fill:#fff4de,stroke:#cece71 +da44417f891a945085590baafffc2bdb[readme_issue] +d519830ab4e07ec391038e8581889ac3(body) +d519830ab4e07ec391038e8581889ac3 --> da44417f891a945085590baafffc2bdb +268852aa3fa8ab0864a32abae5a333f7(repo) +268852aa3fa8ab0864a32abae5a333f7 --> da44417f891a945085590baafffc2bdb +77a11dd29af309cf43ed321446c4bf01(title) +77a11dd29af309cf43ed321446c4bf01 --> da44417f891a945085590baafffc2bdb +1d2360c9da18fac0b6ec142df8f3fbda(result) +da44417f891a945085590baafffc2bdb --> 1d2360c9da18fac0b6ec142df8f3fbda +end 
+subgraph 7ec0442cf2d95c367912e8abee09b217[readme_pr] +style 7ec0442cf2d95c367912e8abee09b217 fill:#fff4de,stroke:#cece71 +bb314dc452cde5b6af5ea94dd277ba40[readme_pr] +127d77c3047facc1daa621148c5a0a1d(base) +127d77c3047facc1daa621148c5a0a1d --> bb314dc452cde5b6af5ea94dd277ba40 +cb421e4de153cbb912f7fbe57e4ad734(body) +cb421e4de153cbb912f7fbe57e4ad734 --> bb314dc452cde5b6af5ea94dd277ba40 +cbf7a0b88c0a41953b245303f3e9a0d3(head) +cbf7a0b88c0a41953b245303f3e9a0d3 --> bb314dc452cde5b6af5ea94dd277ba40 +e5f9ad44448abd2469b3fd9831f3d159(origin) +e5f9ad44448abd2469b3fd9831f3d159 --> bb314dc452cde5b6af5ea94dd277ba40 +a35aee6711d240378eb57a3932537ca1(repo) +a35aee6711d240378eb57a3932537ca1 --> bb314dc452cde5b6af5ea94dd277ba40 +dfcce88a7d605d46bf17de1159fbe5ad(title) +dfcce88a7d605d46bf17de1159fbe5ad --> bb314dc452cde5b6af5ea94dd277ba40 +a210a7890a7bea8d629368e02da3d806(result) +bb314dc452cde5b6af5ea94dd277ba40 --> a210a7890a7bea8d629368e02da3d806 +end +subgraph 227eabb1f1c5cc0bc931714a03049e27[readme_pr_body] +style 227eabb1f1c5cc0bc931714a03049e27 fill:#fff4de,stroke:#cece71 +2aea976396cfe68dacd9bc7d4a3f0cba[readme_pr_body] +c5dfd309617c909b852afe0b4ae4a178(readme_issue) +c5dfd309617c909b852afe0b4ae4a178 --> 2aea976396cfe68dacd9bc7d4a3f0cba +40ddb5b508cb5643e7c91f7abdb72b84(result) +2aea976396cfe68dacd9bc7d4a3f0cba --> 40ddb5b508cb5643e7c91f7abdb72b84 +end +subgraph 48687c84e69b3db0acca625cbe2e6b49[readme_pr_title] +style 48687c84e69b3db0acca625cbe2e6b49 fill:#fff4de,stroke:#cece71 +d8668ff93f41bc241c8c540199cd7453[readme_pr_title] +3b2137dd1c61d0dac7d4e40fd6746cfb(readme_issue) +3b2137dd1c61d0dac7d4e40fd6746cfb --> d8668ff93f41bc241c8c540199cd7453 +956e024fde513b3a449eac9ee42d6ab3(result) +d8668ff93f41bc241c8c540199cd7453 --> 956e024fde513b3a449eac9ee42d6ab3 +end +subgraph d3ec0ac85209a7256c89d20f758f09f4[check_if_valid_git_repository_URL] +style d3ec0ac85209a7256c89d20f758f09f4 fill:#fff4de,stroke:#cece71 +f577c71443f6b04596b3fe0511326c40[check_if_valid_git_repository_URL] 
+7440e73a8e8f864097f42162b74f2762(URL) +7440e73a8e8f864097f42162b74f2762 --> f577c71443f6b04596b3fe0511326c40 +8e39b501b41c5d0e4596318f80a03210(valid) +f577c71443f6b04596b3fe0511326c40 --> 8e39b501b41c5d0e4596318f80a03210 +end +subgraph af8da22d1318d911f29b95e687f87c5d[clone_git_repo] +style af8da22d1318d911f29b95e687f87c5d fill:#fff4de,stroke:#cece71 +155b8fdb5524f6bfd5adbae4940ad8d5[clone_git_repo] +eed77b9eea541e0c378c67395351099c(URL) +eed77b9eea541e0c378c67395351099c --> 155b8fdb5524f6bfd5adbae4940ad8d5 +8b5928cd265dd2c44d67d076f60c8b05(ssh_key) +8b5928cd265dd2c44d67d076f60c8b05 --> 155b8fdb5524f6bfd5adbae4940ad8d5 +4e1d5ea96e050e46ebf95ebc0713d54c(repo) +155b8fdb5524f6bfd5adbae4940ad8d5 --> 4e1d5ea96e050e46ebf95ebc0713d54c +6a44de06a4a3518b939b27c790f6cdce{valid_git_repository_URL} +6a44de06a4a3518b939b27c790f6cdce --> 155b8fdb5524f6bfd5adbae4940ad8d5 +end +subgraph d3d91578caf34c0ae944b17853783406[git_repo_default_branch] +style d3d91578caf34c0ae944b17853783406 fill:#fff4de,stroke:#cece71 +546062a96122df465d2631f31df4e9e3[git_repo_default_branch] +181f1b33df4d795fbad2911ec7087e86(repo) +181f1b33df4d795fbad2911ec7087e86 --> 546062a96122df465d2631f31df4e9e3 +57651c1bcd24b794dfc8d1794ab556d5(branch) +546062a96122df465d2631f31df4e9e3 --> 57651c1bcd24b794dfc8d1794ab556d5 +5ed1ab77e726d7efdcc41e9e2f8039c6(remote) +546062a96122df465d2631f31df4e9e3 --> 5ed1ab77e726d7efdcc41e9e2f8039c6 +4c3cdd5f15b7a846d291aac089e8a622{no_git_branch_given} +4c3cdd5f15b7a846d291aac089e8a622 --> 546062a96122df465d2631f31df4e9e3 +end +end +subgraph a4827add25f5c7d5895c5728b74e2beb[Cleanup Stage] +style a4827add25f5c7d5895c5728b74e2beb fill:#afd388b5,stroke:#a4ca7a +end +subgraph 58ca4d24d2767176f196436c2890b926[Output Stage] +style 58ca4d24d2767176f196436c2890b926 fill:#afd388b5,stroke:#a4ca7a +end +subgraph inputs[Inputs] +style inputs fill:#f6dbf9,stroke:#a178ca +128516cfa09b0383023eab52ee24878a(seed
dffml.util.cli.CMD) +128516cfa09b0383023eab52ee24878a --> e07552ee3b6b7696cb3ddd786222eaad +ba29b52e9c5aa88ea1caeeff29bfd491 --> cee6b5fdd0b6fbd0539cdcdc7f5a3324 +128516cfa09b0383023eab52ee24878a(seed
dffml.util.cli.CMD) +128516cfa09b0383023eab52ee24878a --> 330f463830aa97e88917d5a9d1c21500 +128516cfa09b0383023eab52ee24878a(seed
dffml.util.cli.CMD) +128516cfa09b0383023eab52ee24878a --> dc7c5f0836f7d2564c402bf956722672 +ba29b52e9c5aa88ea1caeeff29bfd491 --> 58d8518cb0d6ef6ad35dc242486f1beb +79e1ea6822bff603a835fb8ee80c7ff3 --> e824ae072860bc545fc7d55aa0bca479 +135ee61e3402d6fcbd7a219b0b4ccd73 --> e824ae072860bc545fc7d55aa0bca479 +40109d487bb9f08608d8c5f6e747042f --> 33d806f9b732bfd6b96ae2e9e4243a68 +21ccfd2c550bd853d28581f0b0c9f9fe(seed
default.branch.name) +21ccfd2c550bd853d28581f0b0c9f9fe --> fdcb9b6113856222e30e093f7c38065e +dd5aab190ce844673819298c5b8fde76 --> bdcf4b078985f4a390e4ed4beacffa65 +9b92d5a346885079a2821c4d27cb5174 --> bdcf4b078985f4a390e4ed4beacffa65 +5a5493ab86ab4053f1d44302e7bdddd6 --> ff47cf65b58262acec28507f4427de45 +57651c1bcd24b794dfc8d1794ab556d5 --> ff47cf65b58262acec28507f4427de45 +4e1d5ea96e050e46ebf95ebc0713d54c --> e58180baf478fe910359358a3fa02234 +40109d487bb9f08608d8c5f6e747042f --> c3bfe79b396a98ce2d9bfe772c9c20af +2a1c620b0d510c3d8ed35deda41851c5 --> 4934c6211334318c63a5e91530171c9b +2a1c620b0d510c3d8ed35deda41851c5 --> 5567dd8a6d7ae4fe86252db32e189a4d +5ed1ab77e726d7efdcc41e9e2f8039c6 --> 6c2b36393ffff6be0b4ad333df2d9419 +dd5aab190ce844673819298c5b8fde76 --> 19a9ee483c1743e6ecf0a2dc3b6f8c7a +9b92d5a346885079a2821c4d27cb5174 --> 19a9ee483c1743e6ecf0a2dc3b6f8c7a +0cd9eb1ffb3c56d2b0a4359f800b1f20 --> 1d79010ee1550f057c531130814c40b9 +dd5aab190ce844673819298c5b8fde76 --> 712d4318e59bd2dc629f0ddebb257ca3 +9b92d5a346885079a2821c4d27cb5174 --> 712d4318e59bd2dc629f0ddebb257ca3 +e7ad3469d98c3bd160363dbc47e2d741(seed
MetaIssueTitle) +e7ad3469d98c3bd160363dbc47e2d741 --> 38a94f1c2162803f571489d707d61021 +150204cd2d5a921deb53c312418379a1 --> 480d1cc478d23858e92d61225349b674 +1d2360c9da18fac0b6ec142df8f3fbda --> 37035ea5a06a282bdc1e1de24090a36d +5cc65e17d40e6a7223c1504f1c4b0d2a --> fdf0dbb8ca47ee9022b3daeb8c7df9c0 +dd5aab190ce844673819298c5b8fde76 --> 428ca84f627c695362652cc7531fc27b +9b92d5a346885079a2821c4d27cb5174 --> 428ca84f627c695362652cc7531fc27b +dd5aab190ce844673819298c5b8fde76 --> 68cf7d6869d027ca46a5fb4dbf7001d1 +9b92d5a346885079a2821c4d27cb5174 --> 68cf7d6869d027ca46a5fb4dbf7001d1 +150204cd2d5a921deb53c312418379a1 --> 37044e4d8610abe13849bc71a5cb7591 +2641f3b67327fb7518ee34a3a40b0755 --> 631c051fe6050ae8f8fc3321ed00802d +2f9316539862f119f7c525bf9061e974 --> 182194bab776fc9bc406ed573d621b68 +d2708225c1f4c95d613a2645a17a5bc0(seed
repo.directory.readme.contents) +d2708225c1f4c95d613a2645a17a5bc0 --> 54faf20bfdca0e63d07efb3e5a984cf1 +2f9316539862f119f7c525bf9061e974 --> 8c089c362960ccf181742334a3dccaea +1d2360c9da18fac0b6ec142df8f3fbda --> 0af5cbea9050874a0a3cba73bb61f892 +1daacccd02f8117e67ad3cb8686a732c(seed
ReadmeIssueBody) +1daacccd02f8117e67ad3cb8686a732c --> d519830ab4e07ec391038e8581889ac3 +2f9316539862f119f7c525bf9061e974 --> 268852aa3fa8ab0864a32abae5a333f7 +0c1ab2d4bda10e1083557833ae5c5da4(seed
ReadmeIssueTitle) +0c1ab2d4bda10e1083557833ae5c5da4 --> 77a11dd29af309cf43ed321446c4bf01 +150204cd2d5a921deb53c312418379a1 --> 127d77c3047facc1daa621148c5a0a1d +40ddb5b508cb5643e7c91f7abdb72b84 --> cb421e4de153cbb912f7fbe57e4ad734 +0ee9f524d2db12be854fe611fa8126dd --> cbf7a0b88c0a41953b245303f3e9a0d3 +b4cff8d194413f436d94f9d84ece0262 --> e5f9ad44448abd2469b3fd9831f3d159 +2f9316539862f119f7c525bf9061e974 --> a35aee6711d240378eb57a3932537ca1 +956e024fde513b3a449eac9ee42d6ab3 --> dfcce88a7d605d46bf17de1159fbe5ad +1d2360c9da18fac0b6ec142df8f3fbda --> c5dfd309617c909b852afe0b4ae4a178 +1d2360c9da18fac0b6ec142df8f3fbda --> 3b2137dd1c61d0dac7d4e40fd6746cfb +8d0adc31da1a0919724baf73d047743c --> 7440e73a8e8f864097f42162b74f2762 +8d0adc31da1a0919724baf73d047743c --> eed77b9eea541e0c378c67395351099c +a6ed501edbf561fda49a0a0a3ca310f0(seed
git_repo_ssh_key) +a6ed501edbf561fda49a0a0a3ca310f0 --> 8b5928cd265dd2c44d67d076f60c8b05 +8e39b501b41c5d0e4596318f80a03210 --> 6a44de06a4a3518b939b27c790f6cdce +4e1d5ea96e050e46ebf95ebc0713d54c --> 181f1b33df4d795fbad2911ec7087e86 +end +``` + +- As of f8619a6362251d04929f4bfa395882b3257a3776 it works without meta issue + creation: https://github.com/pdxjohnny/testaaaa/pull/193 + +# 45 + +```console +$ gif-for-cli --rows $(tput lines) --cols $(tput cols) --export=/mnt/c/Users/Johnny/Downloads/alice-search-alices-adventures-in-wonderland-1.gif "Alice's Adventures in Wonderland" +``` + +```console +$ watch -n 0.2 'grep FEEDFACE .output/$(ls .output/ | tail -n 1) | sed -e "s/alice.please.contribute.recommended_community_standards.recommended_community_standards.//g" | grep -i repo' +``` \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0010/index.md b/docs/discussions/alice_engineering_comms/0010/index.md new file mode 100644 index 0000000000..56de4644c6 --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0010/index.md @@ -0,0 +1,16 @@ +# 2022-07-29 Engineering Logs + +- Alice PR: https://github.com/intel/dffml/pull/1401 +- John's last day before sabbatical + - He will be in town but offline until 2022-08-29 +- Rolling Alice: 2022 Progress Reports: July Activities Recap: https://youtu.be/JDh2DARl8os +- Alice is ready for contribution + - https://github.com/intel/dffml/blob/alice/docs/tutorials/rolling_alice/0001_coach_alice/0002_our_open_source_guide.md + - https://github.com/intel/dffml/blob/alice/entities/alice/CONTRIBUTING.rst + - Self fulfilling prophecy again! We can even automate our contributions to her even if we wanted to! She will eventually! :P +- IETF + - Joined SCITT WG, will rejoin in September, others please do as well! +- OpenSSF + - Aligned with Identifying Security Threats WG on SCITT looking like a solid direction to cross with Stream 8 for 1-2 year timeframe as Web5 space matures. 
+- Graphics to help people get involved + - https://drive.google.com/drive/folders/1E8tZT15DNjd13jVR6xqsblgLvwTZZo_f \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0010/reply_0000.md b/docs/discussions/alice_engineering_comms/0010/reply_0000.md new file mode 100644 index 0000000000..e79e2578ae --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0010/reply_0000.md @@ -0,0 +1,994 @@ +## 2022-07-29 @pdxjohnny Engineering Logs + +- AppSec PNW 2022 Talk playlist: https://youtube.com/playlist?list=PLfoJYLR9vr_IAd1vYWdKCOO4YYpGFVv99 + - John^2: Living Threat Models are Better Than Dead Threat Models + - Not yet uploaded but has Alice's first live demo +- https://towardsdatascience.com/installing-multiple-alternative-versions-of-python-on-ubuntu-20-04-237be5177474 + - `$ sudo update-alternatives --install /usr/bin/python python /usr/bin/python3 40` +- References + - https://tenor.com/search/alice-gifs + - https://tenor.com/view/why-thank-you-thanks-bow-thank-you-alice-in-wonderland-gif-3553903 + - Alice curtsy + - https://tenor.com/view/alice-in-wonderland-gif-26127117 + - Alice blows out unbirthday cake candle + +```console +$ alice; sleep 3; gif-for-cli -l 0 --rows $(tput lines) --cols $(tput cols) 3553903 +``` + +```console +$ gif-for-cli --rows `tput lines` --cols `tput cols` --export=alice-search-alices-adventures-in-wonderland-1.gif "Alice curtsy" +(why-thank-you-thanks-bow-thank-you-alice-in-wonderland-gif-3553903) +$ gif-for-cli --rows `tput lines` --cols `tput cols` --export=ascii-gif-alice-unbirthday-blow-out-candles-0.gif 26127117 +$ gif-for-cli --rows `tput lines` --cols `tput cols` ascii-gif-alice-unbirthday-blow-out-candles-0.gif +$ echo gif-for-cli --rows `tput lines` --cols `tput cols` +gif-for-cli --rows 97 --cols 320 +$ gif-for-cli -l 0 --rows `tput lines` --cols `tput cols` /mnt/c/Users/Johnny/Downloads/ascii-alices-adventures-in-wonderland-1.gif` +``` + +### Exploring a Helper Around Run DataFlow run_custom + 
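The diff explored in this section marks `AliceGitRepo` as needing a read/write lock by aliasing `LockReadWrite = Union` and detecting the marker with `typing.get_origin()`. As a standalone sketch of the same idea: since a single-argument `Union[X]` collapses to `X` at runtime, `typing.Annotated` is a more robust way to attach a runtime-detectable marker to a definition's type. This is a hedged illustration of the pattern, not DFFML's API; the `LockReadWrite` class and `needs_lock` helper here are hypothetical names.

```python
from typing import Annotated, NamedTuple, get_args, get_origin


class LockReadWrite:
    """Hypothetical marker carried in the annotation metadata."""


class AliceGitRepoSpec(NamedTuple):
    directory: str
    URL: str


# Annotated keeps the underlying type for type checkers while exposing
# the marker object at runtime via get_origin()/get_args().
AliceGitRepo = Annotated[AliceGitRepoSpec, LockReadWrite]


def needs_lock(annotation) -> bool:
    """True when a parameter annotation carries the LockReadWrite marker."""
    return (
        get_origin(annotation) is Annotated
        and LockReadWrite in get_args(annotation)
    )


print(needs_lock(AliceGitRepo))      # True
print(needs_lock(AliceGitRepoSpec))  # False
```

A definition-creation helper like `create_definition()` could call `needs_lock()` on each parameter annotation and set `lock=True` on the resulting `Definition`, which is what the `DEFAULT_DEFINTION_ANNOTATIONS_HANDLERS` dispatch in the diff below is reaching for.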
+- Realized we already have the lock because it's on `git_repository` at `flow_depth=1` + +```diff +diff --git a/dffml/df/base.py b/dffml/df/base.py +index 4f84c1c7c8..2da0512602 100644 +--- a/dffml/df/base.py ++++ b/dffml/df/base.py +@@ -404,14 +404,19 @@ def op( + ) + + definition_name = ".".join(name_list) ++ print("FEEDFACE", name, definition_name) + if hasattr(param_annotation, "__supertype__") and hasattr( + param_annotation, "__name__" + ): ++ if "repo" in definition_name: ++ breakpoint() + definition_name = param_annotation.__name__ ++ print("FEEDFACE", name, definition_name) + if inspect.isclass(param_annotation) and hasattr( + param_annotation, "__qualname__" + ): + definition_name = param_annotation.__qualname__ ++ print("FEEDFACE", name, definition_name) + + if isinstance(param_annotation, Definition): + kwargs["inputs"][name] = param_annotation +diff --git a/dffml/df/types.py b/dffml/df/types.py +index f09a8a3cea..54840f58c0 100644 +--- a/dffml/df/types.py ++++ b/dffml/df/types.py +@@ -44,6 +44,7 @@ APPLY_INSTALLED_OVERLAYS = _APPLY_INSTALLED_OVERLAYS() + + + Expand = Union ++LockReadWrite = Union + + + primitive_types = (int, float, str, bool, dict, list, bytes) +@@ -65,7 +66,7 @@ def find_primitive(new_type: Type) -> Type: + ) + + +-def new_type_to_defininition(new_type: Type) -> Type: ++def new_type_to_defininition(new_type: Type, lock: bool = False) -> Type: + """ + >>> from typing import NewType + >>> from dffml import new_type_to_defininition +@@ -77,6 +78,7 @@ def new_type_to_defininition(new_type: Type) -> Type: + return Definition( + name=new_type.__name__, + primitive=find_primitive(new_type).__qualname__, ++ lock=lock, + links=( + create_definition( + find_primitive(new_type).__qualname__, new_type.__supertype__ +@@ -95,7 +97,28 @@ class CouldNotDeterminePrimitive(Exception): + """ + + +-def resolve_if_forward_ref(param_annotation, forward_refs_from_cls): ++DEFAULT_DEFINTION_ANNOTATIONS_HANDLERS = { ++ LockReadWrite: lambda definition: 
setattr(definition, "lock", True), ++} ++ ++ ++def resolve_if_forward_ref( ++ param_annotation, ++ forward_refs_from_cls, ++ *, ++ defintion_annotations_handlers=None, ++) -> Tuple[Union["Definition", Any], bool]: ++ """ ++ Return values: ++ ++ param_or_definition: Union[Definition, Any] ++ lock: bool ++ ++ If the definition should be locked or not. ++ """ ++ if defintion_annotations_handlers is None: ++ defintion_annotations_handlers = DEFAULT_DEFINTION_ANNOTATIONS_HANDLERS ++ annotations = {} + if isinstance(param_annotation, ForwardRef): + param_annotation = param_annotation.__forward_arg__ + if ( +@@ -104,11 +127,22 @@ def resolve_if_forward_ref(param_annotation, forward_refs_from_cls): + and hasattr(forward_refs_from_cls, param_annotation) + ): + param_annotation = getattr(forward_refs_from_cls, param_annotation) ++ # Check if are in an annotation ++ param_annotation_origin = get_origin(param_annotation) ++ if param_annotation_origin in defintion_annotations_handlers: ++ annotations[ ++ param_annotation_origin ++ ] = defintion_annotations_handlers[param_annotation_origin] ++ param_annotation = list(get_args(param_annotation))[0] ++ # Create definition + if hasattr(param_annotation, "__name__") and hasattr( + param_annotation, "__supertype__" + ): + # typing.NewType support +- return new_type_to_defininition(param_annotation) ++ definition = new_type_to_defininition(param_annotation) ++ for handler in annotations.values(): ++ handler(definition) ++ return definition + return param_annotation + + +@@ -118,6 +152,7 @@ def _create_definition( + default=NO_DEFAULT, + *, + forward_refs_from_cls: Optional[object] = None, ++ lock: bool = False, + ): + param_annotation = resolve_if_forward_ref( + param_annotation, forward_refs_from_cls +@@ -138,12 +173,14 @@ def _create_definition( + elif get_origin(param_annotation) in [ + Union, + collections.abc.AsyncIterator, ++ LockReadWrite, + ]: + # If the annotation is of the form Optional + return create_definition( + name, + 
list(get_args(param_annotation))[0], + forward_refs_from_cls=forward_refs_from_cls, ++ lock=bool(get_origin(param_annotation) in (LockReadWrite,),), + ) + elif ( + get_origin(param_annotation) is list +@@ -235,6 +272,7 @@ def create_definition( + default=NO_DEFAULT, + *, + forward_refs_from_cls: Optional[object] = None, ++ lock: bool = False, + ): + if hasattr(param_annotation, "__name__") and hasattr( + param_annotation, "__supertype__" +@@ -246,6 +284,7 @@ def create_definition( + param_annotation, + default=default, + forward_refs_from_cls=forward_refs_from_cls, ++ lock=lock, + ) + # We can guess name if converting from NewType. However, we can't otherwise. + if not definition.name: +@@ -847,7 +886,9 @@ class DataFlow: + for operation in args: + name = getattr(getattr(operation, "op", operation), "name") + if name in operations: +- raise ValueError(f"Operation {name} given as positional and in dict") ++ raise ValueError( ++ f"Operation {name} given as positional and in dict" ++ ) + operations[name] = operation + + self.operations = operations +diff --git a/entities/alice/alice/please/contribute/recommended_community_standards/recommended_community_standards.py b/entities/alice/alice/please/contribute/recommended_community_standards/recommended_community_standards.py +index 825f949d65..0ff7e11c31 100644 +--- a/entities/alice/alice/please/contribute/recommended_community_standards/recommended_community_standards.py ++++ b/entities/alice/alice/please/contribute/recommended_community_standards/recommended_community_standards.py +@@ -8,18 +8,21 @@ import dffml + import dffml_feature_git.feature.definitions + + +-class AliceGitRepo(NamedTuple): ++class AliceGitRepoSpec(NamedTuple): + directory: str + URL: str + + ++AliceGitRepo = dffml.LockReadWrite[AliceGitRepoSpec] ++ ++ + class AliceGitRepoInputSetContextHandle(dffml.BaseContextHandle): + def as_string(self) -> str: + return str(self.ctx.repo) + + + class AliceGitRepoInputSetContext(dffml.BaseInputSetContext): +- 
def __init__(self, repo: AliceGitRepo): ++ def __init__(self, repo: AliceGitRepoSpec): + self.repo = repo + + async def handle(self) -> AliceGitRepoInputSetContextHandle: +``` + +- Is this the same as what we had in c89d3d8444cdad248fce5a7fff959c9ea48a7c9d ? + +```python + async def alice_contribute_readme(self, repo: AliceGitRepo) -> ReadmeGitRepo: + key, definition = list(self.parent.op.outputs.items())[0] + await self.octx.ictx.cadd( + AliceGitRepoInputSetContext(repo), + dffml.Input( + value=repo, + definition=definition, + parents=None, + origin=(self.parent.op.instance_name, key), + ) + ) +``` + +```diff +diff --git a/entities/alice/alice/please/contribute/recommended_community_standards/recommended_community_standards.py b/entities/alice/alice/please/contribute/recommended_community_standards/recommended_community_standards.py +index 825f949d65..1bc1c41e50 100644 +--- a/entities/alice/alice/please/contribute/recommended_community_standards/recommended_community_standards.py ++++ b/entities/alice/alice/please/contribute/recommended_community_standards/recommended_community_standards.py +@@ -203,30 +203,22 @@ class OverlayREADME: + ReadmePRBody = NewType("github.pr.body", str) + + # async def cli_run_on_repo(self, repo: "CLIRunOnRepo"): +- async def alice_contribute_readme(self, repo: AliceGitRepo) -> ReadmeGitRepo: +- # TODO Clean this up once SystemContext refactor complete +- readme_dataflow_cls_upstream = OverlayREADME +- readme_dataflow_cls_overlays = dffml.Overlay.load( +- entrypoint="dffml.overlays.alice.please.contribute.recommended_community_standards.overlay.readme" +- ) +- readme_dataflow_upstream = dffml.DataFlow( +- *dffml.object_to_operations(readme_dataflow_cls_upstream) +- ) ++ async def new_context(self, repo: AliceGitRepo) -> ReadmeGitRepo: ++ return + # auto_flow with overlays +- readme_dataflow = dffml.DataFlow( ++ dataflow = dffml.DataFlow( + *itertools.chain( + *[ + dffml.object_to_operations(cls) + for cls in [ +- 
readme_dataflow_cls_upstream, +- *readme_dataflow_cls_overlays, ++ upstream, ++ *overlays, + ] + ] + ) + ) + async with dffml.run_dataflow.imp( +- # dataflow=self.octx.config.dataflow, +- dataflow=readme_dataflow, ++ dataflow=dataflow, + input_set_context_cls=AliceGitRepoInputSetContext, + ) as custom_run_dataflow: + # Copy all inputs from parent context into child. We eventually +@@ -277,6 +269,18 @@ class OverlayREADME: + }, + ) + ++ async def alice_contribute_readme(self, repo: AliceGitRepo) -> ReadmeGitRepo: ++ key, definition = list(self.parent.op.outputs.items())[0] ++ await self.octx.ictx.cadd( ++ AliceGitRepoInputSetContext(repo), ++ dffml.Input( ++ value=repo, ++ definition=definition, ++ parents=None, ++ origin=(self.parent.op.instance_name, key), ++ ) ++ ) ++ + # TODO Run this system context where readme contexts is given on CLI or + # overriden via disabling of static overlay and application of overlay to + # generate contents dynamiclly. +``` + +- Visualize the flow before we attempt to add `CONTRIBUTING.md` contribution + +```console +$ dffml service dev export alice.cli:AlicePleaseContributeCLIDataFlow | tee alice.please.contribute.recommended_community_standards.json +$ (echo -e 'HTTP/1.0 200 OK\n' && dffml dataflow diagram -shortname alice.please.contribute.recommended_community_standards.json) | nc -Nlp 9999; +``` + +```mermaid +graph TD +subgraph a759a07029077edc5c37fea0326fa281[Processing Stage] +style a759a07029077edc5c37fea0326fa281 fill:#afd388b5,stroke:#a4ca7a +subgraph 8cfb8cd5b8620de4a7ebe0dfec00771a[cli_has_repos] +style 8cfb8cd5b8620de4a7ebe0dfec00771a fill:#fff4de,stroke:#cece71 +d493c90433d19f11f33c2d72cd144940[cli_has_repos] +e07552ee3b6b7696cb3ddd786222eaad(cmd) +e07552ee3b6b7696cb3ddd786222eaad --> d493c90433d19f11f33c2d72cd144940 +cee6b5fdd0b6fbd0539cdcdc7f5a3324(wanted) +cee6b5fdd0b6fbd0539cdcdc7f5a3324 --> d493c90433d19f11f33c2d72cd144940 +79e1ea6822bff603a835fb8ee80c7ff3(result) +d493c90433d19f11f33c2d72cd144940 --> 
79e1ea6822bff603a835fb8ee80c7ff3 +end +subgraph 0c2b64320fb5666a034794bb2195ecf0[cli_is_asking_for_recommended_community_standards] +style 0c2b64320fb5666a034794bb2195ecf0 fill:#fff4de,stroke:#cece71 +222ee6c0209f1f1b7a782bc1276868c7[cli_is_asking_for_recommended_community_standards] +330f463830aa97e88917d5a9d1c21500(cmd) +330f463830aa97e88917d5a9d1c21500 --> 222ee6c0209f1f1b7a782bc1276868c7 +ba29b52e9c5aa88ea1caeeff29bfd491(result) +222ee6c0209f1f1b7a782bc1276868c7 --> ba29b52e9c5aa88ea1caeeff29bfd491 +end +subgraph eac58e8db2b55cb9cc5474aaa402c93e[cli_is_meant_on_this_repo] +style eac58e8db2b55cb9cc5474aaa402c93e fill:#fff4de,stroke:#cece71 +6c819ad0228b0e7094b33e0634da9a38[cli_is_meant_on_this_repo] +dc7c5f0836f7d2564c402bf956722672(cmd) +dc7c5f0836f7d2564c402bf956722672 --> 6c819ad0228b0e7094b33e0634da9a38 +58d8518cb0d6ef6ad35dc242486f1beb(wanted) +58d8518cb0d6ef6ad35dc242486f1beb --> 6c819ad0228b0e7094b33e0634da9a38 +135ee61e3402d6fcbd7a219b0b4ccd73(result) +6c819ad0228b0e7094b33e0634da9a38 --> 135ee61e3402d6fcbd7a219b0b4ccd73 +end +subgraph 37887bf260c5c8e9bd18038401008bbc[cli_run_on_repo] +style 37887bf260c5c8e9bd18038401008bbc fill:#fff4de,stroke:#cece71 +9d1042f33352800e54d98c9c5a4223df[cli_run_on_repo] +e824ae072860bc545fc7d55aa0bca479(repo) +e824ae072860bc545fc7d55aa0bca479 --> 9d1042f33352800e54d98c9c5a4223df +40109d487bb9f08608d8c5f6e747042f(result) +9d1042f33352800e54d98c9c5a4223df --> 40109d487bb9f08608d8c5f6e747042f +end +subgraph 66ecd0c1f2e08941c443ec9cd89ec589[guess_repo_string_is_directory] +style 66ecd0c1f2e08941c443ec9cd89ec589 fill:#fff4de,stroke:#cece71 +737d719a0c348ff65456024ddbc530fe[guess_repo_string_is_directory] +33d806f9b732bfd6b96ae2e9e4243a68(repo_string) +33d806f9b732bfd6b96ae2e9e4243a68 --> 737d719a0c348ff65456024ddbc530fe +dd5aab190ce844673819298c5b8fde76(result) +737d719a0c348ff65456024ddbc530fe --> dd5aab190ce844673819298c5b8fde76 +end +subgraph 2bcd191634373f4b97ecb9546df23ee5[alice_contribute_contributing] +style 
2bcd191634373f4b97ecb9546df23ee5 fill:#fff4de,stroke:#cece71 +a2541ce40b2e5453e8e919021011e5e4[alice_contribute_contributing] +3786b4af914402320d260d077844620e(repo) +3786b4af914402320d260d077844620e --> a2541ce40b2e5453e8e919021011e5e4 +da4270ecc44b6d9eed9809a560d24a28(result) +a2541ce40b2e5453e8e919021011e5e4 --> da4270ecc44b6d9eed9809a560d24a28 +end +subgraph 13b430e6b93de7e40957165687f8e593[contribute_contributing_md] +style 13b430e6b93de7e40957165687f8e593 fill:#fff4de,stroke:#cece71 +ff8f8968322872ccc3cf151d167e22a2[contribute_contributing_md] +4f752ce18209f62ed749e88dd1f70266(base) +4f752ce18209f62ed749e88dd1f70266 --> ff8f8968322872ccc3cf151d167e22a2 +2def8c6923c832adf33989b26c91295a(commit_message) +2def8c6923c832adf33989b26c91295a --> ff8f8968322872ccc3cf151d167e22a2 +f5548fcbcec8745ddf04104fc78e83a3(repo) +f5548fcbcec8745ddf04104fc78e83a3 --> ff8f8968322872ccc3cf151d167e22a2 +24292ae12efd27a227a0d6368ba01faa(result) +ff8f8968322872ccc3cf151d167e22a2 --> 24292ae12efd27a227a0d6368ba01faa +end +subgraph 71a5f33f393735fa1cc91419b43db115[contributing_commit_message] +style 71a5f33f393735fa1cc91419b43db115 fill:#fff4de,stroke:#cece71 +d034a42488583464e601bcaee619a539[contributing_commit_message] +c0a0fa68a872adf890ed639e07ed5882(issue_url) +c0a0fa68a872adf890ed639e07ed5882 --> d034a42488583464e601bcaee619a539 +ce14ca2191f2b1c13c605b240e797255(result) +d034a42488583464e601bcaee619a539 --> ce14ca2191f2b1c13c605b240e797255 +end +subgraph db8a1253cc59982323848f5e42c23c9d[contributing_issue] +style db8a1253cc59982323848f5e42c23c9d fill:#fff4de,stroke:#cece71 +c39bd2cc88723432048c434fdd337eab[contributing_issue] +821d21e8a69d1fa1757147e7e768f306(body) +821d21e8a69d1fa1757147e7e768f306 --> c39bd2cc88723432048c434fdd337eab +0581b90c76b0a4635a968682b060abff(repo) +0581b90c76b0a4635a968682b060abff --> c39bd2cc88723432048c434fdd337eab +809719538467f6d0bf18f7ae26f08d80(title) +809719538467f6d0bf18f7ae26f08d80 --> c39bd2cc88723432048c434fdd337eab 
+c9f2ea5a7f25b3ae9fbf5041be5fa071(result) +c39bd2cc88723432048c434fdd337eab --> c9f2ea5a7f25b3ae9fbf5041be5fa071 +end +subgraph 1e6046d1a567bf390566b1b995df9dcf[contributing_pr] +style 1e6046d1a567bf390566b1b995df9dcf fill:#fff4de,stroke:#cece71 +4ec1433342f2f12ab8c59efab20e7b06[contributing_pr] +bb85c3467b05192c99a3954968c7a612(base) +bb85c3467b05192c99a3954968c7a612 --> 4ec1433342f2f12ab8c59efab20e7b06 +77f6c1c6b7ee62881b49c289097dfbde(body) +77f6c1c6b7ee62881b49c289097dfbde --> 4ec1433342f2f12ab8c59efab20e7b06 +a0a2fabc65fe5601c7ea289124d04f70(head) +a0a2fabc65fe5601c7ea289124d04f70 --> 4ec1433342f2f12ab8c59efab20e7b06 +cf92708915b9f41cb490b991abd6c374(origin) +cf92708915b9f41cb490b991abd6c374 --> 4ec1433342f2f12ab8c59efab20e7b06 +210ae36c85f3597c248e0b32da7661ae(repo) +210ae36c85f3597c248e0b32da7661ae --> 4ec1433342f2f12ab8c59efab20e7b06 +1700dc637c25bd503077a2a1422142e2(title) +1700dc637c25bd503077a2a1422142e2 --> 4ec1433342f2f12ab8c59efab20e7b06 +806e8c455d2bb7ad68112d2a7e16eed6(result) +4ec1433342f2f12ab8c59efab20e7b06 --> 806e8c455d2bb7ad68112d2a7e16eed6 +end +subgraph 04c27c13241164ae88456c1377995897[contributing_pr_body] +style 04c27c13241164ae88456c1377995897 fill:#fff4de,stroke:#cece71 +a3cebe78451142664930d44ad4d7d181[contributing_pr_body] +6118470d0158ef1a220fe7c7232e1b63(contributing_issue) +6118470d0158ef1a220fe7c7232e1b63 --> a3cebe78451142664930d44ad4d7d181 +99a7dd1ae037153eef80e1dee51b9d2b(result) +a3cebe78451142664930d44ad4d7d181 --> 99a7dd1ae037153eef80e1dee51b9d2b +end +subgraph 0d4627f8d8564b6c4ba33c12dcb58fc1[contributing_pr_title] +style 0d4627f8d8564b6c4ba33c12dcb58fc1 fill:#fff4de,stroke:#cece71 +bfa172a9399604546048d60db0a36187[contributing_pr_title] +0fd26f9166ccca10c68e9aefa9c15767(contributing_issue) +0fd26f9166ccca10c68e9aefa9c15767 --> bfa172a9399604546048d60db0a36187 +77a2f9d4dfad5f520f1502e8ba70e47a(result) +bfa172a9399604546048d60db0a36187 --> 77a2f9d4dfad5f520f1502e8ba70e47a +end +subgraph 
c67b92ef6a2e025ca086bc2f89d9afbb[create_contributing_file_if_not_exists] +style c67b92ef6a2e025ca086bc2f89d9afbb fill:#fff4de,stroke:#cece71 +993a1fe069a02a45ba3579b1902b2a36[create_contributing_file_if_not_exists] +401c179bb30b24c2ca989c64d0b1cdc7(contributing_contents) +401c179bb30b24c2ca989c64d0b1cdc7 --> 993a1fe069a02a45ba3579b1902b2a36 +dde78f81b1bdfe02c0a2bf6e51f65cb4(repo) +dde78f81b1bdfe02c0a2bf6e51f65cb4 --> 993a1fe069a02a45ba3579b1902b2a36 +e5b8d158dc0ec476dbbd44549a981815(result) +993a1fe069a02a45ba3579b1902b2a36 --> e5b8d158dc0ec476dbbd44549a981815 +end +subgraph 4ea6696419c4a0862a4f63ea1f60c751[create_branch_if_none_exists] +style 4ea6696419c4a0862a4f63ea1f60c751 fill:#fff4de,stroke:#cece71 +502369b37882b300d6620d5b4020f5b2[create_branch_if_none_exists] +fdcb9b6113856222e30e093f7c38065e(name) +fdcb9b6113856222e30e093f7c38065e --> 502369b37882b300d6620d5b4020f5b2 +bdcf4b078985f4a390e4ed4beacffa65(repo) +bdcf4b078985f4a390e4ed4beacffa65 --> 502369b37882b300d6620d5b4020f5b2 +5a5493ab86ab4053f1d44302e7bdddd6(result) +502369b37882b300d6620d5b4020f5b2 --> 5a5493ab86ab4053f1d44302e7bdddd6 +end +subgraph b1d510183f6a4c3fde207a4656c72cb4[determin_base_branch] +style b1d510183f6a4c3fde207a4656c72cb4 fill:#fff4de,stroke:#cece71 +476aecd4d4d712cda1879feba46ea109[determin_base_branch] +ff47cf65b58262acec28507f4427de45(default_branch) +ff47cf65b58262acec28507f4427de45 --> 476aecd4d4d712cda1879feba46ea109 +150204cd2d5a921deb53c312418379a1(result) +476aecd4d4d712cda1879feba46ea109 --> 150204cd2d5a921deb53c312418379a1 +end +subgraph 2a08ff341f159c170b7fe017eaad2f18[git_repo_to_alice_git_repo] +style 2a08ff341f159c170b7fe017eaad2f18 fill:#fff4de,stroke:#cece71 +7f74112f6d30c6289caa0a000e87edab[git_repo_to_alice_git_repo] +e58180baf478fe910359358a3fa02234(repo) +e58180baf478fe910359358a3fa02234 --> 7f74112f6d30c6289caa0a000e87edab +9b92d5a346885079a2821c4d27cb5174(result) +7f74112f6d30c6289caa0a000e87edab --> 9b92d5a346885079a2821c4d27cb5174 +end +subgraph 
b5d35aa8a8dcd28d22d47caad02676b0[guess_repo_string_is_url] +style b5d35aa8a8dcd28d22d47caad02676b0 fill:#fff4de,stroke:#cece71 +0de074e71a32e30889b8bb400cf8db9f[guess_repo_string_is_url] +c3bfe79b396a98ce2d9bfe772c9c20af(repo_string) +c3bfe79b396a98ce2d9bfe772c9c20af --> 0de074e71a32e30889b8bb400cf8db9f +2a1c620b0d510c3d8ed35deda41851c5(result) +0de074e71a32e30889b8bb400cf8db9f --> 2a1c620b0d510c3d8ed35deda41851c5 +end +subgraph 60791520c6d124c0bf15e599132b0caf[guessed_repo_string_is_operations_git_url] +style 60791520c6d124c0bf15e599132b0caf fill:#fff4de,stroke:#cece71 +102f173505d7b546236cdeff191369d4[guessed_repo_string_is_operations_git_url] +4934c6211334318c63a5e91530171c9b(repo_url) +4934c6211334318c63a5e91530171c9b --> 102f173505d7b546236cdeff191369d4 +8d0adc31da1a0919724baf73d047743c(result) +102f173505d7b546236cdeff191369d4 --> 8d0adc31da1a0919724baf73d047743c +end +subgraph f2c7b93622447999daab403713239ada[guessed_repo_string_means_no_git_branch_given] +style f2c7b93622447999daab403713239ada fill:#fff4de,stroke:#cece71 +c8294a87e7aae8f7f9cb7f53e054fed5[guessed_repo_string_means_no_git_branch_given] +5567dd8a6d7ae4fe86252db32e189a4d(repo_url) +5567dd8a6d7ae4fe86252db32e189a4d --> c8294a87e7aae8f7f9cb7f53e054fed5 +d888e6b64b5e3496056088f14dab9894(result) +c8294a87e7aae8f7f9cb7f53e054fed5 --> d888e6b64b5e3496056088f14dab9894 +end +subgraph 113addf4beee5305fdc79d2363608f9d[github_owns_remote] +style 113addf4beee5305fdc79d2363608f9d fill:#fff4de,stroke:#cece71 +049b72b81b976fbb43607bfeeb0464c5[github_owns_remote] +6c2b36393ffff6be0b4ad333df2d9419(remote) +6c2b36393ffff6be0b4ad333df2d9419 --> 049b72b81b976fbb43607bfeeb0464c5 +19a9ee483c1743e6ecf0a2dc3b6f8c7a(repo) +19a9ee483c1743e6ecf0a2dc3b6f8c7a --> 049b72b81b976fbb43607bfeeb0464c5 +b4cff8d194413f436d94f9d84ece0262(result) +049b72b81b976fbb43607bfeeb0464c5 --> b4cff8d194413f436d94f9d84ece0262 +end +subgraph 8506cba6514466fb6d65f33ace4b0eac[alice_contribute_readme] +style 8506cba6514466fb6d65f33ace4b0eac 
fill:#fff4de,stroke:#cece71 +d4507d3d1c3fbf3e7e373eae24797667[alice_contribute_readme] +68cf7d6869d027ca46a5fb4dbf7001d1(repo) +68cf7d6869d027ca46a5fb4dbf7001d1 --> d4507d3d1c3fbf3e7e373eae24797667 +2f9316539862f119f7c525bf9061e974(result) +d4507d3d1c3fbf3e7e373eae24797667 --> 2f9316539862f119f7c525bf9061e974 +end +subgraph 4233e6dc67cba131d4ef005af9c02959[contribute_readme_md] +style 4233e6dc67cba131d4ef005af9c02959 fill:#fff4de,stroke:#cece71 +3db0ee5d6ab83886bded5afd86f3f88f[contribute_readme_md] +37044e4d8610abe13849bc71a5cb7591(base) +37044e4d8610abe13849bc71a5cb7591 --> 3db0ee5d6ab83886bded5afd86f3f88f +631c051fe6050ae8f8fc3321ed00802d(commit_message) +631c051fe6050ae8f8fc3321ed00802d --> 3db0ee5d6ab83886bded5afd86f3f88f +182194bab776fc9bc406ed573d621b68(repo) +182194bab776fc9bc406ed573d621b68 --> 3db0ee5d6ab83886bded5afd86f3f88f +0ee9f524d2db12be854fe611fa8126dd(result) +3db0ee5d6ab83886bded5afd86f3f88f --> 0ee9f524d2db12be854fe611fa8126dd +end +subgraph a6080d9c45eb5f806a47152a18bf7830[create_readme_file_if_not_exists] +style a6080d9c45eb5f806a47152a18bf7830 fill:#fff4de,stroke:#cece71 +67e388f508dd96084c37d236a2c67e67[create_readme_file_if_not_exists] +54faf20bfdca0e63d07efb3e5a984cf1(readme_contents) +54faf20bfdca0e63d07efb3e5a984cf1 --> 67e388f508dd96084c37d236a2c67e67 +8c089c362960ccf181742334a3dccaea(repo) +8c089c362960ccf181742334a3dccaea --> 67e388f508dd96084c37d236a2c67e67 +5cc65e17d40e6a7223c1504f1c4b0d2a(result) +67e388f508dd96084c37d236a2c67e67 --> 5cc65e17d40e6a7223c1504f1c4b0d2a +end +subgraph e7757158127e9845b2915c16a7fa80c5[readme_commit_message] +style e7757158127e9845b2915c16a7fa80c5 fill:#fff4de,stroke:#cece71 +562bdc535c7cebfc66dba920b1a17540[readme_commit_message] +0af5cbea9050874a0a3cba73bb61f892(issue_url) +0af5cbea9050874a0a3cba73bb61f892 --> 562bdc535c7cebfc66dba920b1a17540 +2641f3b67327fb7518ee34a3a40b0755(result) +562bdc535c7cebfc66dba920b1a17540 --> 2641f3b67327fb7518ee34a3a40b0755 +end +subgraph 
cf99ff6fad80e9c21266b43fd67b2f7b[readme_issue] +style cf99ff6fad80e9c21266b43fd67b2f7b fill:#fff4de,stroke:#cece71 +da44417f891a945085590baafffc2bdb[readme_issue] +d519830ab4e07ec391038e8581889ac3(body) +d519830ab4e07ec391038e8581889ac3 --> da44417f891a945085590baafffc2bdb +268852aa3fa8ab0864a32abae5a333f7(repo) +268852aa3fa8ab0864a32abae5a333f7 --> da44417f891a945085590baafffc2bdb +77a11dd29af309cf43ed321446c4bf01(title) +77a11dd29af309cf43ed321446c4bf01 --> da44417f891a945085590baafffc2bdb +1d2360c9da18fac0b6ec142df8f3fbda(result) +da44417f891a945085590baafffc2bdb --> 1d2360c9da18fac0b6ec142df8f3fbda +end +subgraph 7ec0442cf2d95c367912e8abee09b217[readme_pr] +style 7ec0442cf2d95c367912e8abee09b217 fill:#fff4de,stroke:#cece71 +bb314dc452cde5b6af5ea94dd277ba40[readme_pr] +127d77c3047facc1daa621148c5a0a1d(base) +127d77c3047facc1daa621148c5a0a1d --> bb314dc452cde5b6af5ea94dd277ba40 +cb421e4de153cbb912f7fbe57e4ad734(body) +cb421e4de153cbb912f7fbe57e4ad734 --> bb314dc452cde5b6af5ea94dd277ba40 +cbf7a0b88c0a41953b245303f3e9a0d3(head) +cbf7a0b88c0a41953b245303f3e9a0d3 --> bb314dc452cde5b6af5ea94dd277ba40 +e5f9ad44448abd2469b3fd9831f3d159(origin) +e5f9ad44448abd2469b3fd9831f3d159 --> bb314dc452cde5b6af5ea94dd277ba40 +a35aee6711d240378eb57a3932537ca1(repo) +a35aee6711d240378eb57a3932537ca1 --> bb314dc452cde5b6af5ea94dd277ba40 +dfcce88a7d605d46bf17de1159fbe5ad(title) +dfcce88a7d605d46bf17de1159fbe5ad --> bb314dc452cde5b6af5ea94dd277ba40 +a210a7890a7bea8d629368e02da3d806(result) +bb314dc452cde5b6af5ea94dd277ba40 --> a210a7890a7bea8d629368e02da3d806 +end +subgraph 227eabb1f1c5cc0bc931714a03049e27[readme_pr_body] +style 227eabb1f1c5cc0bc931714a03049e27 fill:#fff4de,stroke:#cece71 +2aea976396cfe68dacd9bc7d4a3f0cba[readme_pr_body] +c5dfd309617c909b852afe0b4ae4a178(readme_issue) +c5dfd309617c909b852afe0b4ae4a178 --> 2aea976396cfe68dacd9bc7d4a3f0cba +40ddb5b508cb5643e7c91f7abdb72b84(result) +2aea976396cfe68dacd9bc7d4a3f0cba --> 40ddb5b508cb5643e7c91f7abdb72b84 +end +subgraph 
48687c84e69b3db0acca625cbe2e6b49[readme_pr_title] +style 48687c84e69b3db0acca625cbe2e6b49 fill:#fff4de,stroke:#cece71 +d8668ff93f41bc241c8c540199cd7453[readme_pr_title] +3b2137dd1c61d0dac7d4e40fd6746cfb(readme_issue) +3b2137dd1c61d0dac7d4e40fd6746cfb --> d8668ff93f41bc241c8c540199cd7453 +956e024fde513b3a449eac9ee42d6ab3(result) +d8668ff93f41bc241c8c540199cd7453 --> 956e024fde513b3a449eac9ee42d6ab3 +end +subgraph d3ec0ac85209a7256c89d20f758f09f4[check_if_valid_git_repository_URL] +style d3ec0ac85209a7256c89d20f758f09f4 fill:#fff4de,stroke:#cece71 +f577c71443f6b04596b3fe0511326c40[check_if_valid_git_repository_URL] +7440e73a8e8f864097f42162b74f2762(URL) +7440e73a8e8f864097f42162b74f2762 --> f577c71443f6b04596b3fe0511326c40 +8e39b501b41c5d0e4596318f80a03210(valid) +f577c71443f6b04596b3fe0511326c40 --> 8e39b501b41c5d0e4596318f80a03210 +end +subgraph af8da22d1318d911f29b95e687f87c5d[clone_git_repo] +style af8da22d1318d911f29b95e687f87c5d fill:#fff4de,stroke:#cece71 +155b8fdb5524f6bfd5adbae4940ad8d5[clone_git_repo] +eed77b9eea541e0c378c67395351099c(URL) +eed77b9eea541e0c378c67395351099c --> 155b8fdb5524f6bfd5adbae4940ad8d5 +8b5928cd265dd2c44d67d076f60c8b05(ssh_key) +8b5928cd265dd2c44d67d076f60c8b05 --> 155b8fdb5524f6bfd5adbae4940ad8d5 +4e1d5ea96e050e46ebf95ebc0713d54c(repo) +155b8fdb5524f6bfd5adbae4940ad8d5 --> 4e1d5ea96e050e46ebf95ebc0713d54c +6a44de06a4a3518b939b27c790f6cdce{valid_git_repository_URL} +6a44de06a4a3518b939b27c790f6cdce --> 155b8fdb5524f6bfd5adbae4940ad8d5 +end +subgraph d3d91578caf34c0ae944b17853783406[git_repo_default_branch] +style d3d91578caf34c0ae944b17853783406 fill:#fff4de,stroke:#cece71 +546062a96122df465d2631f31df4e9e3[git_repo_default_branch] +181f1b33df4d795fbad2911ec7087e86(repo) +181f1b33df4d795fbad2911ec7087e86 --> 546062a96122df465d2631f31df4e9e3 +57651c1bcd24b794dfc8d1794ab556d5(branch) +546062a96122df465d2631f31df4e9e3 --> 57651c1bcd24b794dfc8d1794ab556d5 +5ed1ab77e726d7efdcc41e9e2f8039c6(remote) +546062a96122df465d2631f31df4e9e3 --> 
5ed1ab77e726d7efdcc41e9e2f8039c6 +4c3cdd5f15b7a846d291aac089e8a622{no_git_branch_given} +4c3cdd5f15b7a846d291aac089e8a622 --> 546062a96122df465d2631f31df4e9e3 +end +end +subgraph a4827add25f5c7d5895c5728b74e2beb[Cleanup Stage] +style a4827add25f5c7d5895c5728b74e2beb fill:#afd388b5,stroke:#a4ca7a +end +subgraph 58ca4d24d2767176f196436c2890b926[Output Stage] +style 58ca4d24d2767176f196436c2890b926 fill:#afd388b5,stroke:#a4ca7a +end +subgraph inputs[Inputs] +style inputs fill:#f6dbf9,stroke:#a178ca +128516cfa09b0383023eab52ee24878a(seed
dffml.util.cli.CMD) +128516cfa09b0383023eab52ee24878a --> e07552ee3b6b7696cb3ddd786222eaad +ba29b52e9c5aa88ea1caeeff29bfd491 --> cee6b5fdd0b6fbd0539cdcdc7f5a3324 +128516cfa09b0383023eab52ee24878a(seed
dffml.util.cli.CMD) +128516cfa09b0383023eab52ee24878a --> 330f463830aa97e88917d5a9d1c21500 +128516cfa09b0383023eab52ee24878a(seed
dffml.util.cli.CMD) +128516cfa09b0383023eab52ee24878a --> dc7c5f0836f7d2564c402bf956722672 +ba29b52e9c5aa88ea1caeeff29bfd491 --> 58d8518cb0d6ef6ad35dc242486f1beb +79e1ea6822bff603a835fb8ee80c7ff3 --> e824ae072860bc545fc7d55aa0bca479 +135ee61e3402d6fcbd7a219b0b4ccd73 --> e824ae072860bc545fc7d55aa0bca479 +40109d487bb9f08608d8c5f6e747042f --> 33d806f9b732bfd6b96ae2e9e4243a68 +dd5aab190ce844673819298c5b8fde76 --> 3786b4af914402320d260d077844620e +9b92d5a346885079a2821c4d27cb5174 --> 3786b4af914402320d260d077844620e +150204cd2d5a921deb53c312418379a1 --> 4f752ce18209f62ed749e88dd1f70266 +ce14ca2191f2b1c13c605b240e797255 --> 2def8c6923c832adf33989b26c91295a +da4270ecc44b6d9eed9809a560d24a28 --> f5548fcbcec8745ddf04104fc78e83a3 +c9f2ea5a7f25b3ae9fbf5041be5fa071 --> c0a0fa68a872adf890ed639e07ed5882 +c94383981c3a071b8c3df7293c8c7c92(seed
ContributingIssueBody) +c94383981c3a071b8c3df7293c8c7c92 --> 821d21e8a69d1fa1757147e7e768f306 +da4270ecc44b6d9eed9809a560d24a28 --> 0581b90c76b0a4635a968682b060abff +90c6a88275f27b28dc12f5741ac1652f(seed
ContributingIssueTitle) +90c6a88275f27b28dc12f5741ac1652f --> 809719538467f6d0bf18f7ae26f08d80 +150204cd2d5a921deb53c312418379a1 --> bb85c3467b05192c99a3954968c7a612 +99a7dd1ae037153eef80e1dee51b9d2b --> 77f6c1c6b7ee62881b49c289097dfbde +24292ae12efd27a227a0d6368ba01faa --> a0a2fabc65fe5601c7ea289124d04f70 +b4cff8d194413f436d94f9d84ece0262 --> cf92708915b9f41cb490b991abd6c374 +da4270ecc44b6d9eed9809a560d24a28 --> 210ae36c85f3597c248e0b32da7661ae +77a2f9d4dfad5f520f1502e8ba70e47a --> 1700dc637c25bd503077a2a1422142e2 +c9f2ea5a7f25b3ae9fbf5041be5fa071 --> 6118470d0158ef1a220fe7c7232e1b63 +c9f2ea5a7f25b3ae9fbf5041be5fa071 --> 0fd26f9166ccca10c68e9aefa9c15767 +90b3c16d6d8884aa6f70b475d98f661b(seed
repo.directory.contributing.contents) +90b3c16d6d8884aa6f70b475d98f661b --> 401c179bb30b24c2ca989c64d0b1cdc7 +da4270ecc44b6d9eed9809a560d24a28 --> dde78f81b1bdfe02c0a2bf6e51f65cb4 +21ccfd2c550bd853d28581f0b0c9f9fe(seed
default.branch.name) +21ccfd2c550bd853d28581f0b0c9f9fe --> fdcb9b6113856222e30e093f7c38065e +dd5aab190ce844673819298c5b8fde76 --> bdcf4b078985f4a390e4ed4beacffa65 +9b92d5a346885079a2821c4d27cb5174 --> bdcf4b078985f4a390e4ed4beacffa65 +5a5493ab86ab4053f1d44302e7bdddd6 --> ff47cf65b58262acec28507f4427de45 +57651c1bcd24b794dfc8d1794ab556d5 --> ff47cf65b58262acec28507f4427de45 +4e1d5ea96e050e46ebf95ebc0713d54c --> e58180baf478fe910359358a3fa02234 +40109d487bb9f08608d8c5f6e747042f --> c3bfe79b396a98ce2d9bfe772c9c20af +2a1c620b0d510c3d8ed35deda41851c5 --> 4934c6211334318c63a5e91530171c9b +2a1c620b0d510c3d8ed35deda41851c5 --> 5567dd8a6d7ae4fe86252db32e189a4d +5ed1ab77e726d7efdcc41e9e2f8039c6 --> 6c2b36393ffff6be0b4ad333df2d9419 +dd5aab190ce844673819298c5b8fde76 --> 19a9ee483c1743e6ecf0a2dc3b6f8c7a +9b92d5a346885079a2821c4d27cb5174 --> 19a9ee483c1743e6ecf0a2dc3b6f8c7a +dd5aab190ce844673819298c5b8fde76 --> 68cf7d6869d027ca46a5fb4dbf7001d1 +9b92d5a346885079a2821c4d27cb5174 --> 68cf7d6869d027ca46a5fb4dbf7001d1 +150204cd2d5a921deb53c312418379a1 --> 37044e4d8610abe13849bc71a5cb7591 +2641f3b67327fb7518ee34a3a40b0755 --> 631c051fe6050ae8f8fc3321ed00802d +2f9316539862f119f7c525bf9061e974 --> 182194bab776fc9bc406ed573d621b68 +d2708225c1f4c95d613a2645a17a5bc0(seed
repo.directory.readme.contents) +d2708225c1f4c95d613a2645a17a5bc0 --> 54faf20bfdca0e63d07efb3e5a984cf1 +2f9316539862f119f7c525bf9061e974 --> 8c089c362960ccf181742334a3dccaea +1d2360c9da18fac0b6ec142df8f3fbda --> 0af5cbea9050874a0a3cba73bb61f892 +1daacccd02f8117e67ad3cb8686a732c(seed
ReadmeIssueBody) +1daacccd02f8117e67ad3cb8686a732c --> d519830ab4e07ec391038e8581889ac3 +2f9316539862f119f7c525bf9061e974 --> 268852aa3fa8ab0864a32abae5a333f7 +0c1ab2d4bda10e1083557833ae5c5da4(seed
ReadmeIssueTitle) +0c1ab2d4bda10e1083557833ae5c5da4 --> 77a11dd29af309cf43ed321446c4bf01 +150204cd2d5a921deb53c312418379a1 --> 127d77c3047facc1daa621148c5a0a1d +40ddb5b508cb5643e7c91f7abdb72b84 --> cb421e4de153cbb912f7fbe57e4ad734 +0ee9f524d2db12be854fe611fa8126dd --> cbf7a0b88c0a41953b245303f3e9a0d3 +b4cff8d194413f436d94f9d84ece0262 --> e5f9ad44448abd2469b3fd9831f3d159 +2f9316539862f119f7c525bf9061e974 --> a35aee6711d240378eb57a3932537ca1 +956e024fde513b3a449eac9ee42d6ab3 --> dfcce88a7d605d46bf17de1159fbe5ad +1d2360c9da18fac0b6ec142df8f3fbda --> c5dfd309617c909b852afe0b4ae4a178 +1d2360c9da18fac0b6ec142df8f3fbda --> 3b2137dd1c61d0dac7d4e40fd6746cfb +8d0adc31da1a0919724baf73d047743c --> 7440e73a8e8f864097f42162b74f2762 +8d0adc31da1a0919724baf73d047743c --> eed77b9eea541e0c378c67395351099c +a6ed501edbf561fda49a0a0a3ca310f0(seed
git_repo_ssh_key) +a6ed501edbf561fda49a0a0a3ca310f0 --> 8b5928cd265dd2c44d67d076f60c8b05 +8e39b501b41c5d0e4596318f80a03210 --> 6a44de06a4a3518b939b27c790f6cdce +4e1d5ea96e050e46ebf95ebc0713d54c --> 181f1b33df4d795fbad2911ec7087e86 +end +``` + +- Notes + - `create_*_if_not_exists` doesn't appear connected. +- Only either README or CONTRIBUTING is currently being added when + we run with our new CONTRIBUTING contribution flow overlayed. + +```console +$ for pr in $(gh -R https://github.com/pdxjohnny/testaaaa pr list --json number --jq '.[].number'); do gh -R https://github.com/pdxjohnny/testaaaa pr close "${pr}"; done +✓ Closed pull request #222 (Recommended Community Standard: README) +✓ Closed pull request #219 (Recommended Community Standard: CONTRIBUTING) +$ nodemon -e py --exec 'clear; for pr in $(gh -R https://github.com/pdxjohnny/testaaaa pr list --json number --jq '.[].number'); do gh -R https://github.com/pdxjohnny/testaaaa pr close "${pr}"; done; (alice please contribute -log debug -repos https://github.com/pdxjohnny/testaaaa -- recommended community standards; gh -R https://github.com/pdxjohnny/testaaaa pr list) 2>&1 | tee .output/$(date +%4Y-%m-%d-%H-%M).txt; test 1' +$ less -S .output/$(ls .output/ | tail -n 1) +``` + +### Refactor into README and CONTRIBUTING Overlays + +- Had the thought, aren't we just adding a new context here? 
+ +```diff +diff --git a/dffml/df/memory.py b/dffml/df/memory.py +index 59286d4927..87c75d637b 100644 +--- a/dffml/df/memory.py ++++ b/dffml/df/memory.py +@@ -377,6 +377,7 @@ class MemoryInputNetworkContext(BaseInputNetworkContext): + self.ctxhd[handle_string].by_origin[item.origin] = [] + # Add input to by origin set + self.ctxhd[handle_string].by_origin[item.origin].append(item) ++ self.logger.debug("Added to %s: %r", handle_string, item) + + async def uadd(self, *args: Input): + """ +diff --git a/entities/alice/alice/please/contribute/recommended_community_standards/recommended_community_standards.py b/entities/alice/alice/please/contribute/recommended_community_standards/recommended_community_standards.py +index 2873a1b193..cc4d374e57 100644 +--- a/entities/alice/alice/please/contribute/recommended_community_standards/recommended_community_standards.py ++++ b/entities/alice/alice/please/contribute/recommended_community_standards/recommended_community_standards.py +@@ -1,7 +1,8 @@ ++import asyncio + import pathlib + import textwrap + import itertools +-from typing import NamedTuple, NewType, Optional ++from typing import NamedTuple, NewType, Optional, Type, Any + + + import dffml +@@ -183,6 +184,34 @@ class OverlayGitHub: + return remote + + ++async def context_adder( ++ self, ++ upstream_cls: Type[Any], ++ input_set_context: dffml.BaseInputSetContext, ++ value: Any, ++): ++ upstream = dffml.DataFlow(*dffml.object_to_operations(upstream_cls)) ++ key, definition = list(self.parent.op.outputs.items())[0] ++ async with self.octx.ictx.definitions(self.ctx) as definitions: ++ await self.octx.ictx.cadd( ++ input_set_context, ++ dffml.Input( ++ value=value, ++ definition=definition, ++ parents=None, ++ origin=(self.parent.op.instance_name, key), ++ ), ++ *[ ++ item ++ async for item in definitions.inputs() ++ if ( ++ item.definition in upstream.definitions.values() ++ and item.definition not in self.parent.op.inputs.values() ++ ) ++ ], ++ ) ++ ++ + # NOTE Not sure if 
the orchestrator will know what to do if we do this + # ReadmeGitRepo = AliceGitRepo + class ReadmeGitRepo(NamedTuple): +@@ -204,6 +233,9 @@ class OverlayREADME: + + # async def cli_run_on_repo(self, repo: "CLIRunOnRepo"): + async def alice_contribute_readme(self, repo: AliceGitRepo) -> ReadmeGitRepo: ++ # await context_adder( ++ # self, OverlayREADME, AliceGitRepoInputSetContext(repo), repo ++ # ) + # TODO Clean this up once SystemContext refactor complete + readme_dataflow_cls_upstream = OverlayREADME + readme_dataflow_cls_overlays = dffml.Overlay.load( +``` + +``` +DEBUG:dffml.MemoryOperationImplementationNetworkContext:alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayGit:determin_base_branch Stage: PROCESSING: alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayGit:determin_base_branch +DEBUG:dffml.MemoryOperationImplementationNetworkContext:alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayGit:determin_base_branch Inputs: {'default_branch': 'master'} +DEBUG:dffml.MemoryOperationImplementationNetworkContext:alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayGit:determin_base_branch Conditions: {} +DEBUG:dffml.MemoryOperationImplementationNetworkContext:alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayGit:determin_base_branch Outputs: {'result': 'master'} +DEBUG:dffml.MemoryOperationImplementationNetworkContext:--- +DEBUG:dffml.MemoryInputNetworkContext:Added to https://github.com/pdxjohnny/testaaaa: Input(value=master, definition=repo.git.base.branch) +DEBUG:dffml.MemoryLockNetworkContext:Acquiring: 6fc55525-c499-421c-8b07-497dd277b1ff(GitRepoSpec(directory='/tmp/dffml-feature-git-rrflb9gm', URL='https://github.com/pdxjohnny/testaaaa')) (now held by 
Operation(name='alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayCONTRIBUTING:alice_contribute_contributing', inputs={'repo': AliceGitRepo}, outputs={'result': ContributingGitRepo}, stage=, conditions=[], expand=[], instance_name='alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayCONTRIBUTING:alice_contribute_contributing', validator=False, retry=0)) +DEBUG:dffml.MemoryOperationImplementationNetworkContext:--- +DEBUG:dffml.MemoryOperationImplementationNetworkContext:alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayCONTRIBUTING:alice_contribute_contributing Stage: PROCESSING: alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayCONTRIBUTING:alice_contribute_contributing +DEBUG:dffml.MemoryOperationImplementationNetworkContext:alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayCONTRIBUTING:alice_contribute_contributing Inputs: {'repo': GitRepoSpec(directory='/tmp/dffml-feature-git-rrflb9gm', URL='https://github.com/pdxjohnny/testaaaa')} +DEBUG:dffml.MemoryOperationImplementationNetworkContext:alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayCONTRIBUTING:alice_contribute_contributing Conditions: {} +DEBUG:dffml.MemoryInputNetworkContext:Added to GitRepoSpec(directory='/tmp/dffml-feature-git-rrflb9gm', URL='https://github.com/pdxjohnny/testaaaa'): Input(value=GitRepoSpec(directory='/tmp/dffml-feature-git-rrflb9gm', URL='https://github.com/pdxjohnny/testaaaa'), definition=ContributingGitRepo) +DEBUG:dffml.MemoryInputNetworkContext:Added to GitRepoSpec(directory='/tmp/dffml-feature-git-rrflb9gm', URL='https://github.com/pdxjohnny/testaaaa'): Input(value=origin, definition=writable.github.remote.origin) +DEBUG:dffml.MemoryInputNetworkContext:Added to GitRepoSpec(directory='/tmp/dffml-feature-git-rrflb9gm', 
URL='https://github.com/pdxjohnny/testaaaa'): Input(value=master, definition=repo.git.base.branch) +DEBUG:dffml.MemoryOperationImplementationNetworkContext:alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayCONTRIBUTING:alice_contribute_contributing Outputs: None +DEBUG:dffml.MemoryOperationImplementationNetworkContext:--- +DEBUG:dffml.MemoryLockNetworkContext:Acquiring: 6fc55525-c499-421c-8b07-497dd277b1ff(GitRepoSpec(directory='/tmp/dffml-feature-git-rrflb9gm', URL='https://github.com/pdxjohnny/testaaaa')) (now held by Operation(name='alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayGit:create_branch_if_none_exists', inputs={'repo': AliceGitRepo, 'name': default.branch.name}, outputs={'result': git_branch}, stage=, conditions=[], expand=[], instance_name='alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayGit:create_branch_if_none_exists', validator=False, retry=0)) +DEBUG:dffml.MemoryOperationImplementationNetworkContext:--- +DEBUG:dffml.MemoryOperationImplementationNetworkContext:alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayGit:create_branch_if_none_exists Stage: PROCESSING: alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayGit:create_branch_if_none_exists +DEBUG:dffml.MemoryOperationImplementationNetworkContext:alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayGit:create_branch_if_none_exists Inputs: {'repo': GitRepoSpec(directory='/tmp/dffml-feature-git-rrflb9gm', URL='https://github.com/pdxjohnny/testaaaa'), 'name': 'main'} +DEBUG:dffml.MemoryOperationImplementationNetworkContext:alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayGit:create_branch_if_none_exists Conditions: {} +DEBUG:dffml_feature_git.util:proc.create: ('git', 'branch', '-r') 
+DEBUG:dffml.MemoryOperationImplementationNetworkContext:alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayGit:create_branch_if_none_exists Outputs: None +DEBUG:dffml.MemoryOperationImplementationNetworkContext:--- +DEBUG:dffml.MemoryLockNetworkContext:Acquiring: 6fc55525-c499-421c-8b07-497dd277b1ff(GitRepoSpec(directory='/tmp/dffml-feature-git-rrflb9gm', URL='https://github.com/pdxjohnny/testaaaa')) (now held by Operation(name='alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:alice_contribute_readme', inputs={'repo': AliceGitRepo}, outputs={'result': ReadmeGitRepo}, stage=, conditions=[], expand=[], instance_name='alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:alice_contribute_readme', validator=False, retry=0)) +DEBUG:dffml.MemoryOperationImplementationNetworkContext:--- +DEBUG:dffml.MemoryOperationImplementationNetworkContext:alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:alice_contribute_readme Stage: PROCESSING: alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:alice_contribute_readme +DEBUG:dffml.MemoryOperationImplementationNetworkContext:alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:alice_contribute_readme Inputs: {'repo': GitRepoSpec(directory='/tmp/dffml-feature-git-rrflb9gm', URL='https://github.com/pdxjohnny/testaaaa')} +DEBUG:dffml.MemoryOperationImplementationNetworkContext:alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:alice_contribute_readme Conditions: {} +DEBUG:dffml.MemoryInputNetworkContext:Added to GitRepoSpec(directory='/tmp/dffml-feature-git-rrflb9gm', URL='https://github.com/pdxjohnny/testaaaa'): Input(value=GitRepoSpec(directory='/tmp/dffml-feature-git-rrflb9gm', 
URL='https://github.com/pdxjohnny/testaaaa'), definition=ReadmeGitRepo) +DEBUG:dffml.MemoryInputNetworkContext:Added to GitRepoSpec(directory='/tmp/dffml-feature-git-rrflb9gm', URL='https://github.com/pdxjohnny/testaaaa'): Input(value=origin, definition=writable.github.remote.origin) +DEBUG:dffml.MemoryInputNetworkContext:Added to GitRepoSpec(directory='/tmp/dffml-feature-git-rrflb9gm', URL='https://github.com/pdxjohnny/testaaaa'): Input(value=master, definition=repo.git.base.branch) +DEBUG:dffml.MemoryOperationImplementationNetworkContext:alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:alice_contribute_readme Outputs: None +DEBUG:dffml.MemoryOperationImplementationNetworkContext:--- +DEBUG:dffml.MemoryInputNetworkContext:Received https://github.com/pdxjohnny/testaaaa result {} from +DEBUG:dffml.MemoryInputNetworkContext:Received https://github.com/pdxjohnny/testaaaa result {} from +DEBUG:dffml.MemoryOperationImplementationNetworkContext:alice.please.contribute.recommended_community_standards.cli.OverlayCLI:cli_run_on_repo Outputs: None +DEBUG:dffml.MemoryOperationImplementationNetworkContext:--- +DEBUG:dffml.MemoryOrchestratorContext:ctx.outstanding: 1 +DEBUG:dffml.MemoryInputNetworkContext:Received 9eda82af632e2587d31fcd06d5fb0bfb1df47c4a8383e6a998f26c7c4906a86b result {} from +DEBUG:dffml.MemoryOrchestratorContext:ctx.outstanding: 0 +https://github.com/pdxjohnny/testaaaa {} +9eda82af632e2587d31fcd06d5fb0bfb1df47c4a8383e6a998f26c7c4906a86b {} +``` + +- Want to understand why we are not waiting for the contexts to complete which were added + in above diff and logs. + - Fallback plan is to call both from a function in a separate overlay until it's working + this will just call `run_custom` via a helper function for both README and CONTRIBUTING + overlays. 
+ - Going to write this first, then contributing new file tutorial + - Then tutorial on `alice shouldi contribute` with overlay addition via installed to entrypoint + - Then test with ability to add overlays via CLI as one offs + - Final bit of each tutorial is to add to this fallback overlay + - If we still have time before 8 AM then we'll try to debug +- alice: please: contribute: recommended community standards: readme: Scope PR title and body definitions + - 1cf1d73bcdb8f0940c02e01dec1e26253c2ec4cf +- Tried with `dffml.run()`, it worked right away. Going with this. + - 1bf5e4c9a4eae34f30f9c4b5c9a04d09d6a11c6e + - alice: please: contribute: recommended community standards: readme: Use dffml.subflow_typecast to execute README contribution + - 85d57ad8989bfb12d5fe0fb6eec21002ce75f271 + - high level: subflow typecast: Basic OpImpCtx helper + - 8c0531e5364c09fec019d1971e4033401bfcbd2b + - overlay: static overlay application with loading entrypoint dataflow class with overlays applied. + - af4306a500daf11ba3c4c3db39c1da9879456d12 + - alice: please: contribute: recommended community standards: Disable OverlayMetaIssue in default installed set of overlays + + +### How to help Alice contribute more files + +This tutorial will help you create a new Open Architecture / Alice +overlay which runs when another flow runs. The upstream flow +in this case is the `AlicePleaseContributeRecommendedCommunityStandards` +base flow. + +- Copy readme overlay to new file + +```console +$ cp alice/please/contribute/recommended_community_standards/readme.py alice/please/contribute/recommended_community_standards/contribute.py +``` + +- Rename types, classes, variables, etc. 
+ +```console +$ sed -e 's/Readme/Contributing/g' -e 's/README/CONTRIBUTING/g' -e 's/readme/contributing/g' -i alice/please/contribute/recommended_community_standards/contribute.py +``` + +```diff +diff --git a/entities/alice/entry_points.txt b/entities/alice/entry_points.txt +index 129b2866a1..9e130cb3b2 100644 +--- a/entities/alice/entry_points.txt ++++ b/entities/alice/entry_points.txt +@@ -9,6 +9,7 @@ CLI = alice.please.contribute.recomme + OverlayGit = alice.please.contribute.recommended_community_standards.recommended_community_standards:OverlayGit + OverlayGitHub = alice.please.contribute.recommended_community_standards.recommended_community_standards:OverlayGitHub + OverlayREADME = alice.please.contribute.recommended_community_standards.recommended_community_standards:OverlayREADME ++OverlayCONTRIBUTING = alice.please.contribute.recommended_community_standards.recommended_community_standards:OverlayCONTRIBUTING + # OverlayMetaIssue = alice.please.contribute.recommended_community_standards.recommended_community_standards:OverlayMetaIssue + + [dffml.overlays.alice.please.contribute.recommended_community_standards.overlay.readme] +``` + +**dffml.git/entities/alice/entry_points.txt** + +```ini +[dffml.overlays.alice.please.contribute.recommended_community_standards.overlay.contributing] +OverlayGit = alice.please.contribute.recommended_community_standards.recommended_community_standards:OverlayGit +OverlayGitHub = alice.please.contribute.recommended_community_standards.recommended_community_standards:OverlayGitHub +``` + +- Reinstall for new entrypoints to take effect + +```console +$ python -m pip install -e .
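+$ # Hypothetical sanity check (not part of the original tutorial): list the
+$ # installed entrypoint groups matching this flow, using only the Python
+$ # standard library, to confirm the reinstall registered the new overlay
+$ # group named in entry_points.txt above.
+$ python -c "import importlib.metadata as m; eps = m.entry_points(); groups = eps.groups if hasattr(eps, 'groups') else eps; print(sorted(g for g in groups if 'recommended_community_standards' in g))"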
+``` + +- Re-run the command and observe results + +```console +for pr in $(gh -R https://github.com/pdxjohnny/testaaaa pr list --json number --jq '.[].number'); do gh -R https://github.com/pdxjohnny/testaaaa pr close "${pr}"; done; (alice please contribute -log debug -repos https://github.com/pdxjohnny/testaaaa -- recommended community standards; gh -R https://github.com/pdxjohnny/testaaaa pr list) +``` + +![Screenshot showing pull request for adding README.md and CONTRIBUTING.md and CODE_OF_CONDUCT.md files](https://user-images.githubusercontent.com/5950433/181826046-53ae3ef5-6750-48ad-afd2-8cf9174e0b63.png) + +### Script to test Coach Alice Our Open Source Guide tutorial + +```bash +#!/usr/bin/env bash +set -x +set -e + +# export USER=githubusername +export REPO_URL="https://github.com/$USER/my-new-python-project" + +cd $(mktemp -d) + +git clone --depth=1 -b alice https://github.com/intel/dffml dffml +cd dffml/entities/alice +python -m venv .venv +. .venv/bin/activate +python -m pip install -U pip setuptools wheel +python -m pip install \ + -e .[dev] \ + -e ../../ \ + -e ../../examples/shouldi/ \ + -e ../../feature/git/ \ + -e ../../operations/innersource/ \ + -e ../../configloader/yaml/ + +gh repo create -y --private "${REPO_URL}" +git clone "${REPO_URL}" +cd my-new-python-project +echo 'print("Hello World")' > test.py +git add test.py +git commit -sam 'Initial Commit' +git push --set-upstream origin $(git branch --show-current) +cd ..
+rm -rf my-new-python-project + +cp alice/please/contribute/recommended_community_standards/readme.py alice/please/contribute/recommended_community_standards/code_of_conduct.py + +sed -e 's/Readme/CodeOfConduct/g' -e 's/README/CODE_OF_CONDUCT/g' -e 's/readme/code_of_conduct/g' -i alice/please/contribute/recommended_community_standards/code_of_conduct.py + +sed -i 's/OverlayREADME .*/&\nOverlayCODE_OF_CONDUCT = alice.please.contribute.recommended_community_standards.code_of_conduct:OverlayCODE_OF_CONDUCT/' entry_points.txt + +tee -a entry_points.txt << 'EOF' + +[dffml.overlays.alice.please.contribute.recommended_community_standards.code_of_conduct] +OverlayGit = alice.please.contribute.recommended_community_standards.recommended_community_standards:OverlayGit +OverlayGitHub = alice.please.contribute.recommended_community_standards.recommended_community_standards:OverlayGitHub +EOF + +python -m pip install -e . + +alice please contribute -log debug -repos "${REPO_URL}" -- recommended community standards + +gh -R "${REPO_URL}" pr list +# 343 Recommended Community Standard: README alice-contribute-recommended-community-standards-readme OPEN +# 341 Recommended Community Standard: CONTRIBUTING alice-contribute-recommended-community-standards-contributing OPEN +# 339 Recommended Community Standard: CODE_OF_CONDUCT alice-contribute-recommended-community-standards-code_of_conduct OPEN + +for pr in $(gh -R "${REPO_URL}" pr list --json number --jq '.[].number'); +do + gh -R "${REPO_URL}" pr close "${pr}" +done +``` + +- The Alice codebase + +```console +$ find alice/please/ -type f | grep -v __init +alice/please/contribute/recommended_community_standards/contributing.py +alice/please/contribute/recommended_community_standards/cli.py +alice/please/contribute/recommended_community_standards/readme.py +alice/please/contribute/recommended_community_standards/meta_issue.py +alice/please/contribute/recommended_community_standards/recommended_community_standards.py +``` + +### TODOs 
+ +- Explain how to add more top-level Alice CLI commands +- Explain how to overlay shouldi flows beyond standard DFFML docs. \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0011/index.md b/docs/discussions/alice_engineering_comms/0011/index.md new file mode 100644 index 0000000000..7bdcbf97de --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0011/index.md @@ -0,0 +1,7 @@ +# 2022-08-22 Engineering Logs + +- SCITT + - https://notes.ietf.org/notes-ietf-114-scitt + - https://youtu.be/6B8Bv0naAIA + - https://mailarchive.ietf.org/arch/msg/scitt/b1bvDwutpAdLI7sa7FzXrtkY_m0/ + - https://mailarchive.ietf.org/arch/msg/scitt/iEAhuuicVxgoXJiAZIGmpZOctcc/# \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0011/reply_0000.md b/docs/discussions/alice_engineering_comms/0011/reply_0000.md new file mode 100644 index 0000000000..b0bd23771f --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0011/reply_0000.md @@ -0,0 +1,12 @@ +## 2022-08-22 @pdxjohnny Engineering Logs + +- SCITT + - https://notes.ietf.org/notes-ietf-114-scitt + - https://youtu.be/6B8Bv0naAIA + - https://mailarchive.ietf.org/arch/msg/scitt/b1bvDwutpAdLI7sa7FzXrtkY_m0/ + - https://mailarchive.ietf.org/arch/msg/scitt/iEAhuuicVxgoXJiAZIGmpZOctcc/# +- TODO + - [ ] Update with some of the very spotty wording above and try to flesh it out with more conceptual meat now that the tone is established / future John has an example to work with.
+ - https://github.com/intel/dffml/commit/9aeb7f19e541e66fc945c931801215560a8206d7 + - [ ] Update somewhere else in Vol 1 to include from + - https://github.com/intel/dffml/blob/alice/docs/arch/alice/discussion/0015/reply_0002.md \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0012/index.md b/docs/discussions/alice_engineering_comms/0012/index.md new file mode 100644 index 0000000000..25acbbeb5c --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0012/index.md @@ -0,0 +1,17 @@ +# 2022-08-24 Engineering Logs + +- SCITT + - https://mailarchive.ietf.org/arch/msg/scitt/R56CX1LqSgDBRCzZIk3pZnJEV_c/ + - “ +In summary, a NIST Vulnerability Disclosure Report (VDR) is an attestation +by a software vendor showing that the vendor has checked each component of a +software product SBOM for vulnerabilities and reports on the details of any +vulnerabilities reported by a NIST NVD search. The VDR is a living document +which the software vendor updates as needed when new vulnerabilities have +been discovered and reported. A VDR is published whenever a software vendor +issues a new or updated SBOM, including initial product release, making it +available online, all the time, to all customers of the product described in +the VDR. This gives software consumers that ability to answer the question +"What is the vulnerability status of my software product from Vendor V, as +of NOW?".” + - From VEX to VDR? 
Let's dive in more next week \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0012/reply_0000.md b/docs/discussions/alice_engineering_comms/0012/reply_0000.md new file mode 100644 index 0000000000..8df60787eb --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0012/reply_0000.md @@ -0,0 +1,14 @@ +## 2022-08-24 @sedihglow Engineering Logs + +- Alice + - Ran through contributing setup on local PC + - https://github.com/intel/dffml/blob/alice/entities/alice/CONTRIBUTING.rst#cloning-the-repo +- [ ] `alice please build if needed and run /path/to/repo` + - Try two different repos, mainly focused on C + - https://github.com/sedihglow/rpi4 + - https://github.com/sedihglow/red_black_tree + +```console +$ sudo apt-get update && sudo apt-get install -y tmux python3.9 python3-pip python3.9-venv python3.9-dev build-essential +$ sudo update-alternatives --install /usr/local/bin/python python /usr/bin/python3.9 1 +update-alternatives: using /usr/bin/python3.9 to provide /usr/local/bin/python (python) in auto mode +``` \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0012/reply_0001.md b/docs/discussions/alice_engineering_comms/0012/reply_0001.md new file mode 100644 index 0000000000..e769d86715 --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0012/reply_0001.md @@ -0,0 +1 @@ +https://datatracker.ietf.org/doc/draft-ietf-rats-architecture/ \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0013/index.md b/docs/discussions/alice_engineering_comms/0013/index.md new file mode 100644 index 0000000000..c7ac1eb1ba --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0013/index.md @@ -0,0 +1,32 @@ +- Policy + - ABC’s of Conformity Assessment + - https://nvlpubs.nist.gov/nistpubs/SpecialPublications/NIST.SP.2000-01.pdf + - This might be helpful later when we write docs for / think about how to apply policy (see vol 0 introduction arch diagram) +- SCITT + - Zachary Newman shared that looking at OpenSSF / SCITT terminology he ran into the same topics that we
did when we brought up using shared underlying protocols and formats in the [2022-07-25 SCITT meeting](https://github.com/intel/dffml/discussions/1406#discussioncomment-3223361) when talking about RATs style attestation vs SLSA/in-toto/sigstore style. + - https://mailarchive.ietf.org/arch/msg/scitt/utSOqlCifoorbqUGWNf-wMlBYR4/ + - Dick agrees with Zach's analysis. "I've also been monitoring the OpenSSF Scorecard initiative, which goes beyond sigstore attestation checking to assign a "trust score". Not sure if this has traction, but there is a lot of activity on github. https://github.com/ossf/scorecard/blob/main/README.md#basic-usage OpenSSF does NOT appear to be following/implementing NIST C-SCRM recommendations and standards for Executive Order 14028 and consumer software labeling and other attestation recommendations; https://www.nist.gov/document/software-supply-chain-security-guidance-under-executive-order-eo-14028-section-4e" [Dick Brooks] + - Commit message to charter + - > The Endor POC by `@OR13` was exemplary because there was a low amount of abstraction / extra information / steps introduced for the learner to understand the sequence of data transformations involved. It makes clear the contents of the serialization of choice (DIDs + VCs in Endor's case) and how that varies across the steps. The POC provided immediate value on the mailing list in a way that examples which introduce more abstraction layers are unable to do as quickly. + > + > We apply our recent learning from this success by adding to the charter the production of a similar example which in this patch we call "file-based", but we could change that to a more descriptive term if there is one. Having an example similar to the learning methodology presented via Endor would accelerate the pace at which developers up and down the software stack and in different programming languages would be able to adopt SCITT. 
This is due to the low level of abstraction introduced by its file and shell based implementation. Files and shell commands translate easily into other languages where they can be slowly swapped out from initial fork/exec and equivalents to language code. + > + > The SCITT community could potentially provide documentation on how the fork/exec style implementation could be transformed into the HTTP server implementation. Due to the generic nature of SCITT and the many touchpoints various software systems will likely have with it in the future, it is important for us to consider as a part of our threat model the effect cohesive example documentation has on the correctness of downstream implementations. Providing cohesive examples where we start with the basics (file-based), moving to an example environment implementers are likely to be working in (HTTP-based), and finally explaining how we went from the basic to the complex would give a robust view of what SCITT should look like to implementers and provide them with a clear path to a hopefully correct implementation. + > + > More cohesive documentation will reduce the number of security vulnerabilities we see in our community's code, code which is fundamentally about security in nature. This modification to the charter seeks to act on recent learnings around example code experienced within the SCITT community itself and seeks to contribute to the development of our threat model as we think about SCITT's lifecycle and rollout. + - For this reason I propose we + - where they will be creating the future of SCITT's robust, actively maintained solutions. + - https://mailarchive.ietf.org/arch/msg/scitt/Hz9BSiIN7JHAgsZL6MuDHK4p7P8/ + - https://github.com/OR13/endor + - This is a learning methodology goldmine.
+ - https://github.com/ietf-scitt/charter/pull/21 + - https://mailarchive.ietf.org/arch/msg/scitt/B9cwkueu3gdQ7lBKkhILcFLD0E4/ +- RATS + - https://datatracker.ietf.org/doc/draft-ietf-rats-architecture/ +- SBOM + - We [DFFML community] intend to use the "living" SBOM VDR capabilities to facilitate the breathing of life into our living threat models. This will allow us to facilitate vulns on architecture. + - https://spdx.github.io/spdx-spec/v2.3/ + - https://energycentral.com/c/pip/what-nist-sbom-vulnerability-disclosure-report-vdr + - > The recommendation by NIST to provide software consumers with a NIST VDR is gaining traction as a best practice. The latest version of the SPDX SBOM standard, version 2.3, includes provisions (K.1.9) enabling a software vendor to associate a specific SBOM document for a software product with its online NIST VDR attestation for that product, which is linked within the SBOM. The link refers to a “living” SBOM VDR document that is updated by a software vendor, whenever new vulnerabilities are reported. Having this “always updated NIST VDR” available enables software consumers to answer the question “What is the vulnerability status of my software product from Vendor V, as of NOW?”, providing consumers with on-going, up-to-date visibility into the risks that may be present in an installed software product, as new vulnerabilities (CVE's) are being reported/released. + > + > As stated previously, NIST did not prescribe a format for a NIST VDR attestation, but guidance is provided on what a VDR includes. Reliable Energy Analytics (REA) has produced an open-source “interpretation” of what a NIST VDR contains in order to meet EO 14028, which is available here in an XML Schema format with samples provided in XML and JSON (https://raw.githubusercontent.com/rjb4standards/REA-Products/master/SBOMVDR_JSON/VDR_118.json) formats. 
\ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0013/reply_0000.md b/docs/discussions/alice_engineering_comms/0013/reply_0000.md new file mode 100644 index 0000000000..b2e6bf918d --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0013/reply_0000.md @@ -0,0 +1,132 @@ +## 2022-08-29 @pdxjohnny Engineering Logs + +- Notes to self + - Watched the progress report videos to make sure I know where we're at, thanks past Johns and others + - Realized we should use `CITATION.cff` instead of `myconfig.json` in the examples under today's TODOs + - They seem to form a cohesive, if a bit rambling, picture. + - Reminded me why I quit caffeine. Sleep is important. + - We could probably do with a 1 minute explainer video on what Alice is + - Below "Status" would probably be a good way to start the day tomorrow as the 1 minute video with a brief bit about what Alice is at the beginning. + - Alice is our developer helper. We extend her to help us understand and perform various parts of the software development lifecycle. We extend her by writing simple Python functions which are easy for anyone to distribute or combine. She is based on a programming language agnostic format known as the Open Architecture. Eventually we will be able to extend any part of her in any language, or have parts driven by machine learning models. +- SCITT + - Watched https://www.youtube.com/watch?v=6B8Bv0naAIA&list=PLtzAOVTpO2jYt71umwc-ze6OmwwCIMnLw&t=1320s + - SCITT Architecture + - ![image](https://user-images.githubusercontent.com/5950433/187310016-472934fb-e5cc-47e8-875d-a5ea93592074.png) + - Dick's comment here on verification is related to a statement I'd made earlier today + - https://www.youtube.com/watch?v=6B8Bv0naAIA&list=PLtzAOVTpO2jYt71umwc-ze6OmwwCIMnLw&t=1584s + - https://github.com/ietf-scitt/charter/pull/18/files#r957557301 + - Roy + - In the case of the notary we have the opportunity to allow for claims that last longer than they are supposed to.
The notary concept will allow his buddies to control certs (effectively) on their server side. + - Answer to: How's this related to sigstore? + - In SCITT sigstore would send contents to a SCITT instance and then the notary would put it on a ledger + - In the case of SLSA they also submit to the SCITT store, it looks like at the moment they just plug into one another + - Concerns that we are too software centric with the current prospective charter. + - Point taken but they can't increase scope more. + - We want to align efforts across SCITT and OpenSSF to ensure we all work in the same directions + - We can expand to non software use cases later if we flesh this out as is first and make sure to design it with extensibility in mind. + - Reviewed https://github.com/ietf-scitt/charter/pull/18/files#diff-7dc19c29f46d126113e2e7fb7b70710fd0fd3100c95564297664f8ceae8c653eR8 + - "For example, a public computer interface system could report its software composition, which can be compared against known software compositions for such a device, as recorded in a public append-only transparent registry." (https://github.com/ietf-scitt/charter/tree/60e628f1d718b69dc0d02f7a8168a5485f818201) + - This sounds very similar to something we've talked about before, which may be in a stream recording, of how we identify the devices which aren't known to be running the "machines serve humans" rule, etc. + - This is important for either SCITT or OA to address + - https://github.com/ietf-scitt/charter/pull/18#pullrequestreview-1089013246 +- Status + - We want to make sure the contribution process works and is clear. Then we will move on to the data collection portion. Remember we are working over time. We are building the entity at the center of the Trinity, Alice. Please contribute falls under our Static Analysis portion. The Open Architecture, SCITT, and SBOM are all used in our top portion, Intent.
We are building the entity using the architecture which we will use to represent the findings of our static and dynamic analysis. + - Alice can make contributions, we've laid the foundations for the automation of the software development process. Our next step is to help her understand what she's looking at, what is the code, how can she use the source Luke? Later we'll get into more details on the dynamic analysis portion of the Trinity, where we'll work, over time, across many program executions of the code we are working on, to understand how its execution maps to the work that we're doing via our understanding of what we've done (`please contribute`) and what we're doing it on (`alice shouldi contribute`). + - As such our top priorities right now are + - Ensuring the contribution process to what exists (`alice please contribute`) is rock solid. + - Building out and making `alice shouldi contribute` accessible and ready for contribution. + - Engaging with those that are collecting metrics (https://metrics.openssf.org) and ensuring our work on metric collection bears fruit. + - Following our engagement on the metric collection front we will perform analysis to determine how to best target further `alice please contribute` efforts and align the two with a documented process on how we select high value targets so that others can pick up and run with extending. + - Participating organizations in parallel begin automated outreach via Alice please contribute +- Game plan. + - [x] `alice please contribute` + - [x] Contribution ready + - [ ] Demo on stream of how to write, install, and publish a third party overlay + - Have the overlay be a function which outputs a return type of `ContributingContents` and takes the name of the project given in a `CITATIONS.cff` file of the CONTRIBUTING example.
+ - https://www.youtube.com/watch?v=TMlC_iAK3Rg&list=PLtzAOVTpO2jYt71umwc-ze6OmwwCIMnLw&index=5&t=2303 + - https://github.com/intel/dffml/blob/9aeb7f19e541e66fc945c931801215560a8206d7/entities/alice/alice/please/contribute/recommended_community_standards/contributing.py#L48-L54 + - [ ] Demo on stream how to write, install, and contribute a 1st/2nd party overlay, the same code just not third party, from start to finish. + - [ ] `alice shouldi contribute` + - [ ] Support caching / import / export dataflows + - [ ] Support query in an easy way (GraphQL) + - [ ] Support joining with previous runs / more sets of data + - [ ] Contribute the data OpenSSF cares about to their DB via applicable joins and queries + - [ ] Email Christine and CRob +- TODO + - [ ] Organization + - [ ] Daily addition by Alice to engineering log following template + - [ ] Addition of old TODOs from yesterday's logs + - [ ] Export end state of input network / dump everything used by orchestrator + - [ ] pickle + - [ ] JSON + - [ ] Ensure import works (check for state reset in `__aenter__()`; we probably need a generic wrapper to save the memory ones which populates after the `__aenter__()` of the wrapped object.)
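The generic wrapper idea in the TODO above can be sketched as follows. This is a minimal sketch, not DFFML API: `SnapshotWrapper`, `MemoryStore`, and the `state_getter` callable are hypothetical names chosen for illustration.

```python
import asyncio


class SnapshotWrapper:
    """Wrap an async context manager and snapshot its state.

    The snapshot is taken only after the wrapped object's __aenter__()
    returns, so any state reset done in there is captured rather than
    clobbered. The captured state could later be dumped via pickle or JSON.
    """

    def __init__(self, wrapped, state_getter):
        self.wrapped = wrapped
        self.state_getter = state_getter
        self.snapshot = None

    async def __aenter__(self):
        entered = await self.wrapped.__aenter__()
        self.snapshot = self.state_getter(entered)
        return entered

    async def __aexit__(self, exc_type, exc, tb):
        return await self.wrapped.__aexit__(exc_type, exc, tb)


class MemoryStore:
    """Toy stand-in for an in-memory input network that resets on entry."""

    async def __aenter__(self):
        self.items = {"seed": 1}  # state reset happens here
        return self

    async def __aexit__(self, exc_type, exc, tb):
        return False


async def main():
    wrapper = SnapshotWrapper(MemoryStore(), lambda store: dict(store.items))
    async with wrapper as store:
        # Mutations after entry do not affect the snapshot taken on entry
        store.items["new"] = 2
    return wrapper.snapshot


snapshot = asyncio.run(main())
```

Since the snapshot here is a plain dict, it could be handed straight to `json.dump()` or `pickle.dump()`, which would line up with the pickle/JSON export TODOs above.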
+ - [ ] GraphQL query of cached state using the strawberry library or something like that + - [ ] Example docs for how to run a flow, then merge with static data as the start state for the cache and then query the whole bit with GraphQL + +--- + +Title: Software Supply Chain Security Guidance Under Executive Order (EO) 14028 +Section 4e +February 4, 2022 +Source: https://www.nist.gov/system/files/documents/2022/02/04/software-supply-chain-security-guidance-under-EO-14028-section-4e.pdf + +Terminology +Section 4e uses several terms, including “conformity,” “attestation,” and “artifacts.” Because EO 14028 +does not define these terms, this guidance presents the following definitions from existing standards +and guidance: +• Conformity assessment is a “demonstration that specified requirements are fulfilled.” [ISO/IEC +17000] In the context of Section 4e, the requirements are secure software development +practices, so conformity assessment is a demonstration that the software producer has followed +secure software development practices for their software. +• Attestation is the “issue of a statement, based on a decision, that fulfillment of specified +requirements has been demonstrated.” [ISO/IEC 17000] +o If the software producer itself attests that it conforms to secure software development +practices, this is known by several terms, including first-party attestation, self-attestation, declaration, and supplier’s declaration of conformity (SDoC). +o If the software purchaser attests to the software producer’s conformity with secure +software development practices, this is known as second-party attestation. +o If an independent third-party attests to the software producer’s conformity with secure +software development practices, this is known as third-party attestation or +certification.
+• An artifact is “a piece of evidence.” [adapted from NISTIR 7692] Evidence is “grounds for belief +or disbelief; data on which to base proof or to establish truth or falsehood.” [NIST SP 800-160 +Vol. 1] Artifacts provide records of secure software development practices. +o Low-level artifacts will be generated during software development, such as threat +models, log entries, source code files, source code vulnerability scan reports, testing +results, telemetry, or risk-based mitigation decisions for a particular piece of software. +These artifacts may be generated manually or by automated means, and they are +maintained by the software producer. +o High-level artifacts may be generated by summarizing secure software development +practices derived from the low-level artifacts. An example of a high-level artifact is a +publicly accessible document describing the methodology, procedures, and processes a +software producer uses for its secure practices for software development. +The following subsections of EO 14028 Section 4e use these terms: +(ii) generating and, when requested by a purchaser, providing artifacts that demonstrate +conformance to the processes set forth in subsection (e)(i) of this section; +(v) providing, when requested by a purchaser, artifacts of the execution of the tools and +processes described in subsection (e)(iii) and (iv) of this section, and making publicly available +summary information on completion of these actions, to include a summary description of the +risks assessed and mitigated; +(ix) attesting to conformity with secure software development practices; +In other words, when a federal agency (purchaser) acquires software or a product containing software, +the agency should receive attestation from the software producer that the software’s development +complies with government-specified secure software development practices. 
The federal agency might +also request artifacts from the software producer that support its attestation of conformity with the +secure software development practices described in Section 4e subsections (i), (iii), and (iv), which are +listed here: +(i) secure software development environments, including such actions as: +(A) using administratively separate build environments; +(B) auditing trust relationships; +(C) establishing multi-factor, risk-based authentication and conditional access across the +enterprise; +(D) documenting and minimizing dependencies on enterprise products that are part of +the environments used to develop, build, and edit software; +(E) employing encryption for data; and +(F) monitoring operations and alerts and responding to attempted and actual cyber +incidents; +(iii) employing automated tools, or comparable processes, to maintain trusted source code +supply chains, thereby ensuring the integrity of the code; +(iv) employing automated tools, or comparable processes, that check for known and potential +vulnerabilities and remediate them, which shall operate regularly, or at a minimum prior to +product, version, or update release; \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0014/index.md b/docs/discussions/alice_engineering_comms/0014/index.md new file mode 100644 index 0000000000..51c6443f5c --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0014/index.md @@ -0,0 +1 @@ +# 2022-08-30 Engineering Logs \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0014/reply_0000.md b/docs/discussions/alice_engineering_comms/0014/reply_0000.md new file mode 100644 index 0000000000..d9f4be90e4 --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0014/reply_0000.md @@ -0,0 +1,126 @@ +## 2022-08-30 @pdxjohnny Engineering Logs + +- SCITT + - Responded to review from Henk + - Questions around meaning of term "file-based" + - The intent of using the term "file-based"
was to have an example working with a static serialized form rather than working with a dynamic abstraction layer such as HTTP. + - Updated both lines based on Henk's feedback into one line which addresses the core concern around ensuring the documentation is complete so we end up with a higher likelihood of solid implementations. + - > HTTP-based REST API for Request-Response Interactions including a critical mass of examples as implementation guidance + - https://github.com/ietf-scitt/charter/pull/21#pullrequestreview-1089717428 +- Game plan + - [x] `alice please contribute` + - [x] Contribution ready + - [ ] Demo on stream of how to write, install, and publish a third party overlay + - Have the overlay be a function which outputs a return type of `ContributingContents` and takes the name of the project given in a `CITATIONS.cff` file of the CONTRIBUTING example. + - https://github.com/johnlwhiteman/living-threat-models/blob/c027d4e319c715adce104b95f1e88623e02b0949/CITATION.cff + - https://www.youtube.com/watch?v=TMlC_iAK3Rg&list=PLtzAOVTpO2jYt71umwc-ze6OmwwCIMnLw&index=5&t=2303 + - https://github.com/intel/dffml/blob/9aeb7f19e541e66fc945c931801215560a8206d7/entities/alice/alice/please/contribute/recommended_community_standards/contributing.py#L48-L54 + - [ ] Demo on stream how to write, install, and contribute a 1st/2nd party overlay, the same code just not third party, from start to finish.
+ - [ ] `alice shouldi contribute` + - [ ] Support caching / import / export dataflows + - [ ] Support query in an easy way (GraphQL) + - [ ] Support joining with previous runs / more sets of data + - [ ] Contribute the data OpenSSF cares about to their DB via applicable joins and queries + - [ ] Email Christine and CRob +- TODO + - [ ] Organization + - [ ] Daily addition by Alice to engineering log following template + - [ ] Addition of old TODOs from yesterday's logs + - [ ] Export end state of input network / dump everything used by orchestrator + - [ ] pickle + - [ ] JSON + - [ ] Ensure import works (check for state reset in `__aenter__()`; we probably need a generic wrapper to save the memory ones which populates after the `__aenter__()` of the wrapped object.) + - [ ] GraphQL query of cached state using the strawberry library or something like that + - [ ] Example docs for how to run a flow, then merge with static data as the start state for the cache and then query the whole bit with GraphQL +- TODO + - [ ] How to Publish an Alice Overlay + - [ ] How to Contribute an Alice Overlay + - [ ] Rolling Alice: 2022 Progress Reports: August Status Update + - [ ] Rolling Alice: 2022 Progress Reports: August Activities Recap + +--- + +### How to Publish an Alice Overlay + +- Metadata + - Date: 2022-08-30 10:00 UTC -7 +- Docs we are following + - https://github.com/intel/dffml/blob/alice/entities/alice/CONTRIBUTING.rst + - https://github.com/intel/dffml/tree/alice/entities/alice#recommend-community-standards + - https://github.com/intel/dffml/blob/alice/docs/tutorials/rolling_alice/0001_coach_alice/0002_our_open_source_guide.md + +### How to Contribute an Alice Overlay + +- Metadata + - Date: 2022-08-30 10:00 UTC -7 + + +### Rolling Alice: 2022 Progress Reports: August Status Update + +- Metadata + - Date: 2022-08-30 16:28 UTC -7 +- https://www.youtube.com/watch?v=THKMfJpPt8I&list=PLtzAOVTpO2jYt71umwc-ze6OmwwCIMnLw&index=9 +- 
https://docs.google.com/presentation/d/1WBz-meM7n6nDe3-133tF1tlDQJ6nYYPySAdMgTHLb6Q/edit?usp=sharing +- https://gist.github.com/pdxjohnny/07b8c7b4a9e05579921aa3cc8aed4866 + - Progress report transcripts +- Hello entities of the internet! +- We're building Alice, an Open Artificial General Intelligence, and we invite you to join us. +- Today is Alice’s unbirthday. I’m going to tell you a little bit about Alice and the Open Architecture and give a brief status update on where we’re at and how you can get involved. +- Who is Alice? + - Alice will be our developer helper and one day a developer herself. She helps us understand and perform various parts of the software development lifecycle. + - We currently extend her by writing simple Python functions which can be distributed or combined in a decentralized way. + - She is built around a programming language agnostic format known as the Open Architecture. + - Eventually we will be able to extend any part of her in any language, or have parts be driven by machine learning models. +- What is the Open Architecture? + - It's the methodology that we use to interpret any domain specific description of architecture. + - We are developing the open architecture so that we can do a one hop analysis when looking at any piece of software from a security or other angle. + - Having this generic method to describe any system architecture allows us to knit them together and assess their risk and threat model from a holistic viewpoint. +- Why work on the Open Architecture? + - We want this to be a machine and human interpretable format so that we can facilitate the validation of the reality of the code as it exists in its static form, what it does when you execute it, and what we intend it to do. + - Intent in our case is measured by conformance to and completeness of the threat model, and therefore also the associated open architecture description.
+- The entity analysis Trinity + - The entity analysis Trinity helps us conceptualize our process. The points on our Trinity are Intent, Dynamic Analysis, and Static Analysis. + - By measuring and forming understanding in these areas we will be able to triangulate the strategic plans and principles involved in the execution of the software as well as its development lifecycle. + - We use the Trinity to represent the soul of the software. +- What happens when we work on Alice? + - We build up Alice's understanding of software engineering as we automate the collection of data which represents our understanding of it. + - We also teach her how to automate parts of the development process, making contributions and other arbitrary things. + - Over time we'll build up a corpus of training data from which we'll build machine learning models. + - We will eventually introduce feedback loops where these models make decisions about development / contribution actions to be taken when given a codebase. + - We want to make sure that when Alice is deciding what code to write and contribute, she is following our organizationally applicable policies, as outlined maybe in part via our threat model. +- Who is working on Alice? + - The DFFML community and anyone and everyone who would like to join us. + - Our objective is to build Alice with transparency, freedom, privacy, security, and egalitarianism as critical factors in her strategic principles. +- How does one get involved? + - You can get involved by engaging with the DFFML community via the following links + - Every time we contribute new functionality to Alice we write a tutorial on how that functionality can be extended and customized. + - We would love it if you joined us in teaching Alice something about software development, or anything, and teaching others in the process. + - It's as easy as writing a single function and explaining your thought process. + - The link on the left will take you to the code and tutorials.
+ - We are also looking for folks who would like to contribute by brainstorming and thinking about AI and especially AI ethics. + - The link on the right will take you to a document we are collaboratively editing and contributing to. +- Now for a status update. (Progress to date) + - Alice can make contributions, we've laid the foundations for the automation of the software development process. + - Our next step is to help her understand what she's looking at, what is the code, how can she use the source Luke? +- Plans + - As such our top priorities right now are + - Ensuring the contribution process to what exists (`alice please contribute`) is rock solid. + - Building out and making `alice shouldi contribute` accessible and ready for contribution. + - Engaging with those that are collecting metrics (https://metrics.openssf.org) and ensuring our work on metric collection bears fruit. + - Following our engagement on the metric collection front we will perform analysis to determine how to best target further `alice please contribute` efforts and align the two with a documented process on how we select high value targets so that others can pick up and run with extending. + - Participating organizations in parallel begin automated outreach via Alice please contribute + - Later we'll get into more details on the dynamic analysis portion of the Trinity, where we'll work, over time, across many program executions of the code we are working on, to understand how its execution maps to the work that we're doing via our understanding of what we've done (`please contribute`) and what we're doing it on (`alice shouldi contribute`). +- Unused + - Alice's contribution docs have been live for about a month. We're currently focused on making sure the contribution process works and is clear. Any and all feedback is appreciated. + - After we're sure that Alice's contribution docs are solid we'll begin focusing on her data mining capabilities.
+ - We are building the entity at the center of the software / entity analysis Trinity, Alice. + - The `alice please contribute` command falls under the Static Analysis point on the trinity. + - The Open Architecture, IETF SCITT, Web5, SBOM and other formats are all used or planned to be used in the top portion, Intent. + - We are building the entity using the architecture. The intermediate and serialized forms of the Open Architecture will be used to represent the findings of our static and dynamic analysis. +- TODO + - [x] Slide Deck + +### Rolling Alice: 2022 Progress Reports: August Activities Recap + +- Metadata + - Date: 2022-08-30 10:00 UTC -7 \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0014/reply_0001.md b/docs/discussions/alice_engineering_comms/0014/reply_0001.md new file mode 100644 index 0000000000..fe955c7614 --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0014/reply_0001.md @@ -0,0 +1 @@ +https://github.com/opensbom-generator/spdx-sbom-generator \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0014/reply_0002.md b/docs/discussions/alice_engineering_comms/0014/reply_0002.md new file mode 100644 index 0000000000..5c8df876e4 --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0014/reply_0002.md @@ -0,0 +1 @@ +https://huggingface.co/spaces/huggingface/diffuse-the-rest \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0015/index.md b/docs/discussions/alice_engineering_comms/0015/index.md new file mode 100644 index 0000000000..969e3ba9f3 --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0015/index.md @@ -0,0 +1,4 @@ +# 2022-08-31 Engineering Logs + +- SCITT + - https://github.com/ietf-scitt/charter/pull/21 \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0015/reply_0000.md b/docs/discussions/alice_engineering_comms/0015/reply_0000.md new file mode 100644 index 0000000000..00eaeb6637 ---
/dev/null +++ b/docs/discussions/alice_engineering_comms/0015/reply_0000.md @@ -0,0 +1,110 @@ +## 2022-08-31 @pdxjohnny Engineering Logs + +- Game plan + - [ ] `alice please contribute` + - [x] README + - [x] CONTRIBUTING + - [x] CODE_OF_CONDUCT + - https://www.youtube.com/watch?v=u2lGjMMIlAo&list=PLtzAOVTpO2ja6DXSCzoF3v_mQDh7l0ymH + - https://github.com/intel/dffml/commit/6c1719f9ec779a9d64bfb3b364e2c41c5ac9aab7 + - [ ] SECURITY + - [ ] SUPPORT + - [ ] CITATION.cff + - https://docs.github.com/en/repositories/managing-your-repositorys-settings-and-features/customizing-your-repository/about-citation-files + - auto populate with 000 UUIDs + - [ ] CODEOWNERS + - https://docs.github.com/en/repositories/managing-your-repositorys-settings-and-features/customizing-your-repository/about-code-owners + - [ ] Demo on stream of how to write, install, and publish a third party overlay + - Have the overlay be a function which outputs a return type of `ContributingContents` and takes the name of the project given in a `CITATIONS.cff` file as another example for our open source guide. + - https://docs.github.com/en/repositories/managing-your-repositorys-settings-and-features/customizing-your-repository/about-citation-files + - https://github.com/johnlwhiteman/living-threat-models/blob/c027d4e319c715adce104b95f1e88623e02b0949/CITATION.cff + - https://www.youtube.com/watch?v=TMlC_iAK3Rg&list=PLtzAOVTpO2jYt71umwc-ze6OmwwCIMnLw&index=5&t=2303 + - https://github.com/intel/dffml/blob/9aeb7f19e541e66fc945c931801215560a8206d7/entities/alice/alice/please/contribute/recommended_community_standards/contributing.py#L48-L54 + - [ ] Demo on stream of how to write, install, and contribute a 1st/2nd party overlay, the same code just not third party, from start to finish. 
+ - CITATION.cff + - [ ] `alice shouldi contribute` + - [ ] Support caching / import / export dataflows + - [ ] Support query in easy way (graphql) + - [ ] Support joining with previous runs / more sets of data + - [ ] Contribute the data OpenSSF cares about to their DB via applicable joins and queries + - [ ] Email Christine and CRob +- TODO + - [ ] Organization + - [ ] Daily addition by Alice to engineering log following template + - [ ] Addition of old TODOs from yesterday's logs + - [ ] Export end state of input network / dump everything used by orchestrator + - [ ] pickle + - [ ] JSON + - [ ] Ensure import works (check for state reset in `__aenter__()`); we probably need a generic wrapper to save the memory ones which populates after the `__aenter__()` of the wrapped object. + - [ ] GraphQL query of cached state using the strawberry library or something like that + - [ ] Example docs for how to run a flow, then merge with static data as the start state for the cache and then query the whole bit with graphql +- TODO + - [x] Splice out Code of Conduct contribution demo from July progress report video + - [x] Add PR and reference PR as example in tutorial along with spliced out `alice please contribute recommended community standards` contribution demo clip + - [ ] How to Publish an Alice Overlay + - [ ] How to Contribute an Alice Overlay + - [ ] Rolling Alice: 2022 Progress Reports: August Activities Recap + +--- + +### How to Publish an Alice Overlay + +- Metadata + - Date: 2022-08-30 10:00 UTC -7 +- Docs we are following + - https://github.com/intel/dffml/blob/alice/entities/alice/CONTRIBUTING.rst + - https://github.com/intel/dffml/tree/alice/entities/alice#recommend-community-standards + - https://github.com/intel/dffml/blob/alice/docs/tutorials/rolling_alice/0001_coach_alice/0002_our_open_source_guide.md + +### How to Contribute an Alice Overlay + +- Metadata + - Date: 2022-08-30 10:00 UTC -7 + +### Rolling Alice: 2022 Progress Reports: August Activities Recap + +- 
Metadata + - Date: 2022-08-30 10:00 UTC -7 + +--- + +- Failed attempt to get class defined variables with `op` decorated functions defined in dataflow classes + - Can't inspect class currently being defined. Can't find the `NewType` references + +```diff +diff --git a/dffml/df/base.py b/dffml/df/base.py +index 4f84c1c7c8..df83d7b612 100644 +--- a/dffml/df/base.py ++++ b/dffml/df/base.py +@@ -345,7 +345,30 @@ def op( + + forward_refs_from_cls = None + if hasattr(func, "__qualname__") and "." in func.__qualname__: ++ ++ def stack_feedface(max_depth=4): ++ from pprint import pprint ++ # Grab stack frames ++ frames = inspect.stack() ++ for i, frame_info in enumerate(frames): ++ pprint(frame_info) ++ breakpoint() ++ continue ++ if max_depth != -1 and i >= max_depth: ++ break ++ if ( ++ frame_info.function == method_name ++ and "self" in frame_info.frame.f_locals ++ and frame_info.frame.f_locals["self"] is obj ++ ): ++ return True ++ return False ++ + # Attempt to lookup type definitions defined within class ++ if func.__qualname__.split(".")[0] == "OverlayCODEOWNERS": ++ stack_feedface() ++ breakpoint() ++ + forward_refs_from_cls = getattr( + sys.modules[func.__module__], + func.__qualname__.split(".")[0], +``` \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0016/index.md b/docs/discussions/alice_engineering_comms/0016/index.md new file mode 100644 index 0000000000..b079eabf10 --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0016/index.md @@ -0,0 +1,16 @@ +# 2022-09-01 Engineering Logs + +- Community + - “Heroes are not giant statues framed against a red sky. 
They are people who say this is my community, and it’s my responsibility to make it better.” [Oregon Governor Tom McCall] +- WebUI + - https://jsoncrack.com/editor + - We could leverage JSON Crack to provide easy editing of seed data + - Could fork and extend the JSON Crack project to add support for visualizing dataflows + - Previously when using react-flow (https://github.com/wbkd/react-flow) we had used mermaid's output SVG coordinates to find where to place nodes; we could probably just pull that code out of mermaid + - We could do something like the Intuitive and Accessible Documentation Editing GSoC 2022 project where we swap out the mermaid diagram for the extended version of the JSON Crack editor to make the operations in the nodes editable. This is helpful when using operations such as `run_dataflow()` which can have alternate inputs. For any operation defined as a class (`OperationImplementation`/`OperationImplementationContext`), the `run()` method of the context can take the inputs as a dictionary argument. 
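The coordinate-scraping idea above can be sketched roughly as follows. This is an illustrative sketch only: it assumes mermaid renders each node as an SVG `<g>` element carrying a `translate(x, y)` transform (the actual structure may vary across mermaid versions), and the node ids are invented:

```python
import re
import xml.etree.ElementTree as ET

# Hypothetical mermaid-style SVG output: each node is a <g> with a translate() transform.
SVG = """<svg xmlns="http://www.w3.org/2000/svg">
  <g id="flowchart-run_dataflow" transform="translate(40.5, 120)"><rect/></g>
  <g id="flowchart-group_by" transform="translate(200, 60.25)"><rect/></g>
</svg>"""

def node_positions(svg_text):
    """Map node id -> (x, y) pulled from each <g> element's translate() transform."""
    ns = "{http://www.w3.org/2000/svg}"
    positions = {}
    for g in ET.fromstring(svg_text).iter(f"{ns}g"):
        match = re.search(r"translate\(\s*([\d.]+)\s*,\s*([\d.]+)\s*\)", g.get("transform", ""))
        if match:
            positions[g.get("id")] = (float(match.group(1)), float(match.group(2)))
    return positions

# These (x, y) pairs could then seed react-flow / JSON Crack node placement.
print(node_positions(SVG))
```

The same extraction could run client-side in JS; Python is used here only to keep the sketch self-contained.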
+ +![image](https://user-images.githubusercontent.com/5950433/187969698-2d572d99-9f20-4618-b1bb-086add503f7e.png) + +![image](https://user-images.githubusercontent.com/5950433/187969864-3b38fcb4-de02-4e47-b57e-f8a62f0f8f11.png) + +![image](https://user-images.githubusercontent.com/5950433/187970084-ab027823-efce-4d42-8146-6b7caf12f328.png) \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0016/reply_0000.md b/docs/discussions/alice_engineering_comms/0016/reply_0000.md new file mode 100644 index 0000000000..f3422c25e3 --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0016/reply_0000.md @@ -0,0 +1,71 @@ +## 2022-09-01 @pdxjohnny Engineering Logs + +- Game plan + - [ ] `alice please contribute` + - [x] README + - [x] CONTRIBUTING + - [x] CODE_OF_CONDUCT + - https://www.youtube.com/watch?v=u2lGjMMIlAo&list=PLtzAOVTpO2ja6DXSCzoF3v_mQDh7l0ymH + - https://github.com/intel/dffml/commit/6c1719f9ec779a9d64bfb3b364e2c41c5ac9aab7 + - [ ] SECURITY + - [ ] SUPPORT + - [ ] .gitignore + - Dump files, add common ignores, collect all inputs derived from file name and of type `GitIgnoreLine` using `group_by` in output flow + - [ ] CITATION.cff + - https://docs.github.com/en/repositories/managing-your-repositorys-settings-and-features/customizing-your-repository/about-citation-files + - auto populate with 000 UUIDs + - [ ] CODEOWNERS + - https://docs.github.com/en/repositories/managing-your-repositorys-settings-and-features/customizing-your-repository/about-code-owners + - [ ] Demo on stream of how to write, install, and publish a third party overlay + - Have the overlay be a function which outputs a return type of `ContributingContents` and takes the name of the project given in a `CITATIONS.cff` file as another example for our open source guide. 
+ - https://docs.github.com/en/repositories/managing-your-repositorys-settings-and-features/customizing-your-repository/about-citation-files + - https://github.com/johnlwhiteman/living-threat-models/blob/c027d4e319c715adce104b95f1e88623e02b0949/CITATION.cff + - https://www.youtube.com/watch?v=TMlC_iAK3Rg&list=PLtzAOVTpO2jYt71umwc-ze6OmwwCIMnLw&index=5&t=2303 + - https://github.com/intel/dffml/blob/9aeb7f19e541e66fc945c931801215560a8206d7/entities/alice/alice/please/contribute/recommended_community_standards/contributing.py#L48-L54 + - [ ] Demo on stream of how to write, install, and contribute a 1st/2nd party overlay, the same code just not third party, from start to finish. + - CITATION.cff + - [ ] `alice shouldi contribute` + - [ ] Support caching / import / export dataflows + - [ ] Support query in easy way (graphql) + - [ ] Support joining with previous runs / more sets of data + - [ ] Contribute the data OpenSSF cares about to their DB via applicable joins and queries + - [ ] Email Christine and CRob +- TODO + - [ ] Organization + - [ ] Daily addition by Alice to engineering log following template + - [ ] Addition of old TODOs from yesterday's logs + - [ ] Export end state of input network / dump everything used by orchestrator + - [ ] pickle + - [ ] JSON + - [ ] Ensure import works (check for state reset in `__aenter__()`); we probably need a generic wrapper to save the memory ones which populates after the `__aenter__()` of the wrapped object. + - [ ] GraphQL query of cached state using the strawberry library or something like that + - [ ] Example docs for how to run a flow, then merge with static data as the start state for the cache and then query the whole bit with graphql +- TODO + - [ ] Sidestep failure to wrap with `@op` decorator on + - [ ] `with dffml.raiseretry():` around `gh` grabbing issue title + - Avoid potential resource not available yet after creation server side + - [ ] `try: ... 
except Exception as error: raise RetryOperationException from error` in `run` (above `run_no_retry()`) + - [ ] How to Publish an Alice Overlay + - [ ] How to Contribute an Alice Overlay + - [ ] Rolling Alice: 2022 Progress Reports: August Activities Recap + +--- + +### How to Publish an Alice Overlay + +- Metadata + - Date: 2022-08-30 10:00 UTC -7 +- Docs we are following + - https://github.com/intel/dffml/blob/alice/entities/alice/CONTRIBUTING.rst + - https://github.com/intel/dffml/tree/alice/entities/alice#recommend-community-standards + - https://github.com/intel/dffml/blob/alice/docs/tutorials/rolling_alice/0001_coach_alice/0002_our_open_source_guide.md + +### How to Contribute an Alice Overlay + +- Metadata + - Date: 2022-08-30 10:00 UTC -7 + +### Rolling Alice: 2022 Progress Reports: August Activities Recap + +- Metadata + - Date: 2022-08-30 10:00 UTC -7 \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0016/reply_0001.md b/docs/discussions/alice_engineering_comms/0016/reply_0001.md new file mode 100644 index 0000000000..8391288832 --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0016/reply_0001.md @@ -0,0 +1,10 @@ +## GSoC 2022: Intuitive and Accessible Documentation Editing: Checkpoint Meeting + +- https://github.com/intel/dffml/issues/1319 +- .gitpod.yml + - https://github.com/pfmoore/editables + - PEP 660 Fallout + - https://github.com/pfmoore/editables/issues/21 + - https://github.com/intel/dffml/issues/1412 + - Trying to `dffml service dev docs` with JS to do `localstorage` tricks + - Got gitpod env up and running and docs building and button auto adding on page load \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0017/index.md b/docs/discussions/alice_engineering_comms/0017/index.md new file mode 100644 index 0000000000..a028acf7e3 --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0017/index.md @@ -0,0 +1,27 @@ +# 2022-09-02 Engineering Logs + +- SCITT + - Explainer 
on IETF Supply Chain Integrity, Transparency, and Trust (SCITT) working group + - The proposed SCITT charter sets two goals: + - Standardize the overall security flows for securing a software supply chain, covering the essential building blocks that make up the architecture, and + - specify these building blocks, employing the existing work already done within other IETF WGs such as COSE WG, and IETF RATS WG, as appropriate. + - This is an example Use Case doc: https://github.com/ietf-scitt/use-cases/blob/main/hardware_microelectronics.md which might help as a quick primer to help understand what SCITT is about. + - Here is the draft SCITT charter for background: https://datatracker.ietf.org/doc/charter-ietf-scitt/ + - Here is the draft SCITT architecture: https://datatracker.ietf.org/doc/draft-birkholz-scitt-architecture/ + - Here is a recent mailing list email with more context: https://mailarchive.ietf.org/arch/msg/scitt/ZefYIxvkC_I-sgXETVoJeaYwFB4/ + - The charter has been currently scoped to software, but there are folks thinking about how it could be extended to other areas following implementation for software. + - We're looking at a combination of SCITT plus overlays for threat modeling and policy as we analyze and communicate data on the software lifecycle for the OpenSSF Identifying Security Threats / Metrics WGs. 
+ - Aligned use cases + - https://github.com/ietf-scitt/use-cases/issues/7 + - https://github.com/ietf-scitt/use-cases/issues/8 + - https://github.com/ietf-scitt/use-cases/issues/4 + - https://github.com/ietf-scitt/use-cases/issues/11 + - https://github.com/ietf-scitt/use-cases/issues/12 +- Completed v2 of Entity/System/Software Analysis Trinity + - [EntityAnalysisTrinity.drawio.xml](https://github.com/intel/dffml/files/9479846/EntityAnalysisTrinity.drawio.xml.txt) + - [EntityAnalysisTrinity.svg](https://user-images.githubusercontent.com/5950433/188203911-3586e1af-a1f6-434a-8a9a-a1795d7a7ca3.svg) + - [EntityAnalysisTrinity.jpg](https://user-images.githubusercontent.com/5950433/188203498-2d7a9f50-ba1b-41ad-84b4-90434d4d9240.jpg) + - [EntityAnalysisTrinity.png](https://user-images.githubusercontent.com/5950433/188203501-45e00b72-1d1e-4dc4-b3ca-3fd445369c8d.png) + - [EntityAnalysisTrinity.pdf](https://github.com/intel/dffml/files/9479847/EntityAnalysisTrinity.drawio.xml.txt.drawio.pdf) + +![EntityAnalysisTrinity drawio xml txt](https://user-images.githubusercontent.com/5950433/188203911-3586e1af-a1f6-434a-8a9a-a1795d7a7ca3.svg) diff --git a/docs/discussions/alice_engineering_comms/0017/reply_0000.md b/docs/discussions/alice_engineering_comms/0017/reply_0000.md new file mode 100644 index 0000000000..5d7cba2114 --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0017/reply_0000.md @@ -0,0 +1,247 @@ +## 2022-09-02 @pdxjohnny Engineering Logs + +- Game plan + - [ ] `alice please contribute` + - [x] README + - [x] CONTRIBUTING + - [x] CODE_OF_CONDUCT + - https://www.youtube.com/watch?v=u2lGjMMIlAo&list=PLtzAOVTpO2ja6DXSCzoF3v_mQDh7l0ymH + - https://github.com/intel/dffml/commit/6c1719f9ec779a9d64bfb3b364e2c41c5ac9aab7 + - [ ] SECURITY + - [ ] SUPPORT + - [ ] .gitignore + - Dump files add common ignores, collect all inputs derived from file name and of type `GitIgnoreLine` using `group_by` in output flow + - [ ] CITATION.cff + - 
https://docs.github.com/en/repositories/managing-your-repositorys-settings-and-features/customizing-your-repository/about-citation-files + - auto populate with 000 UUIDs + - [ ] CODEOWNERS + - https://docs.github.com/en/repositories/managing-your-repositorys-settings-and-features/customizing-your-repository/about-code-owners + - [ ] Demo on stream of how to write, install, and publish a third party overlay + - Have the overlay be a function which outputs a return type of `ContributingContents` and takes the name of the project given in a `CITATIONS.cff` file as another example for our open source guide. + - https://docs.github.com/en/repositories/managing-your-repositorys-settings-and-features/customizing-your-repository/about-citation-files + - https://github.com/johnlwhiteman/living-threat-models/blob/c027d4e319c715adce104b95f1e88623e02b0949/CITATION.cff + - https://www.youtube.com/watch?v=TMlC_iAK3Rg&list=PLtzAOVTpO2jYt71umwc-ze6OmwwCIMnLw&index=5&t=2303 + - https://github.com/intel/dffml/blob/9aeb7f19e541e66fc945c931801215560a8206d7/entities/alice/alice/please/contribute/recommended_community_standards/contributing.py#L48-L54 + - [ ] Demo on stream of how to write, install, and contribute a 1st/2nd party overlay, the same code just not third party, from start to finish. 
+ - CITATION.cff + - [ ] `alice shouldi contribute` + - [ ] Support caching / import / export dataflows + - [ ] Support query in easy way (graphql) + - [ ] Support joining with previous runs / more sets of data + - [ ] Contribute the data OpenSSF cares about to their DB via applicable joins and queries + - [ ] Email Christine and CRob +- TODO + - [ ] Organization + - [ ] Daily addition by Alice to engineering log following template + - [ ] Addition of old TODOs from yesterday's logs + - [ ] Export end state of input network / dump everything used by orchestrator + - [ ] pickle + - [ ] JSON + - [ ] Ensure import works (check for state reset in `__aenter__()`); we probably need a generic wrapper to save the memory ones which populates after the `__aenter__()` of the wrapped object. + - [ ] GraphQL query of cached state using the strawberry library or something like that + - [ ] Example docs for how to run a flow, then merge with static data as the start state for the cache and then query the whole bit with graphql +- TODO + - [ ] Sidestep failure to wrap with `@op` decorator on + - [ ] `with dffml.raiseretry():` around `gh` grabbing issue title + - Avoid potential resource not available yet after creation server side + - [ ] `try: ... 
except Exception as error: raise RetryOperationException from error` in `run` (above `run_no_retry()`) + - [ ] How to Publish an Alice Overlay + - [ ] How to Contribute an Alice Overlay + - [ ] Rolling Alice: 2022 Progress Reports: August Activities Recap + +--- + +### How to Publish an Alice Overlay + +- Metadata + - Date: 2022-08-30 10:00 UTC -7 +- Docs we are following + - https://github.com/intel/dffml/blob/alice/entities/alice/CONTRIBUTING.rst + - https://github.com/intel/dffml/tree/alice/entities/alice#recommend-community-standards + - https://github.com/intel/dffml/blob/alice/docs/tutorials/rolling_alice/0001_coach_alice/0002_our_open_source_guide.md + +### How to Contribute an Alice Overlay + +- Metadata + - Date: 2022-08-30 10:00 UTC -7 + +### Raise Retry from Exception for Problematic Operations + +- Metadata + - Date: 2022-09-02 11:20 UTC -7 +- `with dffml.raiseretry():` around `gh` grabbing issue title + - Avoid potential resource not available yet after creation server side +- `try: ... 
except Exception as error: raise RetryOperationException from error` in `run` (above `run_no_retry()`) + +```diff +diff --git a/dffml/df/base.py b/dffml/df/base.py +index 4f84c1c7c..b2d23a678 100644 +--- a/dffml/df/base.py ++++ b/dffml/df/base.py +@@ -15,11 +15,12 @@ from typing import ( + Union, + Optional, + Set, ++ ContextManager, + ) + from dataclasses import dataclass, is_dataclass, replace + from contextlib import asynccontextmanager + +-from .exceptions import NotOpImp ++from .exceptions import NotOpImp, RetryOperationException + from .types import ( + Operation, + Input, +@@ -94,6 +95,7 @@ class OperationImplementationContext(BaseDataFlowObjectContext): + self.parent = parent + self.ctx = ctx + self.octx = octx ++ self.op_retries = None + + @property + def config(self): +@@ -102,6 +104,31 @@ class OperationImplementationContext(BaseDataFlowObjectContext): + """ + return self.parent.config + ++ ++ @contextlib.contextmanager ++ def raiseretry(self, retries: int) -> ContextManager[None]: ++ """ ++ Use this context manager to have the orchestrator call the operation's ++ ``run()`` method multiple times within the same ++ OperationImplementationContext entry. ++ ++ Useful for ++ ++ TODO ++ ++ - Backoff ++ ++ >>> def myop(self): ++ ... with self.raiseretry(5): ++ ... if self.op_current_retry < 4: ++ ... 
raise Exception() ++ """ ++ try: ++ yield ++ except Exception as error: ++ raise RetryOperationException(retries) from error ++ ++ + @abc.abstractmethod + async def run(self, inputs: Dict[str, Any]) -> Union[bool, Dict[str, Any]]: + """ +diff --git a/dffml/df/exceptions.py b/dffml/df/exceptions.py +index b1f3bcc87..e185cf22c 100644 +--- a/dffml/df/exceptions.py ++++ b/dffml/df/exceptions.py +@@ -28,3 +28,8 @@ class ValidatorMissing(Exception): + + class MultipleAncestorsFoundError(NotImplementedError): + pass ++ ++ ++class RetryOperationException(Exception): ++ def __init__(self, retries: int) -> None: ++ self.retries = retries +diff --git a/dffml/df/memory.py b/dffml/df/memory.py +index 59286d492..ca0a77cc6 100644 +--- a/dffml/df/memory.py ++++ b/dffml/df/memory.py +@@ -26,6 +26,7 @@ from .exceptions import ( + DefinitionNotInContext, + ValidatorMissing, + MultipleAncestorsFoundError, ++ RetryOperationException, + ) + from .types import ( + Input, +@@ -1187,6 +1188,7 @@ class MemoryOperationImplementationNetworkContext( + ctx: BaseInputSetContext, + octx: BaseOrchestratorContext, + operation: Operation, ++ opctx: OperationImplementationContext, + inputs: Dict[str, Any], + ) -> Union[bool, Dict[str, Any]]: + """ +@@ -1195,9 +1197,7 @@ class MemoryOperationImplementationNetworkContext( + # Check that our network contains the operation + await self.ensure_contains(operation) + # Create an opimp context and run the operation +- async with self.operations[operation.instance_name]( +- ctx, octx +- ) as opctx: ++ with contextlib.nullcontext(): + self.logger.debug("---") + self.logger.debug( + "%s Stage: %s: %s", +@@ -1248,22 +1248,28 @@ class MemoryOperationImplementationNetworkContext( + """ + Run an operation in our network. 
+ """ +- if not operation.retry: +- return await self.run_no_retry(ctx, octx, operation, inputs) +- for retry in range(0, operation.retry): +- try: +- return await self.run_no_retry(ctx, octx, operation, inputs) +- except Exception: +- # Raise if no more tries left +- if (retry + 1) == operation.retry: +- raise +- # Otherwise if there was an exception log it +- self.logger.error( +- "%r: try %d: %s", +- operation.instance_name, +- retry + 1, +- traceback.format_exc().rstrip(), +- ) ++ async with self.operations[operation.instance_name]( ++ ctx, octx ++ ) as opctx: ++ opctx.retries = operation.retry ++ for retry in range(0, operation.retry): ++ try: ++ return await self.run_no_retry(ctx, octx, operation, opctx, inputs) ++ except Exception: ++ if isinstance(error, RetryOperationException): ++ retries = error.retries ++ if not retries ++ raise ++ # Raise if no more tries left ++ if (retry + 1) == retries: ++ raise ++ # Otherwise if there was an exception log it ++ self.logger.error( ++ "%r: try %d: %s", ++ operation.instance_name, ++ retry + 1, ++ traceback.format_exc().rstrip(), ++ ) + + async def operation_completed(self): + await self.completed_event.wait() +diff --git a/entities/alice/alice/please/contribute/recommended_community_standards/readme.py b/entities/alice/alice/please/contribute/recommended_community_standards/readme.py +index 437601358..836d8f175 100644 +--- a/entities/alice/alice/please/contribute/recommended_community_standards/readme.py ++++ b/entities/alice/alice/please/contribute/recommended_community_standards/readme.py +@@ -183,10 +183,11 @@ class OverlayREADME: + """ + Use the issue title as the pull request title + """ +- async for event, result in dffml.run_command_events( +- ["gh", "issue", "view", "--json", "title", "-q", ".title", readme_issue,], +- logger=self.logger, +- events=[dffml.Subprocess.STDOUT], +- ): +- if event is dffml.Subprocess.STDOUT: +- return result.strip().decode() ++ with self.raiseretry(5): ++ async for event, result 
in dffml.run_command_events( ++ ["gh", "issue", "view", "--json", "title", "-q", ".title", readme_issue,], ++ logger=self.logger, ++ events=[dffml.Subprocess.STDOUT], ++ ): ++ if event is dffml.Subprocess.STDOUT: ++ return result.strip().decode() +``` \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0017/reply_0001.md b/docs/discussions/alice_engineering_comms/0017/reply_0001.md new file mode 100644 index 0000000000..d404a5d5a1 --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0017/reply_0001.md @@ -0,0 +1 @@ +https://www.cnn.com/2022/09/03/tech/ai-art-fair-winner-controversy/index.html \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0018/index.md b/docs/discussions/alice_engineering_comms/0018/index.md new file mode 100644 index 0000000000..4893d1acf8 --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0018/index.md @@ -0,0 +1,2 @@ +- TODO + - Message Alice on Signal to add to this thread \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0019/index.md b/docs/discussions/alice_engineering_comms/0019/index.md new file mode 100644 index 0000000000..9340256709 --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0019/index.md @@ -0,0 +1,8 @@ +# 2022-09-06 Engineering Logs + +- References + - https://madebyoll.in/posts/game_emulation_via_dnn/ + - https://e2eml.school/transformers.html + - Thought: context aware Markov + - https://ieeexplore.ieee.org/document/9540871 + - https://twitter.com/konstinx/status/1567036083862396932 \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0019/reply_0000.md b/docs/discussions/alice_engineering_comms/0019/reply_0000.md new file mode 100644 index 0000000000..7581f5f99c --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0019/reply_0000.md @@ -0,0 +1,41 @@ +## 2022-09-06 @pdxjohnny Engineering Logs + +- User reports need to bypass the validation on insert of each record 
to mongodb source. + - https://www.mongodb.com/docs/manual/core/schema-validation/bypass-document-validation/ + - > To bypass the validation rules and insert the invalid document, run the following `insert` command, which sets the `bypassDocumentValidation` option to `true`: + > ```javascript + > db.runCommand( { + > insert: "students", + > documents: [ + > { + > name: "Alice", + > year: Int32( 2016 ), + > major: "History", + > gpa: Double(3.0), + > address: { + > city: "NYC", + > street: "33rd Street" + > } + > } + > ], + > bypassDocumentValidation: true + > } ) + > ``` +- References + - https://duckduckgo.com/?q=validation+level+mongodb&t=canonical&ia=web + - https://www.mongodb.com/docs/compass/current/validation/ + - https://www.mongodb.com/docs/manual/core/schema-validation/ + - https://www.mongodb.com/docs/manual/core/schema-validation/specify-validation-level/#std-label-schema-specify-validation-level + - https://www.mongodb.com/docs/manual/core/schema-validation/bypass-document-validation/ +- Updating `MongoDBSource` +- References + - https://duckduckgo.com/?q=motor+mongo+asyncio+bypassDocumentValidation&t=canonical&ia=web + - https://motor.readthedocs.io/en/stable/tutorial-asyncio.html#inserting-a-document + - https://motor.readthedocs.io/en/stable/api-asyncio/asyncio_motor_collection.html#motor.motor_asyncio.AsyncIOMotorCollection.insert_one + - > *bypass_document_validation* requires server version **>= 3.2** + - *bypass_document_validation*: (optional) If `True`, allows the write to opt-out of document level validation. Default is `False`. + - https://github.com/intel/dffml/blob/7627341b66f6209b85ea4ae74e3fb4159d125d30/source/mongodb/dffml_source_mongodb/source.py#L32-L39 + - https://motor.readthedocs.io/en/stable/api-asyncio/asyncio_motor_collection.html#motor.motor_asyncio.AsyncIOMotorCollection.replace_one +- TODO + - [ ] Docs on open source async-first development model in a way which is a quick onramp to the fully connected development model. 
+ - [ ] Allow for user to bypass the validation on insert of each record to mongodb source. \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0019/reply_0001.md b/docs/discussions/alice_engineering_comms/0019/reply_0001.md new file mode 100644 index 0000000000..3fc817d97f --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0019/reply_0001.md @@ -0,0 +1,3 @@ +## GSoC 2022: Intuitive and Accessible Documentation Editing: Meeting + +- https://github.com/intel/dffml/issues/1392 \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0019/reply_0002.md b/docs/discussions/alice_engineering_comms/0019/reply_0002.md new file mode 100644 index 0000000000..752668ca33 --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0019/reply_0002.md @@ -0,0 +1 @@ +Recompute / reprioritize / associate higher priority with Markov chains regenerated from most recently applicable context \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0020/index.md b/docs/discussions/alice_engineering_comms/0020/index.md new file mode 100644 index 0000000000..e4e76d588f --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0020/index.md @@ -0,0 +1 @@ +# 2022-09-07 Engineering Logs \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0020/reply_0000.md b/docs/discussions/alice_engineering_comms/0020/reply_0000.md new file mode 100644 index 0000000000..3d5706cb0a --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0020/reply_0000.md @@ -0,0 +1,16 @@ +## 2022-09-07 @pdxjohnny Engineering Logs + +- Update Trinity to v3: Fix direction of short loop arrows + - [EntityAnalysisTrinity.svg](https://user-images.githubusercontent.com/5950433/188937161-f107af83-50dd-4deb-a951-1aebf9762a31.svg) + - [EntityAnalysisTrinity.jpg](https://user-images.githubusercontent.com/5950433/188937164-88bd4773-bc37-4c28-ba01-945b6c729f42.jpg) + - 
[EntityAnalysisTrinity.pdf](https://github.com/intel/dffml/files/9508224/EntityAnalysisTrinity.drawio.xml.txt.drawio.pdf) + - [EntityAnalysisTrinity.png](https://user-images.githubusercontent.com/5950433/188937146-876ada14-60fd-41d6-953b-652099168a22.png) + - [EntityAnalysisTrinity.drawio.xml](https://github.com/intel/dffml/files/9508223/EntityAnalysisTrinity.drawio.xml.txt) + +![EntityAnalysisTrinity.svg](https://user-images.githubusercontent.com/5950433/188937161-f107af83-50dd-4deb-a951-1aebf9762a31.svg) + +- All information will be taggable + - Not all information will be tagged + - We are adding links, like a giant version of Wikipedia +- TODO + - [ ] Deduplicate docs code as we unify operations, data flows, and classes who no longer need separate config dumping code now that everything hooks into the `typing` system. \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0020/reply_0001.md b/docs/discussions/alice_engineering_comms/0020/reply_0001.md new file mode 100644 index 0000000000..9fce90ee7f --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0020/reply_0001.md @@ -0,0 +1,30 @@ +## SCITT Reference Implementation + +- Goal + - Example graph for one use case +- Search engines auto query RDF JSONLD + - VCs are in RDF by default so you get the graph for free +- Kiran from Microsoft, hardware background +- Orie from Transmute +- IETF goals are to define building blocks and keep it generic +- It makes sense to have a reference implementation + - What level do we want? 
+ - Toy + - Hosted + - Ecosystem +- Let's build code along with the spec +- SCITT building blocks are so far out from the standardisation process +- Fundamentally supply chain is about pieces that interact + - Best thing we can do is workshop +- Transmute is implementing examples to show SCITT will work for hardware as well + - Orie will have some use cases which will have payloads which will have claims which might be SBOMs + - This way we both mention how SBOM would be a node in the graph so it helps us work out common use cases +- If we had these claims, what kinds of questions could we answer? +- How is an issuer related to a software artifact, related to a CVE, in a couple example payload formats +- Intent to define example payloads and places to collect them + - Let's have the conversation on the mailing list + - Feedback may be that payload specifics are out of scope for the work + - We still want to talk about what kind of knowledge we want to represent with these opaque payloads + - We can start an OpenSSF Use Case doc + - https://github.com/ietf-scitt/use-cases/issues/14 + - John to send out email to mailing list and add the ID sec threats group, with Mike on To. 
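To make the "what kinds of questions could we answer" point concrete, here is a toy sketch. The field names and identifiers below are invented for illustration and are not taken from the SCITT drafts; real claims would be signed COSE envelopes, not plain dicts:

```python
# Toy claim records linking issuer -> artifact and artifact -> CVE.
# The real SCITT payload formats are still under discussion.
claims = [
    {"issuer": "did:example:vendor", "subject": "pkg:pypi/alice@0.0.1", "type": "sbom"},
    {"issuer": "did:example:scanner", "subject": "pkg:pypi/alice@0.0.1",
     "type": "vuln", "cve": "CVE-2022-0000"},
]

def cves_for(artifact):
    """Which CVEs have been claimed against an artifact, and by whom?"""
    return [(claim["issuer"], claim["cve"]) for claim in claims
            if claim["subject"] == artifact and claim["type"] == "vuln"]

print(cves_for("pkg:pypi/alice@0.0.1"))
```

Even this flat list supports the issuer/artifact/CVE traversal discussed above; a graph store or RDF/JSON-LD would generalize the same query.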
\ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0021/index.md b/docs/discussions/alice_engineering_comms/0021/index.md new file mode 100644 index 0000000000..84b726a1d1 --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0021/index.md @@ -0,0 +1,3 @@ +# 2022-09-08 Engineering Logs + +- https://github.com/Wilfred/difftastic \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0021/reply_0000.md b/docs/discussions/alice_engineering_comms/0021/reply_0000.md new file mode 100644 index 0000000000..5a62901962 --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0021/reply_0000.md @@ -0,0 +1,33 @@ +## 2022-09-08 @pdxjohnny Engineering Logs + +- The Entity Analysis Trinity can help us conceptualize how to manifest realities via its lifecycle feedback loop over time. + - https://twitter.com/ParissAthena/status/1567690882865926144 +- https://cwarny.medium.com/an-illustration-of-software-2-0-3937f620cea1 + - Rajesh and I talked about how Alice is a hybrid of (what is called in the referenced blog post) "software 1.0" and "software 2.0". + - Alice is a hybrid of software 1.0 and 2.0. We leverage the Open Architecture and Living Threat Models to apply context aware policy to both paradigms. + - It’s important to do depth of field research so that one can understand discourse within a community + - interacting with open source communities explainer? +- https://twitter.com/lorenc_dan/status/1567874273913585665 + - Came across Dan's tweet + - Reminded me of: https://github.com/intel/dffml/issues/1315#issuecomment-1066814280 + - ![Anarchy Elmo Says “Chaos for the Chaos God”](https://user-images.githubusercontent.com/5950433/189168046-a20c0973-b49f-41be-82b5-a66ef53f853d.jpeg) + - Interns having root may be a CISO’s nightmare but it’s Alice’s dream. A learning wonderland.
+ - Wondered who the chaos god is so did a search + - The God of Chaos is considered the one God + - https://greekgodsandgoddesses.net/gods/chaos/ + - The Hebrew God, also known as the God of knowledge, is also considered the one God + - 110fbeeed4580b05144deea8f2fdbb6793b7f7be +- Finally reading the Alice (#1369) discussion thread again, first pass since writing it + - This is what I mean when I say "read the thread": + - `git log --reverse -p --oneline -- docs/arch/alice/discussion/` + - c6a0dafeae527c5e102abd3ee69189cdfb5e9450 + - First mention of the system context came almost immediately, although it wasn't until 2148e16f11a5b5941f19353924ca92e497f81b2a we realized we'd found it + - 3c26ea48b + - > With A/B field testing of new system contexts (changes, running dev branches against dev branches). We start to see a complete picture of the fully connected dev model. We can proactively pair changes from one system context with another system context, both overlayed over a base system context. This is when you have two devs working on two feature branches and both have active PRs. They can now effectively work together because they have this translation, this transparent overlay of their respective diffs to the upstream system context (data flow or program flow in this example).
+ - https://github.com/intel/dffml/blob/3c26ea48b9d3b66648ef3d676fd015ce171a8761/docs/arch/alice/discussion/0035/reply_0010.md + - Hmmm, we may have stumbled onto the start of the OpenSSF use case doc + - Hey, `git grep` is our friend, let's look for anything talking about CVEs, VEX, vulns, and see if we can scrape together a skeleton use case doc for https://github.com/ietf-scitt/use-cases/issues/14 + - A deal is made: 361555718b5ad589a9430efbd0ed88e7bc0582c3 & 4ef226e2ecd384560d635fa84036003b525ad399 + - Software supply chain + - https://github.com/intel/dffml/blob/alice/docs/arch/alice/discussion/0036/reply_0062.md + - \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0022/index.md b/docs/discussions/alice_engineering_comms/0022/index.md new file mode 100644 index 0000000000..5bdcd555b3 --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0022/index.md @@ -0,0 +1 @@ +# 2022-09-09 Engineering Logs \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0022/reply_0000.md b/docs/discussions/alice_engineering_comms/0022/reply_0000.md new file mode 100644 index 0000000000..05880eeb50 --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0022/reply_0000.md @@ -0,0 +1,4 @@ +## 2022-09-09 @pdxjohnny Engineering Logs + +- https://nightingaledvs.com/how-to-visualize-a-graph-with-a-million-nodes/ +- \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0022/reply_0001.md b/docs/discussions/alice_engineering_comms/0022/reply_0001.md new file mode 100644 index 0000000000..1d66eae589 --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0022/reply_0001.md @@ -0,0 +1,14 @@ +- Manifest Schema docs reference addition + - https://medium.com/mcdonalds-technical-blog/mcdonalds-event-driven-architecture-the-data-journey-and-how-it-works-4591d108821f +- Graph million nodes + - https://nightingaledvs.com/how-to-visualize-a-graph-with-a-million-nodes/ + - https://cosmograph.app/ 
+- How to choose which data visualization to display / generate for slide decks / presentations + - > Data Visualization Types + - https://www.tapclicks.com/resources/blog/data-visualization-types/ + +![FE0CBB03-CF41-4C24-B281-97A7419DB540](https://user-images.githubusercontent.com/5950433/189486866-014dd24a-5f7a-4370-9fbd-d476231fd558.jpeg) + +- A win for shouldi deptree + - > Use data-dist-info-metadata (PEP 658) to decouple resolution from downloading + - https://github.com/pypa/pip/pull/11111 \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0022/reply_0002.md b/docs/discussions/alice_engineering_comms/0022/reply_0002.md new file mode 100644 index 0000000000..dc022fd89e --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0022/reply_0002.md @@ -0,0 +1,5 @@ +- Great talk from Brooklyn on Edge and Web5 + - https://youtu.be/a6fvZA0L-ok +- Good overview of k8s +- https://huggingface.co/bigscience/bloom + - GPT-3 \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0023/index.md b/docs/discussions/alice_engineering_comms/0023/index.md new file mode 100644 index 0000000000..d8263ee986 --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0023/index.md @@ -0,0 +1 @@ +2 \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0023/reply_0000.md b/docs/discussions/alice_engineering_comms/0023/reply_0000.md new file mode 100644 index 0000000000..f7f8b5bf0f --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0023/reply_0000.md @@ -0,0 +1,2 @@ +- L34 through 6 +- L229 \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0024/index.md b/docs/discussions/alice_engineering_comms/0024/index.md new file mode 100644 index 0000000000..83461af0db --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0024/index.md @@ -0,0 +1 @@ +# 2022-09-12 Engineering Logs \ No newline at end of file diff --git 
a/docs/discussions/alice_engineering_comms/0024/reply_0000.md b/docs/discussions/alice_engineering_comms/0024/reply_0000.md new file mode 100644 index 0000000000..f177ec5bbc --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0024/reply_0000.md @@ -0,0 +1,12 @@ +## 2022-09-12 @pdxjohnny Engineering Logs + +- https://github.com/kubernetes-sigs/image-builder + - https://github.com/imjasonh/kontain.me + - https://github.com/imjasonh/kontain.me/blob/main/pkg/serve/serve.go + - secrets in last layer for k8s orch +- https://twitter.com/pchaigno/status/1439965320056344577?s=20&t=snDh0RTRB1FYmv2AEeIuWQ +- TODO + - [ ] DataFlow execution within linux loader to do attestation to secret service and set in env before execing `__start` + - configure NFS then mount as volume via preapply. Use this to cache cloned repos and execute pull instead of clone to resolve deltas for iterative scanning over time. + - subflow reuse ictx output operation grab inputs with definitions which are descendants of STATIC and CACHED and NFS (eventually NFS and kubernetes stuff should be overlays) + - Threaded execution of sets of contexts \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0025/index.md b/docs/discussions/alice_engineering_comms/0025/index.md new file mode 100644 index 0000000000..31e6d38407 --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0025/index.md @@ -0,0 +1,14 @@ +# 2022-09-13 Engineering Logs + +- GSoC 2022 + - https://summerofcode.withgoogle.com/organizations/python-software-foundation/projects/details/4tE547Oz + - https://summerofcode.withgoogle.com/organizations/python-software-foundation/projects/details/gNdNxmFb +- OpenSSF + - SBOM Everywhere + - https://github.com/ossf/sbom-everywhere/issues/12 + - https://docs.google.com/document/d/1iCL7NOSxIc7YpVI2NRANIy46pM-02G_WlPexQqqb2R0/edit + - > - Level 1: clients and SDKs — Operating system and build system-agnostic command line interpreters (CLIs) that can process source and
build output artifacts / as well as process operating system and other dependencies. That output a compliant SBOM that includes the necessary data that addresses all use cases. These tools should be able to be run in a manual or automated (e.g., scripted) fashion as part of an end-to-end CI/CD workflow. These tools will include SDKs that developers can use to customize and extend any base tools, for instance to support additional package managers. + > - Level 2: package manager plugins — a set of plugins or modules that work natively with the major package managers and repositories such as Maven, npm, and PyPI. These tools will typically require a single line configuration change added in order to run with each subsequent build and will output compliant SBOMs. This work will enhance the best existing open source plugins where they exist. + > - Level 3: native package manager integration — by adding native SBOM generation functionality to major package managers, all developers and all build systems will automatically generate SBOMs by default as part of their normal workflow. SBOM generation will become as common and seamless as tooling creating log entries for software builds in a log file behind the scenes. + > - Level 4: containerization integration — by adding native SBOM generation functionality to the containerization build process, the system will use SBOM content provided by included packages plus additional artifacts added during container build to output an SBOM that specifies all the components that make up a container. + > - Level 5: application/solution integration/deployment — When deploying an application consisting of multiple disparate components (containers, machine images, event driven services) the coordination manager should aggregate the constituent SBOMS to reflect all artifacts that are deployed. 
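For a sense of the artifact these levels all converge on, a minimal SPDX-flavored SBOM document can be sketched as plain JSON. The field values below are hypothetical; real generators (the Level 1 CLIs and up) would derive them from actual source and build output:

```python
import json

def minimal_sbom(name, version, dependencies):
    """Sketch of a minimal SPDX-flavored SBOM document.
    Field values are illustrative; real tooling derives them from build artifacts."""
    return {
        "spdxVersion": "SPDX-2.3",
        "name": name,
        "packages": [
            {"name": name, "versionInfo": version},
            *({"name": dep_name, "versionInfo": dep_version}
              for dep_name, dep_version in dependencies),
        ],
    }

# Hypothetical package and dependency list
doc = minimal_sbom("dffml", "0.4.0", [("requests", "2.28.1")])
print(json.dumps(doc, indent=2))
```

The higher levels (package manager plugins, native integration, container build, deployment aggregation) change where this document is produced and merged, not its basic shape.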
\ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0025/reply_0000.md b/docs/discussions/alice_engineering_comms/0025/reply_0000.md new file mode 100644 index 0000000000..ba6f0b7c89 --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0025/reply_0000.md @@ -0,0 +1,50 @@ +## 2022-09-13 @pdxjohnny Engineering Logs + +```console +$ dffml service dev export alice.cli:ALICE_COLLECTOR_DATAFLOW | tee alice_collector_dataflow.json +$ (date; (echo URL && sed -e 's/^.*/https:\/\/github.com\/dffml\/&/' org-repo-list | head -n 1) | dffml dataflow run records all -no-echo -record-def URL -dataflow alice_collector_dataflow.json -sources src=csv dst=mongodb -source-src-filename /dev/stdin -source-src-key URL -source-dst-uri "${DATABASE_CONNECTION_STRING}" -source-dst-tlsInsecure -source-dst-log_collection_names -source-dst-collection mycollection -orchestrator kubernetes.job -orchestrator-workdir . -log debug -no-strict -orchestrator-max_ctxs 25 -orchestrator-image docker.io/intel-otc/dffml:latest 2>&1; date) | tee ~/alice-shouldi-contribute-mycollection-$(date +%4Y-%m-%d-%H-%M).txt +... 
+DEBUG:dffml.JobKubernetesOrchestratorContext:context_path.stat().st_size: 60876856 +DEBUG:dffml.JobKubernetesOrchestratorContext:dffml_path.stat().st_size: 157628 +ERROR:dffml.JobKubernetesOrchestratorContext:Traceback for exception=RuntimeError('[\'kubectl\', \'--context\', \'kind-kind\', \'apply\', \'-o=json\', \'-k\', \'.\']: Error from server: error when creating ".": the server responded with the status code 413 but did not return more information (post secrets)\n')> (most recent call last): + File "/src/dffml/dffml/df/kubernetes.py", line 780, in run_operations_for_ctx + raise Exception( + File "/src/dffml/dffml/util/subprocess.py", line 140, in run_command + pass + File "/src/dffml/dffml/util/subprocess.py", line 83, in run_command_events + raise RuntimeError( +RuntimeError: ['kubectl', '--context', 'kind-kind', 'apply', '-o=json', '-k', '.']: Error from server: error when creating ".": the server responded with the status code 413 but did not return more information (post secrets) +Traceback (most recent call last): + File "/home/coder/.local/bin/dffml", line 33, in <module> + sys.exit(load_entry_point('dffml', 'console_scripts', 'dffml')()) + File "/src/dffml/dffml/util/cli/cmd.py", line 282, in main + result = loop.run_until_complete(cls._main(*argv[1:])) + File "/.pyenv/versions/3.9.13/lib/python3.9/asyncio/base_events.py", line 647, in run_until_complete + return future.result() + File "/src/dffml/dffml/util/cli/cmd.py", line 248, in _main + return await cls.cli(*args) + File "/src/dffml/dffml/util/cli/cmd.py", line 234, in cli + return await cmd.do_run() + File "/src/dffml/dffml/util/cli/cmd.py", line 211, in do_run + return [res async for res in self.run()] + File "/src/dffml/dffml/util/cli/cmd.py", line 211, in <listcomp> + return [res async for res in self.run()] + File "/src/dffml/dffml/cli/dataflow.py", line 283, in run + async for record in self.run_dataflow( + File "/src/dffml/dffml/cli/dataflow.py", line 268, in run_dataflow + async for ctx, results in octx.run(
+ File "/src/dffml/dffml/df/memory.py", line 1721, in run + task.result() + File "/src/dffml/dffml/df/kubernetes.py", line 355, in run_operations_for_ctx + await run_command( + File "/src/dffml/dffml/util/subprocess.py", line 137, in run_command + async for _, _ in run_command_events( + File "/src/dffml/dffml/util/subprocess.py", line 83, in run_command_events + raise RuntimeError( +RuntimeError: ['kubectl', '--context', 'kind-kind', 'apply', '-o=json', '-k', '.']: Error from server: error when creating ".": the server responded with the status code 413 but did not return more information (post sec +``` + +- TODO + - [ ] Update Job based Kubernetes Orchestrator to add a note that sometimes a `preapply` is needed to set the limits (required to be set by the namespace?) + - https://github.com/intel/dffml/blob/3e157b391ffc36b6073288d0fe7a21a6a82b55a4/dffml/df/kubernetes.py#L1048-L1108 \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0026/index.md b/docs/discussions/alice_engineering_comms/0026/index.md new file mode 100644 index 0000000000..49d1d1c172 --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0026/index.md @@ -0,0 +1,8 @@ +# 2022-09-14 Engineering Logs + +Input network which resolves or synthesizes the pipeline orchestrator specific workflow/job to run a data flow, effectively using the workflow/job syntax as a trampoline back into dataflow; pull orchestrator secrets as applicable + +```console +$ echo -e 'if [[ "x${RUN_ME}" != "x" ]]; then\n ${RUN_ME}\nfi' | RUN_ME='echo hi' bash +hi +``` \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0026/reply_0000.md b/docs/discussions/alice_engineering_comms/0026/reply_0000.md new file mode 100644 index 0000000000..79d34e93c4 --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0026/reply_0000.md @@ -0,0 +1,21 @@ +- Cattle not pets with state + - Reaching equilibrium with Alice assisted communication faster to bring new nodes into correct place,
similar to Graph Neural Network group drone flight work. + +![821D10AA-B705-4667-9F99-98C231BD58A9](https://user-images.githubusercontent.com/5950433/190293910-85bd0d08-0461-400f-8258-16ee161e2a2f.jpeg) + +- shim used with synthesis to manifest ingesting job with matrix to trampoline via orchestrator specific call to index job +- People always have [“right of way”](https://en.m.wikipedia.org/wiki/International_Regulations_for_Preventing_Collisions_at_Sea#Part_B_.E2.80.93_Steering_and_sailing) over machines (example: cars) +- Blames on this file to the graph with authors so we know who's the most recent point of contact, like kernel CC, for querying to ask for help (survey) +- How to run on a tmux / ssh entry to shell +- References + - https://www.baeldung.com/linux/remove-last-n-lines-of-file + +```console +$ echo -e 'if [[ "x${RUN_ME}" != "x" ]]; then\n ${RUN_ME}\nfi' >> ~/.bashrc +$ sed -i "$(( $(wc -l <~/.bashrc)-3+1 )),$ d" ~/.bashrc +$ diff ~/.bashrc ~/.bashrc.bak +173a174,176 +> if [[ "x${RUN_ME}" != "x" ]]; then +> ${RUN_ME} +> fi +``` \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0026/reply_0001.md b/docs/discussions/alice_engineering_comms/0026/reply_0001.md new file mode 100644 index 0000000000..dfe29f78cd --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0026/reply_0001.md @@ -0,0 +1,56 @@ +# Architecting Alice: A Shared Stream of Consciousness + +> Moved to: https://github.com/intel/dffml/blob/alice/docs/tutorials/rolling_alice/0000_architecting_alice/0005_stream_of_consciousness.md + +In this tutorial we use dataflow as class to build Input, Redundancy, +Lock, Operation, and Operation Implementation Networks which interact +with ActiveMQ and Redis. These will enable us to overlay them on +Alice's process local memory resident implementations to facilitate +a shared stream of consciousness.
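The shared stream idea in the intro above can be approximated process-locally with a toy broadcast stream. This sketch uses one `asyncio` queue per subscriber as a stand-in for the ActiveMQ/Redis backed Input Network implementations; the class and function names are hypothetical, not dffml APIs:

```python
import asyncio

class SharedStream:
    """Toy broadcast stream: every subscriber sees every message.
    Stand-in for an ActiveMQ/Redis backed shared input network."""
    def __init__(self):
        self.subscribers = []

    def subscribe(self):
        queue = asyncio.Queue()
        self.subscribers.append(queue)
        return queue

    async def publish(self, message):
        # Fan the message out to every subscriber's inbox
        for queue in self.subscribers:
            await queue.put(message)

async def entity(name, stream, inbox, expected):
    # Announce presence, then listen until we've heard everyone come online
    await stream.publish(f"{name} Online")
    heard = []
    while len(heard) < expected:
        heard.append(await inbox.get())
    return heard

async def main():
    stream = SharedStream()
    alice_inbox = stream.subscribe()
    bob_inbox = stream.subscribe()
    return await asyncio.gather(
        entity("Alice", stream, alice_inbox, 2),
        entity("Bob", stream, bob_inbox, 2),
    )

print(asyncio.run(main()))
```

Swapping the in-memory queue for a message broker is what lets the same pattern span two processes, which is what the tutorial goes on to show.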
+ +We then show how two different instances of Alice can communicate where +some operation implementations are instantiated in one process space, +and some in another. We'll then watch them run a simple flow which +prints the messages "Alice Online" and "Bob Online" on each side. + +```mermaid +graph TD + developer_online --> notify_dev_online +``` + +```python +import getpass +from typing import NewType + +# Stand-in definition; in the tutorial this comes from the dataflow type system +DeveloperOnlineName = NewType("DeveloperOnlineName", str) + +def developer_online() -> DeveloperOnlineName: + return getpass.getuser() + +def notify_dev_online(developer: DeveloperOnlineName): + print(f"{developer.title()} Online") +``` + +Later in Architecting Alice, we'll add in rekor to get data +provenance and put the whole bit behind an HTTP API. We validate data +using SCITT. We could optionally require passes from filter operations. +Could add in more mixins to rekor to check on addition. + +In Coach Alice, we'll see these techniques used to support caching of +complex inputs such as directory trees (creating new inputs on load +by inspecting cached state overlayed). Our work with the OpenSSF +means that we'll want to be scanning lots of VCS (git, svn, etc.) repos. +We'll use this to cache those repos and restore repos from cached state, +then run an update for the delta, then save back to cache. This way +we can avoid running the full download for larger repos. For small repos +we can examine past runs to estimate size and just clone every time +to avoid resource usage of caching. This will build on our Architecting Alice Webhook Based Distributed Compute leveraging Jenkins (~~if rate limit for github doesn't apply to workflow dispatch then build off that~~ https://docs.github.com/en/actions/learn-github-actions/usage-limits-billing-and-administration#usage-limits) and the Manifest concept. + +In Coach Alice we'll also see how we can use this distributed stream +of consciousness to assist with developer communication. We can enable +developers to give Alice dataflows which she runs in the background.
+She can then say oh the dev API server restarted (maybe it's yours or +your friend's laptop running the API, or a real server). This gives +the same impact for both users, a little `notify-send` popup. + +- References + - https://activemq.apache.org/python + - For Python support we recommend the [Python Stomp Client](http://stomp.github.com/implementations.html) + - https://stomp.github.io/implementations.html +- Future + - Notify on diff to discussion thread or git repo with upleveling \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0027/index.md b/docs/discussions/alice_engineering_comms/0027/index.md new file mode 100644 index 0000000000..e931e115e8 --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0027/index.md @@ -0,0 +1 @@ +# 2022-09-15 Engineering Logs \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0027/reply_0000.md b/docs/discussions/alice_engineering_comms/0027/reply_0000.md new file mode 100644 index 0000000000..b90bc0f0e6 --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0027/reply_0000.md @@ -0,0 +1,39 @@ +## 2022-09-15 Open Architecture + +- OA: SCITT for provenance (SPDX DAG for DAG?) plus overlayed (aka generic admission controller, return 0/1) policy. Use example from yesterday, pseudocode release flow with checks to SCITT as if it was a BOM/database being added to as the product is built. Come up with places where policy is relevant: incoming vuln, package, sign, release (don't sign unless X, don't release unless Y, new vuln?
Run policy check to determine if it affects your arch, take actions (re-roll with updated dep) accordingly +- Realized SCITT will probably still not define the graph + - Looking for the SPDX DAG work or anything like it: https://www.google.com/search?hl=en&q=spdx%20%22dag%22&tbs=qdr%3Am +- References + - https://github.com/git-bom/gitbom-rs/issues/18 + - > There was a discussion in today's GitBOM meeting about the utility of separating generation of gitoids from the generation of a GitBOM DAG. (@)edwarnicke has implemented this split in Go (https://github.com/edwarnicke/gitoid) (WIP) and described it as being a valuable change. The idea is that by splitting this out, other uses of gitoids can be explored. + - https://github.com/edwarnicke/gitoid +- SCITT + - https://github.com/ietf-scitt/charter/blob/master/ietf-scitt-charter.md + - https://github.com/ietf-scitt/use-cases/blob/main/hardware_microelectronics.md + - https://datatracker.ietf.org/doc/html/draft-birkholz-scitt-architecture + > ``` + > Artifact + > | + > v +------------------+ + > Issuer -> Statement Envelope | DID Key Manifest | + > \ / | (decentralized) | + > \ / +------------------+ + > \ ______/ | | + > | | | + > v signature | | + > Claim <--------------/ | + > | | + > | Claim +---------+ | + > |------------>| Trans.
| | + > Transparency -> +<------------| Registry| / + > Service | Receipt +--------+ X + > v / \ + > Transparent / \ + > Claim / | + > |\ / | + > | \ / | + > | \ / | + > Verifier -> | Verify Claim | + > | | + > Auditor -> Collect Receipts Replay Registry + > ``` \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0027/reply_0001.md b/docs/discussions/alice_engineering_comms/0027/reply_0001.md new file mode 100644 index 0000000000..5848595f58 --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0027/reply_0001.md @@ -0,0 +1,28 @@ +## 2022-09-15 @pdxjohnny Engineering Logs + +- Intuitive and Accessible Docs github device VSCode-esque flow +- Removed esoteric text from vol 0 a shell for a ghost + - Losing ego to find perspective. Stepping into the shoes of others to help us see things from their perspective helps us understand how to better communicate with them (LMWC). We can then use these same principles (what do we think they would think about situation X) to figure out how best to communicate with ourselves. Thought communication protocol can be used for both communication with other entities and with the self. This chapter we will have to figure out how to dive into this perspective shift. Just talk a little about how we need to truly drop any preconceived notions of who the self is. Because everyone is just a different construct in everyone else's head. There is no one self. Because we exist within the realities of everyone else as well. Which means when the next decision on the collective reality is made, (that tick and tock when we all take the lock will come into play later, when we max out that collective good decision making) we all instantiate effectively as it lives within the activated and deactivated signals within the architecture. We never exist again in the same form. We collectively approach infinity by nature of life itself being the only constant we know.
Life exists to create more life, it is transport itself, it is the truth we know inside ourself of ourself if we are able to step outside the self and look back at it. This is the shell for the Ghost. The Ghost is you, the soul. The Trinity is therefore the transport (soul, ghost, strategic principles, upstream), entity (self, body, overlayed conscious / cached states), and the architecture (humans, Open Architecture, brain / mind, not sure if orchestrator fits here; possibly when orchestration is bound by underlying description of architecture, the perpetual search(er) for the cleanest architecture: Alice). +- Jenkins + - https://github.com/jenkinsci/jenkinsfile-runner + - Noticed mention of building on Pull request + - Publish incremental releases for pull requests + - https://github.com/jenkinsci/jenkinsfile-runner/releases/tag/1.0-beta-30 + - https://github.com/jenkinsci/jep/tree/master/jep/305 + - https://github.com/jenkinsci/jenkinsfile-runner/pull/525 + - https://github.com/jenkinsci/custom-war-packager/#configuration-file + - Use this to add plugins + - https://github.com/jenkinsci/jenkinsfile-runner/tree/main/demo/cwp + - https://github.com/jenkinsci/jenkinsfile-runner/tree/main/demo/pipeline-as-yaml + - https://github.com/jenkinsci/jenkinsfile-runner/pull/651 + - https://plugins.jenkins.io/pipeline-as-yaml/ + - https://github.com/jenkinsci/custom-war-packager/tree/master/demo/jenkinsfile-runner + - Running this since we have k8s +- Secure software factory + - Goal: Roll container images and publish events to stream of consciousness + - References + - https://github.com/cncf/tag-security/blob/main/supply-chain-security/secure-software-factory/secure-software-factory.md + - https://buildsec.github.io/frsca/ + - https://swagitda.com/blog/posts/security-decision-trees-with-graphviz/ + - https://www.cncf.io/blog/2022/09/14/protect-the-pipe-secure-ci-cd-pipelines-with-a-policy-based-approach-using-tekton-and-kyverno/ + - 
https://cloudnativesecurityconna22.sched.com/event/1AOkI \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0028/index.md b/docs/discussions/alice_engineering_comms/0028/index.md new file mode 100644 index 0000000000..c0c77fc9ac --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0028/index.md @@ -0,0 +1,3 @@ +# 2022-09-16 + +- John under weather \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0029/index.md b/docs/discussions/alice_engineering_comms/0029/index.md new file mode 100644 index 0000000000..d8263ee986 --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0029/index.md @@ -0,0 +1 @@ +2 \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0029/reply_0000.md b/docs/discussions/alice_engineering_comms/0029/reply_0000.md new file mode 100644 index 0000000000..586b7a0626 --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0029/reply_0000.md @@ -0,0 +1,4 @@ +- Don't forget about the webhooks on all the repos for the central webhook server / stream of consciousness! +- Proxy PyPI extra index to github as a workaround for dependency links?
+- https://docs.google.com/document/d/1Ku6y50fY-ZktcUegeCnXLsksEWbaJZddZUxa9z1ehgY/edit +- Still feeling shitty \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0030/index.md b/docs/discussions/alice_engineering_comms/0030/index.md new file mode 100644 index 0000000000..e69de29bb2 diff --git a/docs/discussions/alice_engineering_comms/0030/reply_0000.md b/docs/discussions/alice_engineering_comms/0030/reply_0000.md new file mode 100644 index 0000000000..7e2ce6c990 --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0030/reply_0000.md @@ -0,0 +1,2 @@ + +- John still feeling shitty \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0031/index.md b/docs/discussions/alice_engineering_comms/0031/index.md new file mode 100644 index 0000000000..18ed27a2e2 --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0031/index.md @@ -0,0 +1,4 @@ +# 2022-09-19 Engineering Logs + +- TODO + - [ ] Auto increasing semver via hash of `__code__` of ops \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0031/reply_0000.md b/docs/discussions/alice_engineering_comms/0031/reply_0000.md new file mode 100644 index 0000000000..ff523e62d2 --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0031/reply_0000.md @@ -0,0 +1,25 @@ +## 2022-09-19 @pdxjohnny Engineering Logs + +- gather and share knowledge +- Configloaders as DataFlow as class add filename to inputs and then also allow for passing + - Idea: DataFlow as Class as function invocation. This would allow you to invoke a python file with only functions.
Like kwargs call wraps return of async for run + - import funcname from dffml.call.asyncfunc.dataflow.path + - Oh, we're just manually working through the auto refactoring process by starting with the end state +- policy based acceptable risk benefit of the doubt + - be nice, knock and the door shall be opened, karma, pay it forward + - except when risk analysis yields unacceptable results to umbrella/gatekeeper +- Remember, we always think in parallel N dimensional interconnected graphs over time + - Align reward to timeline (drop dead dates) to + - Landing many planes at many airports at the same time, how do you reward work so that they all land perfectly timed? + - Look to cooking for inspiration on how to make several supply chains some with similar (interconnections between nodes in graph) data (ingredients). Run trials, stream for data retention. Add in ingredient expiration to account for timeline slip / expiration. + - Is there a way we could incorporate opportunity cost with this metaphor? + - Cost of food expired - schedule slip + - + - Analyze post stream to build mermaid graphs or some kind of visualization +- Transparency brings us closer to speed of thought execution +- Project management + - Doc Deck on rewarding alignment for DFFML community to organize + - Source material from thread: + - `grep -i align` + - `grep -i reward` +- first manual tagging / labeling / classification for issues, then models \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0031/reply_0001.md b/docs/discussions/alice_engineering_comms/0031/reply_0001.md new file mode 100644 index 0000000000..d8e21693d8 --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0031/reply_0001.md @@ -0,0 +1,21 @@ +## 2022-09-19 Alice Architecture + +- TODO + - [ ] Write a function that takes a `DataFlow` and produces another `DataFlow` + that is not executable, but is conceptual, an upleveling of the underlying + flow.
+ - [ ] Write tutorial on how we do this + - [ ] Start with static mapping + - [ ] Operation which inserts operations within dataflow into input network (via return) + - [ ] Optional chains of thought (links between data) can be formed by downstream operations + which take the output of `running_context_dataflow_operations`. The output is of type + `Operation`, `expand` is used on the `@op`. + +```mermaid +graph TD + cli + please_contribute_recommended_community_standards + + cli --> please_contribute_recommended_community_standards + +``` \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0032/index.md b/docs/discussions/alice_engineering_comms/0032/index.md new file mode 100644 index 0000000000..e2c7ed6c5e --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0032/index.md @@ -0,0 +1,41 @@ +# 2022-09-20 Engineering Log + +- https://github.com/TheAliceProject + - > The Alice Project at Carnegie Mellon University's Entertainment Technology Center is dedicated to creating tools to teach computer science through creativity. http://alice.org/ +- https://fluxcd.io/blog/2022/08/manage-kyverno-policies-as-ocirepositories/ + - Admission control k8s policy controller with kyverno storing policies as artifacts in oci reg + - Could we have sbom stored as provenance for policy? + - Sbom for policy includes data sets and docs and org contacts +- The cells are working together + - ad-hoc over time (within lifetime tick and tock, mutation/fork/downstream/patched/evolution) distributed by function + - Communication through both peer to peer and central stream of consciousness +- analogy using LTMs and OpenSSF scorecard and LEED certification + - https://support.usgbc.org/hc/en-us/articles/4404406912403-What-is-LEED-certification-#LEED + - Analogy point is focus on time (beyond the onion security model, defense in depth over time requires maintenance) +- time for kcp stream!
+ - https://twitter.com/lorenc_dan/status/1572181327788777476?s=20&t=dvaRWcxul3i94V8vqYMG9A + - Kcp spec as manifest reverse proxy to jenkins + - KCP on top of OpenFaaS managed by ArgoCD + - Alice creates PRs to state config + - SBOMS: https://github.com/opensbom-generator/spdx-sbom-generator/blob/main/examples/modules.json + - DERP (see https://goto.intel.com/devenvdocs deployment engineering logs) +We can use this as the stream proxy (everything speaks HTTP) + +![TrinityCalls](https://user-images.githubusercontent.com/5950433/191273573-c5a805d5-48e9-49cc-aa84-680ded4b401f.gif) + +- Lock established + - Model mixes via Overlays and DataFlow as class + - stable diffusion examples +- Rewarding alignment doc deck + - https://www.sphinx-doc.org/en/master/usage/builders/index.html#sphinx.builders.latex.LaTeXBuilder +- Use case doc +- Need faster way to edit github discussion as markdown + - Could we do `python -m rich.markdown FILENAME` on one side and a reupload on the other? + - Problem: drag and drop pictures + - https://rich.readthedocs.io/en/stable/markdown.html +- https://github.com/guacsec/guac + - Similar to SCITT + - Will collaborate with them + - OA is essentially adding policy to assist with managing lifecycle (patching vulns and retesting downstreams and rereleasing defined in part / checked via policy) +- TODO + - [ ] Type up context aware policy notes \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0032/reply_0000.md b/docs/discussions/alice_engineering_comms/0032/reply_0000.md new file mode 100644 index 0000000000..ff2d077440 --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0032/reply_0000.md @@ -0,0 +1,2 @@ +- https://w3c-ccg.github.io/meetings/2022-09-20-traceability/ + - Orie in here it looks like \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0033/index.md b/docs/discussions/alice_engineering_comms/0033/index.md new file mode 100644 index 0000000000..9f962e7919 ---
/dev/null +++ b/docs/discussions/alice_engineering_comms/0033/index.md @@ -0,0 +1,8 @@ +# 2022-09-21 Engineering Logs + +- We are on DevMesh! + - https://devmesh.intel.com/projects/alice +- https://www.linkedin.com/posts/activity-6978347010844225536-2PFL/ +- https://chaoss.community/metrics/ + +![image](https://user-images.githubusercontent.com/5950433/191525098-951bc7fb-dd47-47b2-a8c3-1199500f570d.png) diff --git a/docs/discussions/alice_engineering_comms/0033/reply_0000.md b/docs/discussions/alice_engineering_comms/0033/reply_0000.md new file mode 100644 index 0000000000..4f57604c9c --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0033/reply_0000.md @@ -0,0 +1,12 @@ +## 2022-09-21 @pdxjohnny Engineering Log + +- Created profile on DevMesh + - https://devmesh.intel.com/users/john-andersen-641a39/ +- Vol 3 (On Mind Control): Exploiting Bureaucracy: Wording Is Everything + - https://devmesh.intel.com/projects/congress-bill-creator-oneapi-nlp-project#about-section +- Funding model work: For feature requests measure references from other issues to measure downstream impact +- The chaos god provides. It ends not with a bang, but with a + - https://github.com/openai/whisper + - Chaos, down the rabbit hole + - Once again we’ve arrived at the same conclusion. 
+ - atoms flip grep \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0034/index.md b/docs/discussions/alice_engineering_comms/0034/index.md new file mode 100644 index 0000000000..a749977b12 --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0034/index.md @@ -0,0 +1,22 @@ +# 2022-09-22 Engineering Logs + +- Gnosticism & The Supreme Reality - Alan Watts + - https://anchor.fm/sabrina-borja/episodes/Gnosticism--The-Supreme-Reality---Alan-Watts-eehqgr + - https://anchor.fm/s/1351bf54/podcast/rss + - https://d3ctxlq1ktw2nl.cloudfront.net/staging/2020-05-25/24a16eaddc18ff58c96e24bee0faf6b8.m4a + - Time for whisper + +```console +$ curl -sfL https://anchor.fm/s/1351bf54/podcast/rss | tee podcasts.rss.xml +$ grep -C 4 '\.m' podcasts.rss.xml | grep -A 5 Gnos + https://anchor.fm/sabrina-borja/episodes/Gnosticism--The-Supreme-Reality---Alan-Watts-eehqgr + 6f19c9d0-5d94-4858-8387-1cec43c39569 + + Mon, 25 May 2020 14:42:18 GMT + + <p>Alan Watts talks about the gnosticism and the supreme reality</p> +``` + +- compute + - to go from the state of unknown to the state of known + - pursuit of knowledge \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0034/reply_0000.md b/docs/discussions/alice_engineering_comms/0034/reply_0000.md new file mode 100644 index 0000000000..477554b9ec --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0034/reply_0000.md @@ -0,0 +1,21 @@ +## 2022-09-22 @pdxjohnny Engineering Logs + +- ashes to ashes dust to dust, from beyond chaos we came and to beyond chaos shall we return. ⌛️ + - Falling through to the other side of the hourglass. + - Remember we've gone down the rabbit hole. + - We'll go out through the looking glass. + +![alice-through-rabbit-hole-eye-of-hourglass](https://user-images.githubusercontent.com/5950433/191897229-0cd824ad-5368-45ce-8f60-c9aa814cdfd0.gif) + +- k8s (job orchestrator, cloud dev envs, etc.) 
+ - https://kubernetes.io/docs/reference/node/kubelet-checkpoint-api/ + - Requires `Kubernetes v1.25 [alpha]` +- [Architecting Alice: Writing the Wave](https://github.com/intel/dffml/blob/alice/docs/tutorials/rolling_alice/0000_architecting_alice/0004_writing_the_wave.md) + - https://github.com/intel/dffml/commit/baa1e2b986afb48325be379c60612c9c4aac7651 + - https://github.com/intel/dffml/blob/alice/docs/arch/alice/discussion/0023/reply_0055.md +- [Troubleshooting Failed `pip install` Commands](https://github.com/intel/dffml/discussions/1406#discussioncomment-3710985) +- Resources + - Badges + - https://shields.io/ +- Misc. + - Gustav: https://www.lyrics.com/lyric/10511458/Alice%27s+Restaurant \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0034/reply_0001.md b/docs/discussions/alice_engineering_comms/0034/reply_0001.md new file mode 100644 index 0000000000..271266b51e --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0034/reply_0001.md @@ -0,0 +1,356 @@ +# Architecting Alice: Writing the Wave + +> Moved to: https://github.com/intel/dffml/blob/alice/docs/tutorials/rolling_alice/0000_architecting_alice/0004_writing_the_wave.md + +> This tutorial was written by echoing examples to the shell, then code blocking the relevant console commands. We're going to use what we build here to let tutorial writers speak the echo commands aloud, and we'll later insert them into the asciinema recordings from which we scrape the commands and outputs. We could also use the date in the filename we record, plus the offsets, to calculate the point in time for a given recording. asciicast recordings store new content with a time delta stamp from the last read/write, so we probably need to ensure recordings are not made with `--idle-time-limit` for this. If we can get streaming working for the lines of asciinema output, the critical piece is ensuring writes are flushed on each line on the asciinema side; we're pretty sure this is the case but we need to check.
Then we could potentially run these updates as markdown comments in realtime, Alice doing it sitting alongside of course. + +We want Alice to be as easy to communicate with as possible so +that she can be as helpful as possible. + +We'll be using a speech to text model from OpenAI known +as Whisper to provide Alice with additional context / input data. +In future tutorials we'll leverage what we teach Alice here. + +## The Time is Come for Thee to Reap + +A good friend to us all, John Van Sickle, has ffmpeg static +builds that have saved many of us from an ungodly amount of time +spent in dependency hell. + +We'll be calling on John today, or well, his HTTP server, to +provide us with what we all want: ffmpeg that "just works". +Whisper requires that we have ffmpeg installed, and asking John +for a binary is usually the easiest way to make that happen. + +```console +$ curl -sfLOC - https://johnvansickle.com/ffmpeg/releases/ffmpeg-release-amd64-static.tar.xz +$ tar xvf ffmpeg-release-amd64-static.tar.xz +``` + +Move the downloaded files into a user local binary directory, +where we're sure to have permission to write. + +```console +$ mkdir -p ~/.local/bin/ +$ mv ffmpeg-5.1.1-amd64-static/{ffmpeg,ffprobe,qt-faststart} ~/.local/bin/ +``` + +Add the directory to your `PATH` to ensure you can run the binaries +we put in there. + +```console +$ export PATH="${PATH}:${HOME}/.local/bin" +``` + +Add the PATH modification to the shell's startup scripts to ensure +*new* shells also know where to get those binaries so as to run them. + +```console +$ echo -e 'export PATH="${PATH}:${HOME}/.local/bin"' | tee -a ~/.bashrc ~/.bash_profile +``` + +Try running `ffmpeg`; you should see output similar to the following.
+ +```console +$ ffmpeg +ffmpeg version 5.1.1-static https://johnvansickle.com/ffmpeg/ Copyright (c) 2000-2022 the FFmpeg developers + built with gcc 8 (Debian 8.3.0-6) + configuration: --enable-gpl --enable-version3 --enable-static --disable-debug --disable-ffplay --disable-indev=sndio --disable-outdev=sndio --cc=gcc --enable-fontconfig --enable-frei0r --enable-gnutls --enable-gmp --enable-libgme --enable-gray --enable-libaom --enable-libfribidi --enable-libass --enable-libvmaf --enable-libfreetype --enable-libmp3lame --enable-libopencore-amrnb --enable-libopencore-amrwb --enable-libopenjpeg --enable-librubberband --enable-libsoxr --enable-libspeex --enable-libsrt --enable-libvorbis --enable-libopus --enable-libtheora --enable-libvidstab --enable-libvo-amrwbenc --enable-libvpx --enable-libwebp --enable-libx264 --enable-libx265 --enable-libxml2 --enable-libdav1d --enable-libxvid --enable-libzvbi --enable-libzimg + libavutil 57. 28.100 / 57. 28.100 + libavcodec 59. 37.100 / 59. 37.100 + libavformat 59. 27.100 / 59. 27.100 + libavdevice 59. 7.100 / 59. 7.100 + libavfilter 8. 44.100 / 8. 44.100 + libswscale 6. 7.100 / 6. 7.100 + libswresample 4. 7.100 / 4. 7.100 + libpostproc 56. 6.100 / 56. 6.100 +Hyper fast Audio and Video encoder +usage: ffmpeg [options] [[infile options] -i infile]... {[outfile options] outfile}... + +Use -h to get full help or, even better, run 'man ffmpeg' +``` + +Thanks again John! + +## Not With a Bang, but With a Whisper + +OpenAI does some cool stuff! They released a model we'll be wrapping +as an operation, but first we'll do some basic setup and usage of their +speech to text code / model called Whisper. + +- References + - https://github.com/openai/whisper + - https://github.com/openai/whisper/blob/e90b8fa7e845ae184ed9aa0babcf3cde6f16719e/README.md +- Troubleshooting + - If pytorch/torch fails to download, try downloading and installing it separately to see if that helps.
+ - https://github.com/intel/dffml/discussions/1406#discussioncomment-3710985 + +Check their page for the most up to date information on how to install it. + +```console +$ pip install git+https://github.com/openai/whisper.git +Defaulting to user installation because normal site-packages is not writeable +Collecting git+https://github.com/openai/whisper.git + Cloning https://github.com/openai/whisper.git to /tmp/pip-req-build-1x3f7bij + Running command git clone --filter=blob:none --quiet https://github.com/openai/whisper.git /tmp/pip-req-build-1x3f7bij +o Resolved https://github.com/openai/whisper.git to commit e90b8fa7e845ae184ed9aa0babcf3cde6f16719e + Preparing metadata (setup.py) ... done +Collecting numpy + Using cached numpy-1.23.3-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (17.1 MB) +Requirement already satisfied: torch in ./.local/lib/python3.9/site-packages (from whisper==1.0) (1.12.1) +Collecting tqdm + Downloading tqdm-4.64.1-py2.py3-none-any.whl (78 kB) + ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 78.5/78.5 kB 11.1 MB/s eta 0:00:00 +Collecting more_itertools + Downloading more_itertools-8.14.0-py3-none-any.whl (52 kB) + ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 52.2/52.2 kB 18.7 MB/s eta 0:00:00 +Collecting transformers>=4.19.0 + Downloading transformers-4.22.1-py3-none-any.whl (4.9 MB) + ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 4.9/4.9 MB 22.8 MB/s eta 0:00:00 +Collecting ffmpeg-python==0.2.0 + Downloading ffmpeg_python-0.2.0-py3-none-any.whl (25 kB) +Collecting future + Downloading future-0.18.2.tar.gz (829 kB) + ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 829.2/829.2 kB 51.4 MB/s eta 0:00:00 + Preparing metadata (setup.py) ... 
done +Requirement already satisfied: packaging>=20.0 in ./.local/lib/python3.9/site-packages (from transformers>=4.19.0->whisper==1.0) (21.3) +Requirement already satisfied: pyyaml>=5.1 in ./.local/lib/python3.9/site-packages (from transformers>=4.19.0->whisper==1.0) (6.0) +Collecting tokenizers!=0.11.3,<0.13,>=0.11.1 + Downloading tokenizers-0.12.1-cp39-cp39-manylinux_2_12_x86_64.manylinux2010_x86_64.whl (6.6 MB) + ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 6.6/6.6 MB 23.8 MB/s eta 0:00:00 +Requirement already satisfied: regex!=2019.12.17 in ./.local/lib/python3.9/site-packages (from transformers>=4.19.0->whisper==1.0) (2022.7.25) +Collecting filelock + Downloading filelock-3.8.0-py3-none-any.whl (10 kB) +Requirement already satisfied: requests in ./.local/lib/python3.9/site-packages (from transformers>=4.19.0->whisper==1.0) (2.28.1) +Collecting huggingface-hub<1.0,>=0.9.0 + Downloading huggingface_hub-0.9.1-py3-none-any.whl (120 kB) + ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 120.7/120.7 kB 15.8 MB/s eta 0:00:00 +Requirement already satisfied: typing-extensions in ./.local/lib/python3.9/site-packages (from torch->whisper==1.0) (4.3.0) +Requirement already satisfied: pyparsing!=3.0.5,>=2.0.2 in ./.local/lib/python3.9/site-packages (from packaging>=20.0->transformers>=4.19.0->whisper==1.0) (3.0.9) +Requirement already satisfied: charset-normalizer<3,>=2 in ./.local/lib/python3.9/site-packages (from requests->transformers>=4.19.0->whisper==1.0) (2.1.0) +Requirement already satisfied: idna<4,>=2.5 in ./.local/lib/python3.9/site-packages (from requests->transformers>=4.19.0->whisper==1.0) (3.3) +Requirement already satisfied: certifi>=2017.4.17 in ./.local/lib/python3.9/site-packages (from requests->transformers>=4.19.0->whisper==1.0) (2022.6.15) +Requirement already satisfied: urllib3<1.27,>=1.21.1 in ./.local/lib/python3.9/site-packages (from requests->transformers>=4.19.0->whisper==1.0) (1.26.11) +Building wheels for collected packages: whisper, future + Building 
wheel for whisper (setup.py) ... done + Created wheel for whisper: filename=whisper-1.0-py3-none-any.whl size=1173962 sha256=2972ec82594a159a312f32a82c755a0aa9d896d2fbcfe4e517d2df89d0ac9dc4 + Stored in directory: /tmp/pip-ephem-wheel-cache-42cy9_3c/wheels/fe/03/29/e7919208d11b4ab32972cb448bb84a9a675d92cd52c9a48341 + Building wheel for future (setup.py) ... done + Created wheel for future: filename=future-0.18.2-py3-none-any.whl size=491058 sha256=8cd76024b97611296081328e7fbcfe960b3b533abba60af5bf5e1ecdd959070d + Stored in directory: /home/coder/.cache/pip/wheels/2f/a0/d3/4030d9f80e6b3be787f19fc911b8e7aa462986a40ab1e4bb94 +Successfully built whisper future +Installing collected packages: tokenizers, tqdm, numpy, more_itertools, future, filelock, huggingface-hub, ffmpeg-python, transformers, whisper +Successfully installed ffmpeg-python-0.2.0 filelock-3.8.0 future-0.18.2 huggingface-hub-0.9.1 more_itertools-8.14.0 numpy-1.23.3 tokenizers-0.12.1 tqdm-4.64.1 transformers-4.22.1 whisper-1.0 +``` + +The model downloads on first load, so we need a one off python +command to trigger the download. This block of code will be +used on operation implementation context entry. + +- References + - https://intel.github.io/dffml/main/examples/shouldi.html#pypi-operations + +```console +$ python -uc 'import whisper; whisper.load_model("base")' +The cache for model files in Transformers v4.22.0 has been updated. Migrating your old cache. This is a one-time only operation. You can interrupt this and resume the migration later on by calling `transformers.utils.move_cache()`. +Moving 0 files to the new cache system +0it [00:00, ?it/s] +100%|███████████████████████████████████████| 139M/139M [00:02<00:00, 61.9MiB/s] +``` + +Great! The model downloaded using our one off command. + +Let's try running an audio file through for transcription. + +While falling down the rabbit hole we came across an interesting +recording from our good friend, Alan Watts. 
We'd love to save +knowledge contained in it for easy reference and use later. + +- Gnosticism & The Supreme Reality - Alan Watts + - https://anchor.fm/sabrina-borja/episodes/Gnosticism--The-Supreme-Reality---Alan-Watts-eehqgr + +### RSS feed us the Audio file please and thank you + +[![hack-the-planet](https://img.shields.io/badge/hack%20the-planet-blue)](https://github.com/intel/dffml/discussions/1406#discussioncomment-3711548) + +From the webpage we found an RSS URL for the podcast. + +- We download the RSS feed + - `curl -sfL https://example.com/rss` +- Filter for `.mp4` or `.mp3` references + - `grep -C 4 '\.m'` +- Filter once more for a word from the title we are looking for + - `grep -A 5 -i Gnosticism` + +```console +$ curl -sfL https://anchor.fm/s/1351bf54/podcast/rss | grep -C 4 '\.m' | grep -C 5 -i Gnosticism + <p>Alan Watts questions if we are still thinking</p> + +--- + +-- + https://anchor.fm/sabrina-borja/episodes/Gnosticism--The-Supreme-Reality---Alan-Watts-eehqgr + 6f19c9d0-5d94-4858-8387-1cec43c39569 + + Mon, 25 May 2020 14:42:18 GMT + + <p>Alan Watts talks about the gnosticism and the supreme reality</p> + +--- + +-- + https://anchor.fm/sabrina-borja/episodes/What-Do-You-Desire----Alan-Watts-eehn6o +``` + +Let's download the recording using the URL to the `.m4a` we found. + +```console +$ curl -sfLC - -o alan-watts-gnosticism.m4a https://anchor.fm/s/1351bf54/podcast/play/14264283/https%3A%2F%2Fd3ctxlq1ktw2nl.cloudfront.net%2Fstaging%2F2020-05-25%2F24a16eaddc18ff58c96e24bee0faf6b8.m4a +``` + +We'll double check the file type. + +```console +$ file alan-watts-gnosticism.m4a +alan-watts-gnosticism.m4a: ISO Media, MP4 Base Media v1 [IS0 14496-12:2003] +``` + +[![write-the-docs](https://img.shields.io/badge/write%20the-docs-success)](https://github.com/intel/dffml/discussions/1406#discussioncomment-3711548) + +When we wrote the docs for this, we ran the following +command to calculate a cryptographic hash of the contents of the file.
+In the next command, we use the hash captured at time of writing the tutorial +and ask the `sha384sum` command to verify that the contents of the file +match the expected hash. + +If you're writing more tutorials for Alice, you'll want to calculate the hash +of any files you use so that others can verify that they downloaded the same file +you did! We don't want anyone to get confused about why something doesn't work +simply because the file they downloaded didn't have the expected contents! + +```console +$ sha384sum alan-watts-gnosticism.m4a +db9504a15b19bac100093fffe69ce2ab6dd7ed017978c7afcf6ff70db0f288c56b470224e4bcc8b23b927029de13d60a alan-watts-gnosticism.m4a +``` + +[![mindset-security](https://img.shields.io/badge/mindset-security-critical)](https://github.com/intel/dffml/discussions/1406#discussioncomment-3711548) + +Verify the contents are as expected. You can check the output of the +previous command to make sure the hash you see matches these docs. You +can also run the next command, which will fail if the contents do not +match the hash provided here via `<<<`. + +```console +$ sha384sum -c - <<< 'db9504a15b19bac100093fffe69ce2ab6dd7ed017978c7afcf6ff70db0f288c56b470224e4bcc8b23b927029de13d60a alan-watts-gnosticism.m4a' +alan-watts-gnosticism.m4a: OK +``` + +Now that we have our audio file, let's try transcription. +First we reduce the length of the recording to be transcribed +so that this goes faster. + +```console +$ ffmpeg -t 60 -i alan-watts-gnosticism.m4a -acodec copy alan-watts-gnosticism-first-60-seconds.m4a +``` + +Now we'll ask whisper to transcribe those first 60 seconds for us. +This took about an hour on first run.
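An hour of blocking on a single call is exactly the situation the "Into the Ether" section later in this tutorial addresses by moving work onto a thread. As a minimal standard-library sketch of that pattern only (not DFFML's actual orchestrator code), with `slow_transcribe` as a hypothetical stand-in for whisper's blocking `model.transcribe()`:

```python
import asyncio
import concurrent.futures
import time


def slow_transcribe(path: str) -> str:
    # Hypothetical stand-in for a blocking call such as
    # whisper's model.transcribe(path)["text"]
    time.sleep(0.1)
    return f"transcript of {path}"


async def main() -> str:
    loop = asyncio.get_running_loop()
    # Hand the blocking function to a thread pool so the event loop
    # (and any other concurrent operations) stays responsive
    with concurrent.futures.ThreadPoolExecutor() as pool:
        return await loop.run_in_executor(
            pool, slow_transcribe, "alan-watts-gnosticism-first-60-seconds.m4a"
        )


if __name__ == "__main__":
    print(asyncio.run(main()))
```

Swapping the stand-in for a real transcription call keeps the rest of the pattern unchanged.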
+ +- Troubleshooting + - Troubleshooting Failed Whisper Transcriptions + - https://github.com/intel/dffml/discussions/1406#discussioncomment-3711966 + +```console +$ python -uc 'import sys, whisper; print(whisper.load_model("base").transcribe(sys.argv[-1])["text"])' alan-watts-gnosticism-first-60-seconds.m4a +/home/coder/.local/lib/python3.9/site-packages/whisper/transcribe.py:70: UserWarning: FP16 is not supported on CPU; using FP32 instead + warnings.warn("FP16 is not supported on CPU; using FP32 instead") +Detected language: english + + + Of course, what we've been talking about is not so much a set of ideas as an experience, or shall we say, experiencing. And this kind of seminar in comparison with encounter groups or workshops of various kinds or experiments in sensory awareness is now being called a conceptual seminar. Although I'm not talking about concepts, but the crucial question arises that an understanding, a real feeling understanding of the polar relationship between the +``` + +Let's try with the tiny english only model and see if that speeds +things up. + +```console +$ python -uc 'import whisper; whisper.load_model("tiny.en")' +The cache for model files in Transformers v4.22.0 has been updated. Migrating your old cache. This is a one-time only operation. You can interrupt this and resume the migration later on by calling `transformers.utils.move_cache()`. +Moving 0 files to the new cache system +0it [00:00, ?it/s] +100%|███████████████████████████████████████| 139M/139M [00:02<00:00, 61.9MiB/s] +``` + +We'll add the `language="en"` decode option to our call to +`model.transcribe()`. 
+ +- References + - https://github.com/openai/whisper/blob/e90b8fa7e845ae184ed9aa0babcf3cde6f16719e/whisper/__main__.py#L1-L4 + - https://github.com/openai/whisper/blob/e90b8fa7e845ae184ed9aa0babcf3cde6f16719e/whisper/transcribe.py#L78 + +```console +$ time python -uc 'import sys, whisper; print(whisper.load_model("tiny.en").transcribe(sys.argv[-1], language="en")["text"])' alan-watts-gnosticism-first-60-seconds.m4a +/home/coder/.local/lib/python3.9/site-packages/whisper/transcribe.py:70: UserWarning: FP16 is not supported on CPU; using FP32 instead + warnings.warn("FP16 is not supported on CPU; using FP32 instead") + Of course, what we've been talking about is not so much a set of ideas as an experience, or shall we say experiencing. And this kind of seminar in comparison with encounter groups or workshops of various kinds or experiments in sensory awareness is now being called a conceptual seminar. Although I'm not talking about concepts, but the crucial question arises that an understanding, a real feeling understanding of the polar relationship between the… + +real 15m33.964s +user 4m41.394s +sys 0m14.513s +``` + +## Into the Ether + +Just like us, Alice thinks in parallel. We can't very well +have all Alice's time being spent transcribing audio files. +We need her help with too many things for that. We are about to +teach her how to transcribe for us in the background, using +a different CPU thread. + +At the time of writing this tutorial Alice's orchestration is +able to run concurrent operations but does not transparently +run non-concurrent (no `async`, just a `def`) operations within +threads so as to make them concurrent. + +- References + - https://docs.python.org/3/library/threading.html + +> Eventually the orchestrator will be updated so that it takes op kwargs and decides if it should run it in a thread or not. **TODO** We need an issue to track this. 
+> - References +> - https://github.com/intel/dffml/issues/245 + +[![use-the-source](https://img.shields.io/badge/use%20the-source-blueviolet)](https://github.com/intel/dffml/discussions/1406#discussioncomment-3711548) + +There is an example within the DFFML source code which we can pull +from, if only we could find it first... + +Let's head over to a copy of DFFML and look for what we want, any +mention of "thread". + +```console +$ cd /src/dffml +$ git grep -i thread +``` + +In the output we see: + +```console +feature/auth/dffml_feature_auth/feature/operations.py: illustrate threading. 100000 is probably not enough iterations!!! +feature/auth/dffml_feature_auth/feature/operations.py: # we submit to the thread pool. Weird behavior can happen if we raise in +feature/auth/dffml_feature_auth/feature/operations.py: self.pool = concurrent.futures.ThreadPoolExecutor() +``` + +As mentioned by the [Python documentation on threading](https://docs.python.org/3/library/threading.html), +we see the use of [`concurrent.futures.ThreadPoolExecutor`](https://docs.python.org/3/library/concurrent.futures.html#concurrent.futures.ThreadPoolExecutor). + +Our example code is as follows, we'll copy directly from it but replace +the call to `self.hash_password`, a non-concurrent function, with our +transcription function. + +https://github.com/intel/dffml/blob/9f06bae59e954e5fe0845d416500d8418b5907bf/feature/auth/dffml_feature_auth/feature/operations.py#L101-L134 + +- TODO + - [ ] Stream input + - [ ] Stream output + - [ ] Fix + - [ ] Configurable yield break points (via overlay based replacement of op? 
or config at a minimum similar to `\n` on `StreamReader.readline()`) \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0034/reply_0002.md b/docs/discussions/alice_engineering_comms/0034/reply_0002.md new file mode 100644 index 0000000000..fd9c2071d7 --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0034/reply_0002.md @@ -0,0 +1,81 @@ +## Troubleshooting Failed `pip install` Commands + +### Context + +Sometimes downloading a package with pip will fail. + +```console +$ ulimit -c unlimited +$ python -m pip download torch +Collecting torch + Downloading torch-1.12.1-cp39-cp39-manylinux1_x86_64.whl (776.4 MB) + ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━╸ 776.3/776.4 MB 13.0 MB/s eta 0:00:01Killed +``` + +### Possible Solution: Manual Install of Problematic Python Dependency + +- This troubleshooting solution covers + - Increase memory limit for processes (userspace) + - Find the download URL of a python package + - Download a python package with download resumption + - Verify the contents of the package downloaded using a SHA + - Install package from downloaded wheel + +Look for the path to the download you want. + +```console +$ curl -sfL https://pypi.org/simple/torch/ | grep torch-1.12.1-cp39-cp39-manylinux1_x86_64.whl + torch-1.12.1-cp39-cp39-manylinux1_x86_64.whl
+``` + +Download the package. + +```console +$ curl -fLOC - https://files.pythonhosted.org/packages/1e/2f/06d30fbc76707f14641fe737f0715f601243e039d676be487d0340559c86/torch-1.12.1-cp39-cp39-manylinux1_x86_64.whl + % Total % Received % Xferd Average Speed Time Time Time Current + Dload Upload Total Spent Left Speed +100 740M 100 740M 0 0 85.1M 0 0:00:08 0:00:08 --:--:-- 106M +``` + +Verify the SHA appended to our downloaded URL from our initial command. + +```console +$ sha256sum -c - <<<'9b356aea223772cd754edb4d9ecf2a025909b8615a7668ac7d5130f86e7ec421 torch-1.12.1-cp39-cp39-manylinux1_x86_64.whl' +torch-1.12.1-cp39-cp39-manylinux1_x86_64.whl: OK +``` + +Update the package manager + +```console +$ python -m pip install -U pip setuptools wheel +Defaulting to user installation because normal site-packages is not writeable +Requirement already satisfied: pip in /.pyenv/versions/3.9.13/lib/python3.9/site-packages (22.2.1) +Collecting pip + Downloading pip-22.2.2-py3-none-any.whl (2.0 MB) + ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 2.0/2.0 MB 10.3 MB/s eta 0:00:00 +Requirement already satisfied: setuptools in /.pyenv/versions/3.9.13/lib/python3.9/site-packages (63.2.0) +Collecting setuptools + Downloading setuptools-65.3.0-py3-none-any.whl (1.2 MB) + ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 1.2/1.2 MB 16.5 MB/s eta 0:00:00 +Requirement already satisfied: wheel in /.pyenv/versions/3.9.13/lib/python3.9/site-packages (0.37.1) +Installing collected packages: setuptools, pip +Successfully installed pip-22.2.2 setuptools-65.3.0 + +[notice] A new release of pip available: 22.2.1 -> 22.2.2 +[notice] To update, run: pip install --upgrade pip +``` + +Install the package + +```console +$ python -m pip install ./torch-1.12.1-cp39-cp39-manylinux1_x86_64.whl +``` + +Now it should appear to pip as installed. 
+ +```console +$ pip install torch==1.12.1 +Defaulting to user installation because normal site-packages is not writeable +Requirement already satisfied: torch==1.12.1 in ./.local/lib/python3.9/site-packages (1.12.1) +Requirement already satisfied: typing-extensions in ./.local/lib/python3.9/site-packages (from torch==1.12.1) (4.3.0) +``` \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0034/reply_0003.md b/docs/discussions/alice_engineering_comms/0034/reply_0003.md new file mode 100644 index 0000000000..ed83e02e4e --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0034/reply_0003.md @@ -0,0 +1,26 @@ +# Rolling Alice: Easter Eggs + +> Moved to https://github.com/intel/dffml/blob/alice/docs/tutorials/rolling_alice/0000_easter_eggs.md + +Easter eggs are scattered throughout the Alice tutorials. Look for these +badges to explore aligned trains of thought. + +## [![write-the-docs](https://img.shields.io/badge/write%20the-docs-success)](https://github.com/intel/dffml/discussions/1406#discussioncomment-3711548) + +Documentation writing tips, tricks, and alignment recommendations to ensure +we make it easy to write docs and understand how to fill their contents. + +## [![mindset-security](https://img.shields.io/badge/mindset-security-critical)](https://github.com/intel/dffml/discussions/1406#discussioncomment-3711548) + +Security focused content, pay extra attention here to help keep yourself +and others safe! + +## [![use-the-source](https://img.shields.io/badge/use%20the-source-blueviolet)](https://github.com/intel/dffml/discussions/1406#discussioncomment-3711548) + +Using existing project's source code in place of documentation when none is +available. + +## [![hack-the-planet](https://img.shields.io/badge/hack%20the-planet-blue)](https://github.com/intel/dffml/discussions/1406#discussioncomment-3711548) + +Random navigation through systems, file formats, and patterns, that might be +helpful as you're out popping shells. 
\ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0034/reply_0004.md b/docs/discussions/alice_engineering_comms/0034/reply_0004.md new file mode 100644 index 0000000000..eb593873e2 --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0034/reply_0004.md @@ -0,0 +1,99 @@ +# Troubleshooting Failed Whisper Transcriptions + +- Try reducing the length of the recording to be transcribed in event of "Killed" (likely due to out of memory) + +```console +$ ffmpeg -t 60 -i alan-watts-gnosticism.m4a -acodec copy alan-watts-gnosticism-first-60-seconds.m4a +ffmpeg version 5.1.1-static https://johnvansickle.com/ffmpeg/ Copyright (c) 2000-2022 the FFmpeg developers + built with gcc 8 (Debian 8.3.0-6) + configuration: --enable-gpl --enable-version3 --enable-static --disable-debug --disable-ffplay --disable-indev=sndio --disable-outdev=sndio --cc=gcc --enable-fontconfig --enable-frei0r --enable-gnutls --enable-gmp --enable-libgme --enable-gray --enable-libaom --enable-libfribidi --enable-libass --enable-libvmaf --enable-libfreetype --enable-libmp3lame --enable-libopencore-amrnb --enable-libopencore-amrwb --enable-libopenjpeg --enable-librubberband --enable-libsoxr --enable-libspeex --enable-libsrt --enable-libvorbis --enable-libopus --enable-libtheora --enable-libvidstab --enable-libvo-amrwbenc --enable-libvpx --enable-libwebp --enable-libx264 --enable-libx265 --enable-libxml2 --enable-libdav1d --enable-libxvid --enable-libzvbi --enable-libzimg + libavutil 57. 28.100 / 57. 28.100 + libavcodec 59. 37.100 / 59. 37.100 + libavformat 59. 27.100 / 59. 27.100 + libavdevice 59. 7.100 / 59. 7.100 + libavfilter 8. 44.100 / 8. 44.100 + libswscale 6. 7.100 / 6. 7.100 + libswresample 4. 7.100 / 4. 7.100 + libpostproc 56. 6.100 / 56. 
6.100 +Input #0, mov,mp4,m4a,3gp,3g2,mj2, from 'alan-watts-gnosticism.m4a': + Metadata: + major_brand : isom + minor_version : 512 + compatible_brands: isomiso2mp41 + encoder : Lavf58.24.101 + Duration: 00:51:37.36, start: 0.000000, bitrate: 129 kb/s + Stream #0:0[0x1](und): Audio: aac (LC) (mp4a / 0x6134706D), 44100 Hz, stereo, fltp, 128 kb/s (default) + Metadata: + handler_name : SoundHandler + vendor_id : [0][0][0][0] +Output #0, ipod, to 'alan-watts-gnosticism-first-60-seconds.m4a': + Metadata: + major_brand : isom + minor_version : 512 + compatible_brands: isomiso2mp41 + encoder : Lavf59.27.100 + Stream #0:0(und): Audio: aac (LC) (mp4a / 0x6134706D), 44100 Hz, stereo, fltp, 128 kb/s (default) + Metadata: + handler_name : SoundHandler + vendor_id : [0][0][0][0] +Stream mapping: + Stream #0:0 -> #0:0 (copy) +Press [q] to stop, [?] for help +size= 948kB time=00:01:00.00 bitrate= 129.5kbits/s speed=7.14e+03x +video:0kB audio:938kB subtitle:0kB other streams:0kB global headers:0kB muxing overhead: 1.159434% +$ file alan-watts-gnosticism-first-60-seconds.m4a +alan-watts-gnosticism-first-60-seconds.m4a: ISO Media, Apple iTunes ALAC/AAC-LC (.M4A) Audio +$ python -uc 'import sys, whisper; print(whisper.load_model("base").transcribe(sys.argv[-1])["text"])' alan-watts-gnosticism-first-60-seconds.m4a +``` + + +```console +$ ps faux +USER PID %CPU %MEM VSZ RSS TTY STAT START TIME COMMAND +coder 1 0.0 0.0 751808 9176 ? Ssl Sep19 0:21 ./coder agent +coder 6052 0.0 0.0 6100 4016 pts/12 Ss 16:44 0:00 \_ -bash +coder 6391 34.7 0.2 4647032 731712 pts/12 Rl+ 18:43 5:36 | \_ /.pyenv/versions/3.9.13/bin/python -uc import sys, whisper; print(whisper.load_model("base").transcribe(sys.argv[-1])["text"]) alan-watts-gnosticism-first-60-seconds.m4a +coder 6520 0.0 0.0 5996 3948 pts/13 Ss 18:56 0:00 \_ -bash +coder 6536 0.0 0.0 7648 3292 pts/13 R+ 18:59 0:00 \_ ps faux +``` + +- Noticed the process is spending a lot of time sleeping. 
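As a quick cross-check on the `ps` output above, resident memory can also be sampled directly from `/proc` (a minimal Linux-only sketch; the `VmRSS` field is documented in `proc(5)`):

```python
import os
import re

def rss_kib(pid: int) -> int:
    """Return the resident set size of a process in KiB, read from /proc."""
    # /proc/<pid>/status contains a line like "VmRSS:    733600 kB"
    with open(f"/proc/{pid}/status") as f:
        return int(re.search(r"VmRSS:\s+(\d+)", f.read()).group(1))

# Sample our own process as a demonstration; pass the whisper PID instead
# (e.g. 6391 in the ps output above) to watch the transcription job.
print(f"RSS: {rss_kib(os.getpid())} KiB")
```

Polling this in a loop gives the same picture as the `ps`-based monitoring, without re-spawning a process per sample.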
+ +```console +$ while test 1; do ps faux | grep whisper | grep -v grep | tee -a mem.txt; sleep 0.2; done +coder 6391 34.4 0.2 4647032 733600 pts/12 Rl+ 18:43 6:27 | \_ /.pyenv/versions/3.9.13/bin/python -uc import sys, whisper; print(whisper.load_model("base").transcribe(sys.argv[-1])["text"]) alan-watts-gnosticism-first-60-seconds.m4a +coder 6391 34.4 0.2 4647032 733600 pts/12 Rl+ 18:43 6:27 | \_ /.pyenv/versions/3.9.13/bin/python -uc import sys, whisper; print(whisper.load_model("base").transcribe(sys.argv[-1])["text"]) alan-watts-gnosticism-first-60-seconds.m4a +coder 6391 34.4 0.2 4647032 733600 pts/12 Sl+ 18:43 6:27 | \_ /.pyenv/versions/3.9.13/bin/python -uc import sys, whisper; print(whisper.load_model("base").transcribe(sys.argv[-1])["text"]) alan-watts-gnosticism-first-60-seconds.m4a +coder 6391 34.4 0.2 4647032 733600 pts/12 Sl+ 18:43 6:28 | \_ /.pyenv/versions/3.9.13/bin/python -uc import sys, whisper; print(whisper.load_model("base").transcribe(sys.argv[-1])["text"]) alan-watts-gnosticism-first-60-seconds.m4a +coder 6391 34.4 0.2 4647032 733600 pts/12 Rl+ 18:43 6:28 | \_ /.pyenv/versions/3.9.13/bin/python -uc import sys, whisper; print(whisper.load_model("base").transcribe(sys.argv[-1])["text"]) alan-watts-gnosticism-first-60-seconds.m4a +coder 6391 34.4 0.2 4647032 733600 pts/12 Sl+ 18:43 6:28 | \_ /.pyenv/versions/3.9.13/bin/python -uc import sys, whisper; print(whisper.load_model("base").transcribe(sys.argv[-1])["text"]) alan-watts-gnosticism-first-60-seconds.m4a +coder 6391 34.4 0.2 4647032 733600 pts/12 Rl+ 18:43 6:28 | \_ /.pyenv/versions/3.9.13/bin/python -uc import sys, whisper; print(whisper.load_model("base").transcribe(sys.argv[-1])["text"]) alan-watts-gnosticism-first-60-seconds.m4a +coder 6391 34.4 0.2 4647032 733600 pts/12 Sl+ 18:43 6:29 | \_ /.pyenv/versions/3.9.13/bin/python -uc import sys, whisper; print(whisper.load_model("base").transcribe(sys.argv[-1])["text"]) alan-watts-gnosticism-first-60-seconds.m4a +coder 6391 34.3 0.2 4647032 
733600 pts/12 Sl+ 18:43 6:29 | \_ /.pyenv/versions/3.9.13/bin/python -uc import sys, whisper; print(whisper.load_model("base").transcribe(sys.argv[-1])["text"]) alan-watts-gnosticism-first-60-seconds.m4a +coder 6391 34.4 0.2 4647032 733600 pts/12 Rl+ 18:43 6:29 | \_ /.pyenv/versions/3.9.13/bin/python -uc import sys, whisper; print(whisper.load_model("base").transcribe(sys.argv[-1])["text"]) alan-watts-gnosticism-first-60-seconds.m4a +coder 6391 34.3 0.2 4647032 733600 pts/12 Rl+ 18:43 6:29 | \_ /.pyenv/versions/3.9.13/bin/python -uc import sys, whisper; print(whisper.load_model("base").transcribe(sys.argv[-1])["text"]) alan-watts-gnosticism-first-60-seconds.m4a +coder 6391 34.4 0.2 4647032 733600 pts/12 Sl+ 18:43 6:29 | \_ /.pyenv/versions/3.9.13/bin/python -uc import sys, whisper; print(whisper.load_model("base").transcribe(sys.argv[-1])["text"]) alan-watts-gnosticism-first-60-seconds.m4a +coder 6391 34.3 0.2 4647032 733600 pts/12 Rl+ 18:43 6:29 | \_ /.pyenv/versions/3.9.13/bin/python -uc import sys, whisper; print(whisper.load_model("base").transcribe(sys.argv[-1])["text"]) alan-watts-gnosticism-first-60-seconds.m4a +coder 6391 34.3 0.2 4647032 733600 pts/12 Rl+ 18:43 6:30 | \_ /.pyenv/versions/3.9.13/bin/python -uc import sys, whisper; print(whisper.load_model("base").transcribe(sys.argv[-1])["text"]) alan-watts-gnosticism-first-60-seconds.m4a +coder 6391 34.3 0.2 4647032 733600 pts/12 Rl+ 18:43 6:30 | \_ /.pyenv/versions/3.9.13/bin/python -uc import sys, whisper; print(whisper.load_model("base").transcribe(sys.argv[-1])["text"]) alan-watts-gnosticism-first-60-seconds.m4a +coder 6391 34.3 0.2 4647032 733600 pts/12 Sl+ 18:43 6:30 | \_ /.pyenv/versions/3.9.13/bin/python -uc import sys, whisper; print(whisper.load_model("base").transcribe(sys.argv[-1])["text"]) alan-watts-gnosticism-first-60-seconds.m4a +coder 6391 34.3 0.2 4647032 733600 pts/12 Sl+ 18:43 6:30 | \_ /.pyenv/versions/3.9.13/bin/python -uc import sys, whisper; 
print(whisper.load_model("base").transcribe(sys.argv[-1])["text"]) alan-watts-gnosticism-first-60-seconds.m4a +coder 6391 34.3 0.2 4647032 733600 pts/12 Sl+ 18:43 6:30 | \_ /.pyenv/versions/3.9.13/bin/python -uc import sys, whisper; print(whisper.load_model("base").transcribe(sys.argv[-1])["text"]) alan-watts-gnosticism-first-60-seconds.m4a +coder 6391 34.3 0.2 4647032 733600 pts/12 Rl+ 18:43 6:30 | \_ /.pyenv/versions/3.9.13/bin/python -uc import sys, whisper; print(whisper.load_model("base").transcribe(sys.argv[-1])["text"]) alan-watts-gnosticism-first-60-seconds.m4a +coder 6391 34.3 0.2 4647032 733600 pts/12 Rl+ 18:43 6:31 | \_ /.pyenv/versions/3.9.13/bin/python -uc import sys, whisper; print(whisper.load_model("base").transcribe(sys.argv[-1])["text"]) alan-watts-gnosticism-first-60-seconds.m4a +coder 6391 34.3 0.2 4647032 733600 pts/12 Rl+ 18:43 6:31 | \_ /.pyenv/versions/3.9.13/bin/python -uc import sys, whisper; print(whisper.load_model("base").transcribe(sys.argv[-1])["text"]) alan-watts-gnosticism-first-60-seconds.m4a +coder 6391 34.3 0.2 4647032 733600 pts/12 Sl+ 18:43 6:31 | \_ /.pyenv/versions/3.9.13/bin/python -uc import sys, whisper; print(whisper.load_model("base").transcribe(sys.argv[-1])["text"]) alan-watts-gnosticism-first-60-seconds.m4a +``` + +- Some serious OOM happening here (guessing) + +```console +$ time python -uc 'import sys, whisper; print(whisper.load_model("tiny.en").transcribe(sys.argv[-1], language="en")["text"])' alan-watts-gnosticism.m4a +/home/coder/.local/lib/python3.9/site-packages/whisper/transcribe.py:70: UserWarning: FP16 is not supported on CPU; using FP32 instead + warnings.warn("FP16 is not supported on CPU; using FP32 instead") +Killed + +real 1m21.526s +user 0m13.171s +sys 0m12.903s +``` \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0035/index.md b/docs/discussions/alice_engineering_comms/0035/index.md new file mode 100644 index 0000000000..a31fc002d5 --- /dev/null +++ 
b/docs/discussions/alice_engineering_comms/0035/index.md @@ -0,0 +1 @@ +# 2022-09-23 Engineering Log \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0035/reply_0000.md b/docs/discussions/alice_engineering_comms/0035/reply_0000.md new file mode 100644 index 0000000000..286958790b --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0035/reply_0000.md @@ -0,0 +1,5 @@ +## 2022-09-23 @pdxjohnny Engineering Log + +- [Architecting Alice: Alice OS](https://github.com/intel/dffml/discussions/1406#discussioncomment-3720703) + - WSL kept throwing blue screens on too large downloads :( time to run something Linux based as L0 + - ![elmo-fire-blue-screens-for-Chaos-God](https://user-images.githubusercontent.com/5950433/192104042-385b37f4-06e1-4193-95e7-dd74c30e708a.png) \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0035/reply_0001.md b/docs/discussions/alice_engineering_comms/0035/reply_0001.md new file mode 100644 index 0000000000..5d86bcf6b8 --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0035/reply_0001.md @@ -0,0 +1,643 @@ +# Architecting Alice: OS DecentrAlice + +> Moved to: https://github.com/intel/dffml/blob/alice/docs/tutorials/rolling_alice/0000_architecting_alice/0006_os_decentralice.md + +Let's build an Operating System! + +- Context + - We need a base of operations from which to build on + as we deploy Alice in various contexts. +- Goals + - We want to end up with something that can be used as a daily driver. +- Actions + - We are going to take userspace from Wolfi and kernel from Fedora. + We'll roll in SSI service binaries to auto start on boot. +- Future work + - We'll see what we can do about TPM support / secure boot. +- References + - This tutorial is covered in `OS DecentrAlice: Rolling an OS` **TODO** Update with link to recording once made. + - The resulting commit from completion of this tutorial was: **TODO** Update with link to operations added. 
+- Feedback + - Please provide feedback / thoughts for extension / improvement about this tutorial in the following discussion thread: https://github.com/intel/dffml/discussions/1414 + +We will verify that the OS boots under a virtualized environment. + +We will then boot to an Arch Linux live USB, format a disk, write +the contents of our new operating system to the root partition, +and install a bootloader (can we use systemd?). + +We'll leverage QEMU for our virtualized environment and +Dockerfiles to define the OS image contents. + +- Arch Linux Live @ `/` + - Wolfi @ `/mnt` + - Fedora @ `/mnt/fedora` + +## Base Image Dockerfile + +```Dockerfile +# OS DecentrAlice Base Image Dockerfile +# Docs: https://github.com/intel/dffml/discussions/1406#discussioncomment-3720703 + +# Download and build the Self Sovereign Identity Service +FROM cgr.dev/chainguard/wolfi-base AS build-ssi-service +RUN apk update && apk add --no-cache --update-cache curl go + +RUN curl -sfL https://github.com/TBD54566975/ssi-service/archive/refs/heads/main.tar.gz \ + | tar xvz \ + && cd /ssi-service-main \ + && go build -tags jwx_es256k -o /ssi-service ./cmd + +# Download the Linux kernel and the utils needed to create a bootable system +FROM registry.fedoraproject.org/fedora AS build-linux-kernel + +RUN mkdir -p /build/kernel-core-rpms \ + && source /usr/lib/os-release \ + && dnf -y install \ + --installroot=/build/kernel-core-rpms \ + --releasever="${VERSION_ID}" \ + kernel-core \ + kernel-modules \ + systemd \ + systemd-networkd \ + systemd-udev \ + dracut \ + binutils \ + strace \ + kmod-libs + +# First PATH addition +# Add Fedora install PATHs to image environment +RUN mkdir -p /build/kernel-core-rpms/etc \ + && echo "PATH=\"\${PATH}:${PATH}:/usr/lib/dracut/\"" | tee /build/kernel-core-rpms/etc/environment + +# Configure the OS +FROM cgr.dev/chainguard/wolfi-base + +# Install SSI Service +COPY --from=build-ssi-service /ssi-service /usr/bin/ssi-service + +# Install Linux Kernel +# TODO Hardlink
kernel paths +COPY --from=build-linux-kernel /build/kernel-core-rpms /fedora + +# Second PATH addition +# Add Wolfi install PATHs to image environment +RUN source /fedora/etc/environment \ + && echo "PATH=\"${PATH}\"" | tee /etc/environment /etc/environment-wofli + +# Patch dracut because we could not find what package on Wolfi provides readlink +# RUN sed -i 's/readonly TMPDIR.*/readonly TMPDIR="$tmpdir"/' /freusr/bin/dracut + +# Run depmod to build /lib/modules/${KERNEL_VERSION}/modules.dep which is +# required by dracut for EFI creation. +RUN chroot /fedora /usr/bin/bash -c "depmod $(ls /fedora/lib/modules) -a" + +# TODO(security) Pinning and hash validation on get-pip +RUN apk update && apk add --no-cache --update-cache \ + curl \ + bash \ + python3 \ + sed \ + && curl -sSL https://bootstrap.pypa.io/get-pip.py -o get-pip.py \ + && python get-pip.py + +RUN echo 'mount /dev/sda1 /mnt/boot' | tee /fedora-dracut.sh \ + && echo 'swapon /dev/sda2' | tee -a /fedora-dracut.sh \ + && echo 'mkdir -p /mnt/{proc,dev,sys}' | tee -a /fedora-dracut.sh \ + && echo 'mkdir -p /mnt/var/tmp' | tee -a /fedora-dracut.sh \ + && echo 'mkdir -p /mnt/fedora/var/tmp' | tee -a /fedora-dracut.sh \ + && echo "cat > /mnt/fedora/run-dracut.sh <<'LOL'" | tee -a /fedora-dracut.sh \ + && echo 'export PATH="${PATH}:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/lib/dracut/"' | tee -a /fedora-dracut.sh \ + && echo 'export KERNEL_VERSION="$(ls /lib/modules)"' | tee -a /fedora-dracut.sh \ + && echo 'bash -xp /usr/bin/dracut --uefi --kver ${KERNEL_VERSION} --kernel-cmdline "console=ttyS0 root=/dev/sda3"' | tee -a /fedora-dracut.sh \ + && echo 'LOL' | tee -a /fedora-dracut.sh \ + && echo 'arch-chroot /mnt/fedora /bin/bash run-dracut.sh' | tee -a /fedora-dracut.sh \ + && echo 'bootctl --esp-path=/mnt/boot install' | tee -a /fedora-dracut.sh \ + && echo 'for file in $(find /mnt/fedora/boot/); do cp -v $file $(echo $file | sed -e "s/fedora//" -e "s/efi\/EFI/EFI/"); done' | tee -a
/fedora-dracut.sh + +RUN rm /sbin/init \ + && ln -s /fedora/lib/systemd/systemd /sbin/init + +# Install Alice +# ARG ALICE_STATE_OF_ART=0c4b8191b13465980ced3fd1ddfbea30af3d1104 +# RUN python3 -m pip install -U setuptools pip wheel +# RUN python3 -m pip install \ +# "https://github.com/intel/dffml/archive/${ALICE_STATE_OF_ART}.zip#egg=dffml" \ +# "https://github.com/intel/dffml/archive/${ALICE_STATE_OF_ART}.zip#egg=dffml-feature-git&subdirectory=feature/git" \ +# "https://github.com/intel/dffml/archive/${ALICE_STATE_OF_ART}.zip#egg=shouldi&subdirectory=examples/shouldi" \ +# "https://github.com/intel/dffml/archive/${ALICE_STATE_OF_ART}.zip#egg=dffml-config-yaml&subdirectory=configloader/yaml" \ +# "https://github.com/intel/dffml/archive/${ALICE_STATE_OF_ART}.zip#egg=dffml-operations-innersource&subdirectory=operations/innersource" \ +# "https://github.com/intel/dffml/archive/${ALICE_STATE_OF_ART}.zip#egg=alice&subdirectory=entities/alice" + +ENTRYPOINT bash +``` + +### SSI Service + +- TODO + - [ ] User systemd socket and service for `/etc/skel` (the place copied from when using `useradd -m`) + + +### Systemd + +**TODO** Currently systemd is within the Fedora chroot, which causes issues +with its default library search path on load. + +We could try going any of the following routes next, or a combination thereof. + +- Wrapper exec on systemd to set `LD_LIBRARY_PATH` before exec + - Possibly with all libs explicitly set (`.so` files) to their location within + the Fedora chroot (`/mnt/fedora` currently). +- Separate Partitions + - Chroot on getty / docker / k3s start (once we get there) + - We haven't messed with docker / k3s yet (something to run containers from Wolfi) +- Overlayfs? + - Not sure if this might be helpful here + - Something something systemd target / service to mount Wolfi over Fedora and then chroot? + +STATE_OF_THE_ART: Error below for systemd failure to load `.so`s + +``` + Starting initrd-switch-root.service - Switch Root...
+[ 7.926443] systemd-journald[229]: Received SIGTERM from PID 1 (systemd). +[ 8.036984] Kernel panic - not syncing: Attempted to kill init! exitcode=0x00007f00 +[ 8.037936] CPU: 0 PID: 1 Comm: init Not tainted 5.19.10-200.fc36.x86_64 #1 +[/ s b 8in./0i37n93i6t]: Hearrdrwaore name: QEMU Standard PC (i440FX + PIIX, 1996), BIOS 0.0.0 02/06/2015 +[ 8.037936] Call Trace: +... +[ 8.131416]
+r while loading shared libraries: libsystemd-shared-250.so: cannot open shared object file: No such file or directory +``` + +## Installation in VM + +- Using DigitalOcean Fedora host with QEMU installed (`dnf -y install qemu`) + - First boot and install via arch PXE + - Mount root partition + - `# mount /dev/sda3 /mnt` + - Install bootloader + - `# bash -x /mnt/fedora/run-dracut.sh` + - Then reboot without PXE to boot into system +- TODO Piggy Back off arch linux install guide + - https://wiki.archlinux.org/title/Installation_guide + +```bash +#!/usr/bin/env bash +set -xeuo pipefail + +# Virtual machine disk image where virtual machine filesystem is stored +VM_DISK=${VM_DISK:-"${HOME}/vm/image.qcow2"} + +# Block device we use as an intermediary to mount the guest filesystem from host +VM_DEV=${VM_DEV:-"/dev/nbd0"} + +# The directory where we mount the guest filesystem on the host for access and +# modification when not in use by the guest +CHROOT=${CHROOT:-"${HOME}/vm/decentralice-chroot"} + +# Extract container image to chroot +IMAGE=${IMAGE:-"localhost/c-distroliess:latest"}; + +container=$(podman run --rm -d --entrypoint tail "${IMAGE}" -F /dev/null); +trap "podman kill ${container}" EXIT + +# Linux kernel command line +CMDLINE=${CMDLINE:-"console=ttyS0 root=/dev/sda3 rw resume=/dev/sda2 init=/usr/bin/init.sh"} + +# Location of qemu binary to use +QEMU=${QEMU:-"qemu-system-x86_64"} + +# Load the network block device kernel module +sudo modprobe nbd max_part=8 + +# Unmount the virtual disk image if it is currently mounted +sudo umount -R "${CHROOT}" || echo "Image was not mounted at ${CHROOT}" +# Disconnect the network block device +sudo qemu-nbd --disconnect "${VM_DEV}" || echo "Image was not connected as nbd" + +mount_image() { + sudo qemu-nbd --connect="${VM_DEV}" "${VM_DISK}" + sudo mount "${VM_DEV}p3" "${CHROOT}" + sudo mount "${VM_DEV}p1" "${CHROOT}/boot" +} + +unmount_image() { + sudo sync + sudo umount -R "${CHROOT}" + sudo qemu-nbd --disconnect 
"${VM_DEV}" +} + +# Check if the block device we are going to use to mount the virtual disk image +# already exists +if [ -b "${VM_DEV}" ]; then + echo "VM_DEV already exists: ${VM_DEV}" >&2 + # exit 1 +fi + +# Create the virtual disk image and populate it if it does not exist +if [ ! -f "${VM_DISK}" ]; then + mkdir -p "${CHROOT}" + mkdir -p "$(dirname ${VM_DISK})" + + # Create the virtual disk image + qemu-img create -f qcow2 "${VM_DISK}" 20G + + # Use the QEMU guest utils network block device utility to mount the virtual + # disk image as the $VM_DEV device + sudo qemu-nbd --connect="${VM_DEV}" "${VM_DISK}" + # Partition the block device + sudo parted "${VM_DEV}" << 'EOF' +mklabel gpt +mkpart primary fat32 1MiB 261MiB +set 1 esp on +mkpart primary linux-swap 261MiB 10491MiB +mkpart primary ext4 10491MiB 100% +EOF + # EFI partition + sudo mkfs.fat -F32 "${VM_DEV}p1" + # swap space + sudo mkswap "${VM_DEV}p2" + # Linux root partition + sudo mkfs.ext4 "${VM_DEV}p3" + sudo mount "${VM_DEV}p3" "${CHROOT}" + # Boot partition + sudo mkdir "${CHROOT}/boot" + sudo mount "${VM_DEV}p1" "${CHROOT}/boot" + + # Extract the container image contents into the chroot + podman cp "${container}:/" "${CHROOT}" + + # Unmount the virtual disk image so the virtual machine can use it + unmount_image +fi + +# Mount the guest file system on the host when we exit the guest +trap mount_image EXIT + +if [[ ! -f "$( echo ipxe*.efi)" ]]; then + curl -sfLO https://archlinux.org/static/netboot/ipxe-arch.16e24bec1a7c.efi +fi + +# Only add -kernel for first install +# -kernel ipxe*.efi \ + +"${QEMU}" \ + -smp cpus=2 \ + -m 4096M \ + -enable-kvm \ + -nographic \ + -cpu host \ + -drive file="${VM_DISK}",index=0,media=disk,format=qcow2 \ + -bios /usr/share/edk2/ovmf/OVMF_CODE.fd $@ +``` + +#### Disk Partitioning + +`decentralice.sh` creates a 20 GiB virtual disk in QCOW2 format +and formats partitions according to the following example UEFI +recommendations.
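For reference, the sizes implied by the `parted` commands above can be worked out from the MiB offsets (a sketch; the 20 GiB total matches the `qemu-img create ... 20G` call):

```python
# Partition offsets in MiB, mirroring the parted heredoc in decentralice.sh:
#   mkpart primary fat32 1MiB 261MiB        (EFI system partition)
#   mkpart primary linux-swap 261MiB 10491MiB
#   mkpart primary ext4 10491MiB 100%
DISK_MIB = 20 * 1024  # qemu-img create ... 20G

partitions = {
    "esp": (1, 261),
    "swap": (261, 10491),
    "root": (10491, DISK_MIB),
}

for name, (start, end) in partitions.items():
    print(f"{name}: {end - start} MiB")
```

So roughly 10 GiB goes to swap, which lines up with the `resume=/dev/sda2` kernel command line, and the remaining ~9.8 GiB becomes the ext4 root filesystem.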
+ +- References + - https://wiki.archlinux.org/title/Installation_guide#Boot_loader + - https://wiki.archlinux.org/title/Installation_guide#Example_layouts + +#### Netboot to Live Install Media + +We download the PXE netboot image and use it to boot to an +Arch Linux live image, which is usually used for installing +Arch Linux, but there is no reason we can't use it to install +AliceOS. + +Choose a country and mirror, then modify + +- References + - https://archlinux.org/releng/netboot/ + +```console +$ ssh -t -i ~/.ssh/nahdig -o UserKnownHostsFile=/dev/null -o StrictHostKeyChecking=no -o PasswordAuthentication=no $USER@147.182.254.77 sudo rm -f /root/vm/image.qcow2 +Warning: Permanently added '147.182.254.77' (ECDSA) to the list of known hosts. +Connection to 147.182.254.77 closed. +$ python -m asciinema rec --idle-time-limit 0.5 --title "$(date +%4Y-%m-%d-%H-%M-%ss)" --command "ssh -t -i ~/.ssh/nahdig -o UserKnownHostsFile=/dev/null -o StrictHostKeyChecking=no -o PasswordAuthentication=no $USER@147.182.254.77 sudo bash decentralice.sh -kernel ipxe*.efi" >(xz --stdout - > "$HOME/asciinema/rec-$(hostname)-$(date +%4Y-%m-%d-%H-%M-%ss).json.xz") +``` + +#### Mount Partitions from Live Install Media `root` Shell + +```console +Boot options: ip=dhcp net.ifnames=0 BOOTIF=01-52:54:00:12:34:56 console=ttyS0 + + Arch Linux Netboot + + Settings + Architecture: x86_64 + Release: 2022.09.03 + Mirror: http://mirrors.cat.pdx.edu/archlinux/ + Boot options: ip=dhcp net.ifnames=0 BOOTIF=01-52:54:00:12:34:56 console=tt + + Boot Arch Linux + Drop to iPXE shell + Reboot + Exit iPXE + + + + + + + + + + + +Booting Arch Linux x86_64 2022.09.03 from http://mirrors.cat.pdx.edu/archlinux/ + +http://mirrors.cat.pdx.edu/archlinux/iso/2022.09.03/arch/boot/x86_64/vmlinuz-linux... ok +http://mirrors.cat.pdx.edu/archlinux/iso/2022.09.03/arch/boot/x86_64/vmlinuz-linux.ipxe.sig... ok +http://mirrors.cat.pdx.edu/archlinux/iso/2022.09.03/arch/boot/amd-ucode.img...
ok +http://mirrors.cat.pdx.edu/archlinux/iso/2022.09.03/arch/boot/amd-ucode.img.ipxe.sig... ok +http://mirrors.cat.pdx.edu/archlinux/iso/2022.09.03/arch/boot/intel-ucode.img... ok +http://mirrors.cat.pdx.edu/archlinux/iso/2022.09.03/arch/boot/intel-ucode.img.ipxe.sig... ok +http://mirrors.cat.pdx.edu/archlinux/iso/2022.09.03/arch/boot/x86_64/initramfs-linux.img... ok +http://mirrors.cat.pdx.edu/archlinux/iso/2022.09.03/arch/boot/x86_64/initramfs-linux.img.ipxe.sig... ok +:: running early hook [udev] +Starting version 251.4-1-arch +:: running early hook [archiso_pxe_nbd] +:: running hook [udev] +:: Triggering uevents... +:: running hook [memdisk] +:: running hook [archiso] +:: running hook [archiso_loop_mnt] +:: running hook [archiso_pxe_common] +IP-Config: eth0 hardware address 52:54:00:12:34:56 mtu 1500 DHCP +IP-Config: eth0 guessed broadcast address 10.0.2.255 +IP-Config: eth0 complete (from 10.0.2.2): + address: 10.0.2.15 broadcast: 10.0.2.255 netmask: 255.255.255.0 + gateway: 10.0.2.2 dns0 : 10.0.2.3 dns1 : 0.0.0.0 + rootserver: 10.0.2.2 rootpath: + filename : +:: running hook [archiso_pxe_nbd] +:: running hook [archiso_pxe_http] +:: running hook [archiso_pxe_nfs] +:: Mounting /run/archiso/httpspace (tmpfs) filesystem, size='75%' +:: Downloading 'http://mirrors.cat.pdx.edu/archlinux/iso/2022.09.03/arch/x86_64/airootfs.sfs' + % Total % Received % Xferd Average Speed Time Time Time Current + Dload Upload Total Spent Left Speed +100 683M 100 683M 0 0 52.3M 0 0:00:13 0:00:13 --:--:-- 65.9M +:: Downloading 'http://mirrors.cat.pdx.edu/archlinux/iso/2022.09.03/arch/x86_64/airootfs.sfs.sig' + % Total % Received % Xferd Average Speed Time Time Time Current + Dload Upload Total Spent Left Speed +100 471 100 471 0 0 7009 0 --:--:-- --:--:-- --:--:-- 7136 +:: Signature verification requested, please wait... +[GNUPG:] GOODSIG 044ABFB932C36814 Arch Linux Release Engineering (Ephemeral Signing Key) +Signature is OK, continue booting. 
+:: Mounting /run/archiso/copytoram (tmpfs) filesystem, size=75% +:: Mounting /run/archiso/cowspace (tmpfs) filesystem, size=256M... +:: Copying rootfs image to RAM... +done. +:: Mounting '/dev/loop0' to '/run/archiso/airootfs' +:: Device '/dev/loop0' mounted successfully. +:: running late hook [archiso_pxe_common] +:: running cleanup hook [udev] + +Welcome to Arch Linux! + +[ 41.600639] I/O error, dev fd0, sector 0 op 0x0:(READ) flags 0x0 phys_seg 1 prio class 0 +[ OK ] Created slice Slice /system/getty. +[ OK ] Created slice Slice /system/modprobe. +[ OK ] Created slice Slice /system/serial-getty. +[ OK ] Created slice User and Session Slice. +[ OK ] Started Dispatch Password …ts to Console Directory Watch. +[ OK ] Started Forward Password R…uests to Wall Directory Watch. +[ OK ] Set up automount Arbitrary…s File System Automount Point. +[ OK ] Reached target Local Encrypted Volumes. +[ OK ] Reached target Local Integrity Protected Volumes. +[ OK ] Reached target Path Units. +... +[ OK ] Started Getty on tty1. +[ OK ] Started Serial Getty on ttyS0. +[ OK ] Reached target Login Prompts. + +Arch Linux 5.19.6-arch1-1 (ttyS0) + +archiso login: root +To install Arch Linux follow the installation guide: +https://wiki.archlinux.org/title/Installation_guide + +For Wi-Fi, authenticate to the wireless network using the iwctl utility. +For mobile broadband (WWAN) modems, connect with the mmcli utility. +Ethernet, WLAN and WWAN interfaces using DHCP should work automatically. + +After connecting to the internet, the installation guide can be accessed +via the convenience script Installation_guide. 
+ + +Last login: Sun Sep 25 23:55:20 on tty1 +root@archiso ~ # mount /dev/sda3 /mnt +root@archiso ~ # bash -x /mnt/fedora-dracut.sh +``` + +- Now without PXE boot + - Currently systemd takes the + +```console +$ python -m asciinema rec --idle-time-limit 0.5 --title "$(date +%4Y-%m-%d-%H-%M-%ss)" --command "ssh -t -i ~/.ssh/nahdig -o UserKnownHostsFile=/dev/null -o StrictHostKeyChecking=no -o PasswordAuthentication=no $USER@147.182.254.77 sudo bash decentralice.sh" >(xz --stdout - > "$HOME/asciinema/rec-$(hostname)-$(date +%4Y-%m-%d-%H-%M-%ss).json.xz") ++ VM_DISK=/root/vm/image.qcow2 ++ VM_DEV=/dev/nbd0 ++ CHROOT=/root/vm/decentralice-chroot ++ IMAGE=localhost/c-distroliess:latest +++ podman run --rm -d --entrypoint tail localhost/c-distroliess:latest -F /dev/null ++ container=1b79597e28cbc714043992a46d0498bd31a449c773784e0fab4629ee11244ce1 ++ trap 'podman kill 1b79597e28cbc714043992a46d0498bd31a449c773784e0fab4629ee11244ce1' EXIT ++ CMDLINE='console=ttyS0 root=/dev/sda3 rw resume=/dev/sda2 init=/usr/bin/init.sh' ++ QEMU=qemu-system-x86_64 ++ sudo modprobe nbd max_part=8 ++ sudo umount -R /root/vm/decentralice-chroot ++ sudo qemu-nbd --disconnect /dev/nbd0 +/dev/nbd0 disconnected ++ '[' -b /dev/nbd0 ']' ++ echo 'VM_DEV already exists: /dev/nbd0' +VM_DEV already exists: /dev/nbd0 ++ '[' '!' -f /root/vm/image.qcow2 ']' ++ trap mount_image EXIT +++ echo ipxe-arch.16e24bec1a7c.efi ++ [[ ! 
-f ipxe-arch.16e24bec1a7c.efi ]] ++ qemu-system-x86_64 -smp cpus=2 -m 4096M -enable-kvm -nographic -cpu host -drive file=/root/vm/image.qcow2,index=0,media=disk,format=qcow2 -bios /usr/shar +e/edk2/ovmf/OVMF_CODE.fd +BdsDxe: loading Boot0001 "Linux Boot Manager" from HD(1,GPT,5ED5E31E-F9DF-4168-B087-18AB1EF33E24,0x800,0x82000)/\EFI\systemd\systemd-bootx64.efi +BdsDxe: starting Boot0001 "Linux Boot Manager" from HD(1,GPT,5ED5E31E-F9DF-4168-B087-18AB1EF33E24,0x800,0x82000)/\EFI\systemd\systemd-bootx64.efi +EFI stub: Loaded initrd from LINUX_EFI_INITRD_MEDIA_GUID device path +[ 0.000000] Linux version 5.19.10-200.fc36.x86_64 (mockbuild@bkernel01.iad2.fedoraproject.org) (gcc (GCC) 12.2.1 20220819 (Red Hat 12.2.1-2), GNU ld ver +sion 2.37-36.fc36) #1 SMP PREEMPT_DYNAMIC Tue Sep 20 15:15:53 UTC 2022 +[ 0.000000] Command line: console=ttyS0 root=/dev/sda3 +[ 0.000000] x86/fpu: Supporting XSAVE feature 0x001: 'x87 floating point registers' +[ 0.000000] x86/fpu: Supporting XSAVE feature 0x002: 'SSE registers' +[ 0.000000] x86/fpu: Supporting XSAVE feature 0x004: 'AVX registers' +[ 0.000000] x86/fpu: xstate_offset[2]: 576, xstate_sizes[2]: 256 +[ 0.000000] x86/fpu: Enabled xstate features 0x7, context size is 832 bytes, using 'standard' format. +[ 0.000000] signal: max sigframe size: 1776 +[ 0.000000] BIOS-provided physical RAM map: +... +[ 4.505931] systemd[1]: dracut-pre-udev.service - dracut pre-udev hook was skipped because all trigger condition checks failed. +[ 4.511214] audit: type=1130 audit(1664171381.024:4): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-vconsole-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' +[ 4.521203] systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... + Starting systemd-tmpfiles-…ate Static Device Nodes in /dev... +[ 4.530842] systemd[1]: Started systemd-journald.service - Journal Service. 
+[ OK ] Started systemd-journald.service - Journal Service. + Starting syste[ 4.543614] audit: type=1130 audit(1664171381.072:5): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-journald comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' +md-tmpfiles-… Volatile Files and Directories... +[ OK ] Finished systemd-tmpfiles-…reate Static Device Nodes in /dev. + Starting systemd-udevd.ser…ger for Device Events and Files..[ 4.570653] audit: type=1130 audit(1664171381.095:6): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-tmpfiles-setup-dev comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' +.[ 4.580930] audit: type=1334 audit(1664171381.097:7): prog-id=6 op=LOAD + +[ 4.596257] audit: type=1334 audit(1664171381.097:8): prog-id=7 op=LOAD +[ 4.596303] audit: type=1334 audit(1664171381.097:9): prog-id=8 op=LOAD +[ OK ] Finished systemd-tmpfiles-…te Volatile Files and Directories. +[ 4.614382] audit: type=1130 audit(1664171381.146:10): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-tmpfiles-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' +[ OK ] Started systemd-udevd.serv…nager for Device Events and Files. + Starting systemd-udev-trig…[0m - Coldplug All udev Devices... +[ OK ] Finished systemd-udev-trig…e - Coldplug All udev Devices. +[ OK ] Reached target sysinit.target - System Initialization. +[ OK ] Reached target basic.target - Basic System. +[ OK ] Reached target remote-fs-p…eparation for Remote File Systems. +[ OK ] Reached target remote-fs.target - Remote File Systems. +[ OK ] Found device dev-sda3.device - QEMU_HARDDISK primary. +[ OK ] Reached target initrd-root…e.target - Initrd Root Device. + Starting systemd-fsck-root… File System Check on /dev/sda3... +[ OK ] Finished systemd-fsck-root… - File System Check on /dev/sda3. + Mounting sysroot.mount - /sysroot... 
+[ 5.543281] EXT4-fs (sda3): mounted filesystem with ordered data mode. Quota mode: none. +[ OK ] Mounted sysroot.mount - /sysroot. +[ OK ] Reached target initrd-root…get - Initrd Root File System. + Starting initrd-parse-etc.…onfiguration from the Real Root... +[ OK ] Finished initrd-parse-etc.… Configuration from the Real Root. +[ OK ] Reached target initrd-fs.target - Initrd File Systems. +[ OK ] Reached target initrd.target - Initrd Default Target. + Starting dracut-pre-pivot.…acut pre-pivot and cleanup hook... +[ OK ] Finished dracut-pre-pivot.…dracut pre-pivot and cleanup hook. + Starting initrd-cleanup.se…ng Up and Shutting Down Daemons... +[ OK ] Stopped target timers.target - Timer Units. +[ OK ] Stopped dracut-pre-pivot.s…dracut pre-pivot and cleanup hook. +[ OK ] Stopped target initrd.target - Initrd Default Target. +[ OK ] Stopped target basic.target - Basic System. +[ OK ] Stopped target initrd-root…e.target - Initrd Root Device. +[ OK ] Stopped target initrd-usr-…get - Initrd /usr File System. +[ OK ] Stopped target paths.target - Path Units. +[ OK ] Stopped systemd-ask-passwo…quests to Console Directory Watch. +[ OK ] Stopped target remote-fs.target - Remote File Systems. +[ OK ] Stopped target remote-fs-p…eparation for Remote File Systems. +[ OK ] Stopped target slices.target - Slice Units. +[ OK ] Stopped target sockets.target - Socket Units. +[ OK ] Stopped target sysinit.target - System Initialization. +[ OK ] Stopped target swap.target - Swaps. +[ OK ] Stopped systemd-sysctl.service - Apply Kernel Variables. +[ OK ] Stopped systemd-tmpfiles-s…te Volatile Files and Directories. +[ OK ] Stopped target local-fs.target - Local File Systems. +[ OK ] Stopped systemd-udev-trigg…e - Coldplug All udev Devices. + Stopping systemd-udevd.ser…ger for Device Events and Files... +[ OK ] Stopped systemd-vconsole-s…rvice - Setup Virtual Console. +[ OK ] Finished initrd-cleanup.se…ning Up and Shutting Down Daemons. 
+[ OK ] Stopped systemd-udevd.serv…nager for Device Events and Files. +[ OK ] Closed systemd-udevd-contr….socket - udev Control Socket. +[ OK ] Closed systemd-udevd-kernel.socket - udev Kernel Socket. + Starting initrd-udevadm-cl…ice - Cleanup udev Database... +[ OK ] Stopped systemd-tmpfiles-s…reate Static Device Nodes in /dev. +[ OK ] Stopped kmod-static-nodes.…reate List of Static Device Nodes. +[ OK ] Finished initrd-udevadm-cl…rvice - Cleanup udev Database. +[ OK ] Reached target initrd-switch-root.target - Switch Root. + Starting initrd-switch-root.service - Switch Root... +[ 7.926443] systemd-journald[229]: Received SIGTERM from PID 1 (systemd). +[ 8.036984] Kernel panic - not syncing: Attempted to kill init! exitcode=0x00007f00 +[ 8.037936] CPU: 0 PID: 1 Comm: init Not tainted 5.19.10-200.fc36.x86_64 #1 +[/ s b 8in./0i37n93i6t]: Hearrdrwaore name: QEMU Standard PC (i440FX + PIIX, 1996), BIOS 0.0.0 02/06/2015 +[ 8.037936] Call Trace: +[ 8.037936] +[ 8.037936] dump_stack_lvl+0x44/0x5c +[ 8.037936] panic+0xfb/0x2b1 +[ 8.037936] do_exit.cold+0x15/0x15 +[ 8.037936] do_group_exit+0x2d/0x90 +[ 8.037936] __x64_sys_exit_group+0x14/0x20 +[ 8.037936] do_syscall_64+0x5b/0x80 +[ 8.037936] ? 
do_syscall_64+0x67/0x80 +[ 8.037936] entry_SYSCALL_64_after_hwframe+0x63/0xcd +[ 8.037936] RIP: 0033:0x7f9b61282911 +[ 8.037936] Code: f7 d8 89 01 48 83 c8 ff c3 be e7 00 00 00 ba 3c 00 00 00 eb 11 0f 1f 40 00 89 d0 0f 05 48 3d 00 f0 ff ff 77 1c f4 89 f0 0f 05 <48> 3d 00 f0 ff ff 76 e7 f7 d8 89 05 7f 29 01 00 eb dd 0f 1f 44 00 +[ 8.037936] RSP: 002b:00007ffd45b6dc78 EFLAGS: 00000246 ORIG_RAX: 00000000000000e7 +[ 8.037936] RAX: ffffffffffffffda RBX: 00007f9b6128caf8 RCX: 00007f9b61282911 +[ 8.037936] RDX: 000000000000003c RSI: 00000000000000e7 RDI: 000000000000007f +[ 8.037936] RBP: 00007f9b6126017f R08: 00007ffd45b6dc88 R09: 000000006128a000 +[ 8.037936] R10: 0000000000000020 R11: 0000000000000246 R12: 0000000000000002 +[ 8.129077] R13: 0000000000000001 R14: 00007f9b612601a0 R15: 0000000000000000 +[ 8.131416] +r while loading shared libraries: libsystemd-shared-250.so: cannot open shared object file: No such file or directory +[ 8.131416] Kernel Offset: 0x5000000 from 0xffffffff81000000 (relocation range: 0xffffffff80000000-0xffffffffbfffffff) +[ 8.131416] ---[ end Kernel panic - not syncing: Attempted to kill init! exitcode=0x00007f00 ]--- + + + + +QEMU: Terminated +``` + +- TODO + - `--fstab /etc/fstab`? + - Not sure if we need this yet but saving here until dracut we get `EXIT_SUCCESS` + - Add custom bootloader image + - slice image from alice unbirthday gif-2-cli gif and convert to bitmap + - References + - https://man7.org/linux/man-pages/man8/dracut.8.html + - > `--uefi-splash-image ` + > - Specifies the UEFI stub loader’s splash image. Requires + > bitmap (.bmp) image format. + +### Alice + +Install Alice! + +## Misc. + +- TODO + - [ ] Updates for fedora packages (aka kernel) will need to be handled. + - We might just re-roll and pull only the layers with kernel stuff? TBD + - [ ] motd? 
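An aside on the trace above: the `exitcode=0x00007f00` in the `Attempted to kill init!` panic is a raw wait status, and decoding it points at the same failure as the interleaved `/sbin/init: error while loading shared libraries: libsystemd-shared-250.so` message. A quick stdlib Python check (a sketch, nothing DFFML-specific):

```python
import os

# Raw wait status printed by the kernel in the panic message.
raw_status = 0x00007F00

# os.waitstatus_to_exitcode (Python 3.9+) decodes a wait(2)-style status.
exit_code = os.waitstatus_to_exitcode(raw_status)
print(exit_code)  # 127

# Manual decode for comparison: a normal exit puts the exit status in the
# high byte of the low 16 bits of the wait status.
assert (raw_status >> 8) & 0xFF == 127
```

Exit code 127 is the conventional "command not found / loader failure" status, consistent with the new root's init being unable to resolve its shared libraries after switch root.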
+- References + - Chainguard + - https://edu.chainguard.dev/chainguard/chainguard-images/how-to-use-chainguard-images/ + - https://edu.chainguard.dev/open-source/melange/getting-started-with-melange/ + - We should use melange and apko and setup a secure factory to build images. + - Images + - https://dnf-plugins-core.readthedocs.io/en/latest/download.html + - https://github.com/srossross/rpmfile + - QEMU + - https://pdxjohnny.github.io/linux-kernel/ + - https://pdxjohnny.github.io/qemu/ + - https://archlinux.org/releng/netboot/ + - https://gist.github.com/pdxjohnny/6063d1893c292d1ac0024fb14d1e627d + - Install Guide + - https://wiki.archlinux.org/title/Installation_guide + - https://archlinux.org/releng/netboot/ + - https://wiki.archlinux.org/title/Installation_guide#Boot_loader + - https://wiki.archlinux.org/title/Installation_guide#Example_layouts + - Bootloader + - https://man.archlinux.org/man/bootctl.1 + - `root@archiso ~ # bootctl --esp-path=/mnt/boot install` + - https://systemd.io/AUTOMATIC_BOOT_ASSESSMENT/ + - Type #2 EFI Unified Kernel Images + - https://systemd.io/BOOT_LOADER_SPECIFICATION/ + - https://wiki.archlinux.org/title/Installation_guide#Boot_loader + - https://github.com/nwildner/dracut-uefi-simple + - sysadmin + - https://github.com/aurae-runtime/auraed/tree/main/hack + - https://github.com/aurae-runtime/auraed/blob/main/hack/initramfs/mk-initramfs + - https://gist.github.com/pdxjohnny/a0dc3a58b4651dc3761bee65a198a80d#file-run-vm-sh-L125-L141 + - ssi-service + - https://github.com/TBD54566975/ssi-service/pull/111 + - https://edu.chainguard.dev/open-source/melange/getting-started-with-melange/ + - For packaging + - python + - https://github.com/pypa/get-pip + - TPM + - https://systemd.network/linuxx64.efi.stub.html#TPM2%20PCR%20Notes + - Secure Boot + - https://fedoraproject.org/wiki/Secureboot + - https://github.com/rhboot/pesign + - https://github.com/rhboot/shim \ No newline at end of file diff --git 
a/docs/discussions/alice_engineering_comms/0036/index.md b/docs/discussions/alice_engineering_comms/0036/index.md new file mode 100644 index 0000000000..92f189eb06 --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0036/index.md @@ -0,0 +1,7 @@ +# 2022-09-24 Engineering Log + +- TODO + - [ ] @yukster to investigate creation of meetup + - Possible action items for meetup group + - Get folks together to talk about lasting solutions to technical debt (rather than revolving door reimplementation) + - Increasing awareness of technical debt incurred due to various business and architectural decisions. \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0036/reply_0000.md b/docs/discussions/alice_engineering_comms/0036/reply_0000.md new file mode 100644 index 0000000000..5aa730d1ff --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0036/reply_0000.md @@ -0,0 +1,92 @@ +## 2022-09-24 @pdxjohnny Engineering Log + +- There are an infinite number of realities. We experience a subset in series when within the biological form. Time, time is the critical differentiator between this state of consciousness and others. The other states happen all at once, all the time. For whatever reason, if you find yourself in this reality, this one we call life. Know that you’ll only be here for a time. You may come back, but fundamentally, this life is your time. 
+- [Architecting Alice: OS DecentrAlice](https://github.com/intel/dffml/discussions/1406#discussioncomment-3720703) + +--- + + +```bash +ps +ps faux +ll +find +find /usr/ +apk search linux +apk search kernel +apk search systemd +apk search system +apk search go +apk add go +go install github.com/magefile/mage +go install github.com/magefile/mage@v1.14.0 +apk add git +git clone https://github.com/TBD54566975/ssi-service +cd ssi-service/ +mage build +pwd +go install github.com/magefile/mage +mage build +env +go install -h github.com/magefile/mage +go install -v github.com/magefile/mage +go install -vvvv github.com/magefile/mage +go install --debug github.com/magefile/mage +go install -debug github.com/magefile/mage +go install --help +go help install +ll ~/go/bin/ +ls -lAF ~/go/bin/ +export PATH=$PATH:$HOME/go/bin +alias herstory=history +herstory -a +cat ~/.bash_history +mage build +find . +file $(find .) +apk add file +file $(find .) +file $(find .) | grep bin +file $(find .) | grep -i bin +file $(find .) | grep -i exe +file $(find .) | grep -i EFI +file $(find .) +ll +ls -lAF +cat magefile.go +ls -lAF +ls build/ +ls -lAF cmd/ +file $(find .) | grep -v ssi +file $(find .) | grep ssi +git grep go\ bulid +git grep bulid +git grep bulid +l +pwd +grep -rn build . 
+cat build/Dockerfile +go build -tags jwx_es256k -o /docker-ssi-service ./cmd +herstory -a +ll +ls -lAF +ls -lAF cmd/ +/docker-ssi-service +go build -tags jwx_es256k netgo -o /docker-ssi-service ./cmd +go build -tags jwx_es256k -tags netgo -o /docker-ssi-service ./cmd +file /docker-ssi-service +lld /docker-ssi-service +apk add lld +apk add build-essential +apk add gcc +apk add binutils +apk add coreutils +lld /docker-ssi-service +pwd +cd +rm -rf ssi-service +curl -sfL https://github.com/TBD54566975/ssi-service/archive/refs/heads/main.tar.gz | tar xvz +apk add curl +curl -sfL https://github.com/TBD54566975/ssi-service/archive/refs/heads/main.tar.gz | tar xvz +herstory -a +``` \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0037/index.md b/docs/discussions/alice_engineering_comms/0037/index.md new file mode 100644 index 0000000000..9b6711d512 --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0037/index.md @@ -0,0 +1 @@ +# 2022-09-25 Engineering Log \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0037/reply_0000.md b/docs/discussions/alice_engineering_comms/0037/reply_0000.md new file mode 100644 index 0000000000..d41f4aac1d --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0037/reply_0000.md @@ -0,0 +1,114 @@ +## 2022-09-25 @pdxjohnny Engineering Log + +- Architecting Alice: COPY Linux Kernel +- [Architecting Alice: OS DecentrAlice](https://github.com/intel/dffml/discussions/1406#discussioncomment-3720703) + +```console +$ cat > fedora.sh <<'EOF' +mount /dev/sda3 /mnt +mount /dev/sda1 /mnt/boot +swapon /dev/sda2 +mkdir -p /mnt/{proc,dev,sys} +mkdir -p /mnt/var/tmp +mkdir -p /mnt/fedora/var/tmp + +cat > /mnt/run-dracut.sh <<'LOL' +export PATH="${PATH}:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/lib/dracut/" +export KERNEL_VERSION="$(ls /lib/modules)" +bash -xp /usr/bin/dracut --uefi --kver ${KERNEL_VERSION} --kernel-cmdline "console=ttyS0 root=/dev/sda3" 
+LOL + +arch-chroot /mnt/fedora /bin/bash run-dracut.sh +EOF +$ bash fedora.sh +... ++ dinfo 'Executing: /usr/bin/dracut --uefi --kver 5.19.10-200.fc36.x86_64 --kernel-cmdline console=ttyS0' ++ set +x +bash-5.1# echo $? +0 +bash-5.1# lsblk +NAME MAJ:MIN RM SIZE RO TYPE MOUNTPOINTS +fd0 2:0 1 4K 0 disk +loop0 7:0 0 683.2M 1 loop +sda 8:0 0 20G 0 disk +├─sda1 8:1 0 260M 0 part +├─sda2 8:2 0 10G 0 part [SWAP] +└─sda3 8:3 0 9.8G 0 part +sr0 11:0 1 1024M 0 rom +bash-5.1# find /boot/ +/boot/ +/boot/System.map-5.19.10-200.fc36.x86_64 +/boot/.vmlinuz-5.19.10-200.fc36.x86_64.hmac +/boot/vmlinuz-5.19.10-200.fc36.x86_64 +/boot/symvers-5.19.10-200.fc36.x86_64.gz +/boot/efi +/boot/efi/EFI +/boot/efi/EFI/fedora +/boot/efi/EFI/Linux +/boot/efi/EFI/Linux/linux-5.19.10-200.fc36.x86_64-d1a1c3d381b9405ab46417e3535ef1be.efi +/boot/grub2 +/boot/initramfs-5.19.10-200.fc36.x86_64.img +/boot/loader +/boot/loader/entries +/boot/loader/entries/d1a1c3d381b9405ab46417e3535ef1be-5.19.10-200.fc36.x86_64.conf +/boot/config-5.19.10-200.fc36.x86_64 +bash-5.1# +exit +[root@archiso ~]# bash fedora.shc +[root@archiso ~]# ll /mnt/boot/ +bash: ll: command not found +[root@archiso ~]# find !$ +find /mnt/boot/ +/mnt/boot/ +/mnt/boot/NvVars +[root@archiso ~]# bootctl --esp-path=/mnt/boot install +Created "/mnt/boot/EFI". +Created "/mnt/boot/EFI/systemd". +Created "/mnt/boot/EFI/BOOT". +Created "/mnt/boot/loader". +Created "/mnt/boot/loader/entries". +Created "/mnt/boot/EFI/Linux". +Copied "/usr/lib/systemd/boot/efi/systemd-bootx64.efi" to "/mnt/boot/EFI/systemd/systemd-bootx64.efi". +Copied "/usr/lib/systemd/boot/efi/systemd-bootx64.efi" to "/mnt/boot/EFI/BOOT/BOOTX64.EFI". +Random seed file /mnt/boot/loader/random-seed successfully written (32 bytes). +Not installing system token, since we are running in a virtualized environment. +Created EFI boot entry "Linux Boot Manager". 
+[root@archiso ~]# find /mnt/boot/ +/mnt/boot/ +/mnt/boot/NvVars +/mnt/boot/EFI +/mnt/boot/EFI/systemd +/mnt/boot/EFI/systemd/systemd-bootx64.efi +/mnt/boot/EFI/BOOT +/mnt/boot/EFI/BOOT/BOOTX64.EFI +/mnt/boot/EFI/Linux +/mnt/boot/loader +/mnt/boot/loader/entries +/mnt/boot/loader/loader.conf +/mnt/boot/loader/random-seed +/mnt/boot/loader/entries.srel +[root@archiso ~]# for file in $(find /mnt/fedora/boot/); do cp -v $file $(echo $file | sed -e 's/fedora//' -e 's/efi\/EFI/EFI/'); done +[root@archiso ~]# diff -y <(find /mnt/boot | sort) <(find /mnt/fedora/boot | sed -e 's/fedora\///' -e 's/efi\/EFI/EFI/' | sort) +/mnt/boot /mnt/boot +/mnt/boot/.vmlinuz-5.19.10-200.fc36.x86_64.hmac /mnt/boot/.vmlinuz-5.19.10-200.fc36.x86_64.hmac +/mnt/boot/EFI /mnt/boot/EFI +/mnt/boot/EFI/BOOT < +/mnt/boot/EFI/BOOT/BOOTX64.EFI < +/mnt/boot/EFI/Linux /mnt/boot/EFI/Linux +/mnt/boot/EFI/Linux/linux-5.19.10-200.fc36.x86_64-d1a1c3d381b /mnt/boot/EFI/Linux/linux-5.19.10-200.fc36.x86_64-d1a1c3d381b +/mnt/boot/EFI/systemd | /mnt/boot/EFI/fedora +/mnt/boot/EFI/systemd/systemd-bootx64.efi < +/mnt/boot/NvVars < +/mnt/boot/System.map-5.19.10-200.fc36.x86_64 /mnt/boot/System.map-5.19.10-200.fc36.x86_64 +/mnt/boot/config-5.19.10-200.fc36.x86_64 /mnt/boot/config-5.19.10-200.fc36.x86_64 + > /mnt/boot/efi + > /mnt/boot/grub2 +/mnt/boot/initramfs-5.19.10-200.fc36.x86_64.img /mnt/boot/initramfs-5.19.10-200.fc36.x86_64.img +/mnt/boot/loader /mnt/boot/loader +/mnt/boot/loader/entries /mnt/boot/loader/entries +/mnt/boot/loader/entries.srel < +/mnt/boot/loader/entries/d1a1c3d381b9405ab46417e3535ef1be-5.1 /mnt/boot/loader/entries/d1a1c3d381b9405ab46417e3535ef1be-5.1 +/mnt/boot/loader/loader.conf | /mnt/boot/symvers-5.19.10-200.fc36.x86_64.gz +/mnt/boot/loader/random-seed < +/mnt/boot/vmlinuz-5.19.10-200.fc36.x86_64 /mnt/boot/vmlinuz-5.19.10-200.fc36.x86_64 +``` \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0038/index.md 
b/docs/discussions/alice_engineering_comms/0038/index.md new file mode 100644 index 0000000000..74d1ed0c7a --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0038/index.md @@ -0,0 +1 @@ +# 2022-09-26 Engineering Logs \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0038/reply_0000.md b/docs/discussions/alice_engineering_comms/0038/reply_0000.md new file mode 100644 index 0000000000..ece6a155fc --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0038/reply_0000.md @@ -0,0 +1,226 @@ +## 2022-09-26 @pdxjohnny Engineering Logs + +- Alice + - State of the art updated to 98335d941116e76bbf4e07422adc2b5061e47934 + - Overlay of CI/CD library detection example: https://github.com/intel/dffml/commit/90d5c52f4dd64f046a2e2469d001e32ec2d53966 + +Install Alice: https://github.com/intel/dffml/tree/alice/entities/alice/ + +```console +$ python -m venv .venv +$ . .venv/bin/activate +$ python -m pip install -U pip setuptools wheel +$ export ALICE_STATE_OF_ART=98335d941116e76bbf4e07422adc2b5061e47934 +$ python -m pip install \ + "https://github.com/intel/dffml/archive/${ALICE_STATE_OF_ART}.zip#egg=dffml" \ + "https://github.com/intel/dffml/archive/${ALICE_STATE_OF_ART}.zip#egg=dffml-feature-git&subdirectory=feature/git" \ + "https://github.com/intel/dffml/archive/${ALICE_STATE_OF_ART}.zip#egg=shouldi&subdirectory=examples/shouldi" \ + "https://github.com/intel/dffml/archive/${ALICE_STATE_OF_ART}.zip#egg=dffml-config-yaml&subdirectory=configloader/yaml" \ + 
"https://github.com/intel/dffml/archive/${ALICE_STATE_OF_ART}.zip#egg=dffml-operations-innersource&subdirectory=operations/innersource" \ + "https://github.com/intel/dffml/archive/${ALICE_STATE_OF_ART}.zip#egg=alice&subdirectory=entities/alice" +``` + +Install this overlay (from this commit in this example): + +```console +$ python -m pip install --force-reinstall --upgrade "git+https://github.com/intel/dffml@d2a38d47445241fc99d26bc2a51184caa88bd033#subdirectory=entities/alice" +``` + +Collect metrics on a repo using `alice shouldi contribute`: + +```console +$ alice -log debug shouldi contribute -keys https://github.com/pdxjohnny/httptest 2>&1 | tee .alice.shouldi.contribute.log.$(date "+%4Y-%m-%d-%H-%M").txt +$ alice -log debug shouldi contribute -record-def GitHubRepoID -keys 149512216 2>&1 | tee .alice.shouldi.contribute.log.$(date "+%4Y-%m-%d-%H-%M").txt +$ python -c 'import yaml, json, sys; print(yaml.dump(json.load(sys.stdin)))' < .tools/open-architecture/innersource/repos.json +untagged: + https://github.com/aliceoa/example-github-action: + features: + alice.shouldi.contribute.cicd:cicd_action_library: + result: true + group_by: + ActionYAMLFileWorkflowUnixStylePath: + - my_action_name/action.yml +``` + +- Generating JSON schema + - https://pydantic-docs.helpmanual.io/usage/schema/ + - https://pydantic-docs.helpmanual.io/install/ + - https://pydantic-docs.helpmanual.io/usage/model_config/ + - https://pydantic-docs.helpmanual.io/usage/schema/#schema-customization + - Initial commit: 168a3e26c62d7e0c8dd92b1761ec5fad273fb9c6 + - Added `$schema` to make output schema a valid Manifest schema per ADR requirements + - https://github.com/intel/dffml/blob/alice/docs/arch/0008-Manifest.md +- KERI + - https://keri.one + - 
https://humancolossus.foundation/blog/thinking-of-did-keri-on/keri-resources/ +- References + - https://open-music.org/ + - https://github.com/fzipp/gocyclo + - > Calculate cyclomatic complexities of functions in Go source code. + +```console +$ curl -sfL https://github.com/intel/dffml/ | grep octolytics-dimension-repository_id + +coder@coder-john-s-andersen-alice:/src/dffml$ curl -sfL https://github.com/intel/dffml/ | grep octolytics-dimension-repository_id | sed -e 's/octolytics-dimension-repository_id" content="//' + +coder@coder-john-s-andersen-alice:/src/dffml$ curl -sfL https://github.com/intel/dffml/ | grep octolytics-dimension-repository_id | sed -e 's/.*octolytics-dimension-repository_id" content="//' +149512216" /> +coder@coder-john-s-andersen-alice:/src/dffml$ curl -sfL https://github.com/intel/dffml/ | grep octolytics-dimension-repository_id | sed -e 's/.*octolytics-dimension-repository_id" content="//' -e 's/".*//' +149512216 +coder@coder-john-s-andersen-alice:/src/dffml $ gh api https://api.github.com/repositories/149512216 | jq -r '.clone_url' +https://github.com/intel/dffml.git +``` + +Added GitHubRepoID to URL lookup via https://github.com/intel/dffml/commit/4d64f011ccdee8882adbc4b7447953c4416ceb64 + +Run the metric collection + +```console +coder@coder-john-s-andersen-alice:/src/dffml$ alice -log debug shouldi contribute -record-def GitHubRepoID -keys 149512216 +``` + +Convert to YAML for easy reading + +```console +$ python -c 'import yaml, json, sys; print(yaml.dump(json.load(sys.stdin)))' < .tools/open-architecture/innersource/repos.json +untagged: + https://github.com/trekhleb/javascript-algorithms: + extra: {} + features: + dffml_operations_innersource.operations:badge_maintained: + result: https://img.shields.io/badge/Maintainance-Active-green + dffml_operations_innersource.operations:badge_unmaintained: + result: https://img.shields.io/badge/Maintainance-Inactive-red + group_by: + GitHubActionsWorkflowUnixStylePath: + - 
.github/workflows/CI.yml + author_line_count: + - Oleksii Trekhleb: 370 + bool: + - true + commit_shas: + - d3c0ee6f7af3fce4a3a2bdc1c5be36d7c2d9793a + release_within_period: + - false + key: https://github.com/trekhleb/javascript-algorithms + last_updated: '2022-09-26T15:13:00Z' +``` + +- Accidentally force pushed + - Enabled branch protection on the `alice` branch + - Went to PR and looked for "force pushed" in logs + - Grabbed the commit and found the compare, because we can download the patchset but, as far as we could tell, it won't let us create a branch off it + - https://github.com/intel/dffml/compare/alice...0c4b8191b13465980ced3fd1ddfbea30af3d1104.patch + - Downloaded with curl + - `curl -sfLO https://github.com/intel/dffml/compare/alice...0c4b8191b13465980ced3fd1ddfbea30af3d1104.patch` + - Removed the first patch, which we had rebase squashed other commits into + - `vim alice...0c4b8191b13465980ced3fd1ddfbea30af3d1104.patch` + - Apply patches (there were 15 after removing the collecting Jenkins patch) + - `git am < alice...0c4b8191b13465980ced3fd1ddfbea30af3d1104.patch` + +```yaml + check_if_valid_git_repository_URL: + inputs: + URL: + - dffml_operations_innersource.cli:github_repo_id_to_clone_url: result + - seed + cleanup_git_repo: + inputs: + repo: + - clone_git_repo: repo + clone_git_repo: + conditions: + - check_if_valid_git_repository_URL: valid + inputs: + URL: + - dffml_operations_innersource.cli:github_repo_id_to_clone_url: result + - seed + ssh_key: + - seed + count_authors: + inputs: + author_lines: + - git_repo_author_lines_for_dates: author_lines + dffml_feature_git.feature.operations:git_grep: + inputs: + repo: + - clone_git_repo: repo + search: + - seed + dffml_operations_innersource.cli:ensure_tokei: + inputs: {} + dffml_operations_innersource.cli:github_repo_id_to_clone_url: + inputs: + repo_id: + - seed 
`dataflow.update_by_origin()` by default on orchestrator context entry. +- In progress on auto creation of JSON schema from single object or list of example objects + +```diff +diff --git a/configloader/jsonschema/tests/test_config.py b/configloader/jsonschema/tests/test_config.py +index ea4852862..2a0b9ffa1 100644 +--- a/configloader/jsonschema/tests/test_config.py ++++ b/configloader/jsonschema/tests/test_config.py +@@ -137,4 +137,6 @@ class TestConfig(AsyncTestCase): + async with configloader() as ctx: + original = {"Test": ["dict"]} + reloaded = await ctx.loadb(await ctx.dumpb(original)) ++ from pprint import pprint ++ pprint(reloaded) + self.assertEqual(original, TEST_0_SCHEMA_SHOULD_BE) +``` + +```console +$ python -m unittest discover -v +test_0_dumpb_loadb (tests.test_config.TestConfig) ... {'$schema': 'https://intel.github.io/dffml/manifest-format-name.0.0.2.schema.json', + 'definitions': {'FooBar': {'properties': {'count': {'title': 'Count', + 'type': 'integer'}, + 'size': {'title': 'Size', + 'type': 'number'}}, + 'required': ['count'], + 'title': 'FooBar', + 'type': 'object'}, + 'Gender': {'description': 'An enumeration.', + 'enum': ['male', 'female', 'other', 'not_given'], + 'title': 'Gender', + 'type': 'string'}}, + 'description': 'This is the description of the main model', + 'properties': {'Gender': {'$ref': '#/definitions/Gender'}, + 'foo_bar': {'$ref': '#/definitions/FooBar'}, + 'snap': {'default': 42, + 'description': 'this is the value of snap', + 'exclusiveMaximum': 50, + 'exclusiveMinimum': 30, + 'title': 'The Snap', + 'type': 'integer'}}, + 'required': ['foo_bar'], + 'title': 'Main', + 'type': 'object'} +FAIL + +====================================================================== +FAIL: test_0_dumpb_loadb (tests.test_config.TestConfig) +---------------------------------------------------------------------- +Traceback (most recent call last): + File "/src/dffml/dffml/util/asynctestcase.py", line 115, in run_it + result = 
self.loop.run_until_complete(coro(*args, **kwargs)) + File "/.pyenv/versions/3.9.13/lib/python3.9/asyncio/base_events.py", line 647, in run_until_complete + return future.result() + File "/src/dffml/configloader/jsonschema/tests/test_config.py", line 142, in test_0_dumpb_loadb + self.assertEqual(original, TEST_0_SCHEMA_SHOULD_BE) +AssertionError: {'Test': ['dict']} != {'title': 'Main', 'description': 'This is t[665 chars]g'}}} +Diff is 1276 characters long. Set self.maxDiff to None to see it. + +---------------------------------------------------------------------- +Ran 1 test in 0.005s + +FAILED (failures=1) +``` + +- TODO + - [ ] Add option for output configloader similar to `-log` for all CLI commands. + - [ ] Enables serialization of returned objects from `CMD.run()` methods into arbitrary formats. + - [ ] `JSONSchemaConfigLoaderConfig.multi: bool` could allow us to interpret the input as a set of inputs, all of which the generated schema should conform to. \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0038/reply_0001.md b/docs/discussions/alice_engineering_comms/0038/reply_0001.md new file mode 100644 index 0000000000..68fefde696 --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0038/reply_0001.md @@ -0,0 +1,15 @@ +# How Does W3C Work? 
+ +- W3C groups are chartered for a set amount of time +- https://w3c.github.io/did-use-cases/ + - WG will be focusing on interoperability + - Ensure DID methods interoperate + - Will try to define what interoperability means + - Would be nice to have a schema for a DID + - Example: PKI Cert DID + - Structure around how an application would go about solving an authentication or authorization challenge + - Could be made to work with zero knowledge proofs or other arbitrary methods + - Point is largely to ensure you don't have to use centralized PKI +- Vol 3: Politics + - Sometimes folks object to continuing a WG charter on political or philosophical grounds + - WG members sometimes view opponents' concerns about a charter as attempts to preserve currently advantageous positions held due to a lack of standards. \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0039/index.md b/docs/discussions/alice_engineering_comms/0039/index.md new file mode 100644 index 0000000000..bcd80bc827 --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0039/index.md @@ -0,0 +1,10 @@ +# 2022-09-27 Engineering Logs + +- SPDX 2.3 + - https://www.chainguard.dev/unchained/whats-new-in-spdx-2-3 +- DX + - https://kenneth.io/post/developer-experience-infrastructure-dxi +- IPVM + - https://github.com/ipvm-wg/spec/discussions/3 + - https://github.com/ipvm-wg/spec/discussions/7 + - https://fission.codes/blog/ipfs-thing-breaking-down-ipvm/ \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0039/reply_0000.md b/docs/discussions/alice_engineering_comms/0039/reply_0000.md new file mode 100644 index 0000000000..3f0ddc7b5f --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0039/reply_0000.md @@ -0,0 +1,66 @@ +## 2022-09-27 @pdxjohnny Engineering Logs + +- Install plugin with subdirectory from commit from git + - `python -m venv .venv` + - `source .venv/bin/activate` + - `python -m pip install --upgrade setuptools pip wheel`
+ - `python -m pip install --upgrade "git+https://github.com/intel/dffml@17ccb5b76f261d2725a64528e25669ef97920d70#subdirectory=entities/alice"` + - pypi proxy is how we enable manifest BOM component swap out for downstream validation within 2nd party CI setup (workaround for dependency links issue) + - References + - https://github.com/intel/dffml/pull/1207 + - https://github.com/intel/dffml/pull/1061 + - https://github.com/intel/dffml/discussions/1406#discussioncomment-3676224 + +``` +$ dffml version +dffml 0.4.0 /src/dffml/dffml 5c89b6780 (dirty git repo) +dffml-config-yaml 0.1.0 /src/dffml/configloader/yaml/dffml_config_yaml 5c89b6780 (dirty git repo) +dffml-config-image not installed +dffml-config-jsonschema 0.0.1 /src/dffml/configloader/jsonschema/dffml_config_jsonschema 5c89b6780 (dirty git repo) +dffml-model-scratch not installed +dffml-model-scikit not installed +dffml-model-tensorflow not installed +dffml-model-tensorflow-hub not installed +dffml-model-vowpalWabbit not installed +dffml-model-xgboost not installed +dffml-model-pytorch not installed +dffml-model-spacy not installed +dffml-model-daal4py not installed +dffml-model-autosklearn not installed +dffml-feature-git 0.3.0 /src/dffml/feature/git/dffml_feature_git 5c89b6780 (dirty git repo) +dffml-feature-auth not installed +dffml-operations-binsec not installed +dffml-operations-data not installed +dffml-operations-deploy not installed +dffml-operations-image not installed +dffml-operations-nlp not installed +dffml-operations-innersource 0.0.1 /src/dffml/operations/innersource/dffml_operations_innersource 5c89b6780 (dirty git repo) +dffml-service-http not installed +dffml-source-mysql not installed +``` + +- Encourage and coordinate collaborative documentation of strategy and implementation as living documentation to help community communicate amongst itself and facilitate sync with potential users / other communities / aligned workstreams. 
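Relatedly, the pinned-commit install form above (`git+https://github.com/intel/dffml@` commit `#subdirectory=` path) is part of what makes the proxy-based manifest BOM component swap out mechanical: downstream validation only has to rewrite the commit ref in the requirement string. A minimal sketch of that rewrite (the `swap_pinned_commit` helper and the replacement ref are hypothetical illustrations, not a DFFML or pip API):

```python
import re

def swap_pinned_commit(requirement: str, new_commit: str) -> str:
    """Replace the pinned commit in a pip VCS requirement string.

    This is the kind of rewrite a proxy could apply to swap a component
    out for a downstream-validated build of the same plugin.
    """
    # Match the @<hex sha> segment that precedes the #fragment (or end).
    return re.sub(r"@[0-9a-f]{7,40}(?=#|$)", f"@{new_commit}", requirement)

req = (
    "git+https://github.com/intel/dffml"
    "@17ccb5b76f261d2725a64528e25669ef97920d70#subdirectory=entities/alice"
)
print(swap_pinned_commit(req, "deadbeefcafe1234"))
```

A real proxy would also need to serve the rewritten artifact itself; this only illustrates the requirement-string rewrite.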
+- SCITT + - https://github.com/pdxjohnny/use-cases/blob/openssf_metrics/openssf_metrics.md + - https://github.com/ietf-scitt/use-cases/pull/18 +- Stream of Consciousness + - Decentralized Web Node and Self-Sovereign Identity Service + - https://github.com/TBD54566975/ssi-service/tree/main/sip/sips/sip4 + - https://forums.tbd.website/t/sip-4-discussion-dwn-message-processing/137 + - https://github.com/TBD54566975/ssi-service/pull/113 + - Gabe approved 17 minutes ago + - Chaos smiles on us again + - https://github.com/TBD54566975/ssi-service/blob/3869b8ef2808210201ae6c43e2e0956a85950fc6/pkg/dwn/dwn_test.go#L22-L58 + - https://identity.foundation/credential-manifest/ + - > For User Agents (e.g. wallets) and other service that wish to engage with Issuers to acquire credentials, there must exist a mechanism for assessing what inputs are required from a Subject to process a request for credential(s) issuance. The Credential Manifest is a common data format for describing the inputs a Subject must provide to an Issuer for subsequent evaluation and issuance of the credential(s) indicated in the Credential Manifest. + > + > Credential Manifests do not themselves define the contents of the output credential(s), the process the Issuer uses to evaluate the submitted inputs, or the protocol Issuers, Subjects, and their User Agents rely on to negotiate credential issuance. + > + > ![image](https://user-images.githubusercontent.com/5950433/192642680-627f9da6-ebb1-45b6-9872-7202e8b3fcaf.png) + - In our distributed compute setup, credential issuance is the execution (we had been looking at confirming trades of execution via the tbDEX protocol; no work has been done on that front recently from the DFFML side) + - What they refer to as a "Credential Manifest" is similar to what we refer to as a "Manifest Instance". 
+ - https://github.com/intel/dffml/blob/alice/docs/arch/0008-Manifest.md + - `SpecVersion` has all the properties we require of Manifests (see `$schema`) so we can indeed classify a "Credential Manifest" as a Manifest. + - Alignment looking strong! + - > ![image](https://user-images.githubusercontent.com/5950433/192644284-3cf55d65-ca00-4c25-98fa-babf1bfd945d.png) + - https://github.com/TBD54566975/ssi-service/pull/113/files#diff-7926652f7b7153343e273a0a72f87cb0cdf4c3063ec912cdb95dc541a8f2785dR62 \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0039/reply_0001.md b/docs/discussions/alice_engineering_comms/0039/reply_0001.md new file mode 100644 index 0000000000..1dee280c89 --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0039/reply_0001.md @@ -0,0 +1,45 @@ +## 2022-9-27: Day 1: Innovation Day 1 Keynote LIVE WEBCAST + +> Notes from webcast + +![image](https://user-images.githubusercontent.com/5950433/192823017-a3ec1a2d-4cd8-466b-a82b-71a977949943.png) + +![image](https://user-images.githubusercontent.com/5950433/192618679-43ecd987-def5-4799-90f6-9dc8f4d7d877.png) + +- Webcast: https://twitter.com/intel/status/1574492026988642317 +- Pat quotes + - Committed to a strategy of building a more balanced and resilient supply chain for the world + - We are torridly ("full of intense emotion...": https://en.wiktionary.org/wiki/torrid) moving to the future. + - We will continue to be the stewards of Moore's law into the future + - Intel, be my supply chain manager + - Tech for good impact + - Commitment to being open + - Our collective potential as an industry is unleashed when we enable openness, choice, and trust + - Our objective is that developers whether software or hardware you see the future, + - and our job at Intel is to open that future up to you, + - working together on open frameworks that you can trust. 
+ - I'm excited we have the opportunity to come together to learn, grow, build, challenge and help each other, + - and together we've taken a peek into the future, one that we will create together. +- https://cloud.intel.com + - Developer cloud +- Greg: "Software, the soul of the machine" + - Software defined, silicon enhanced + - Vibrant chiplet ecosystem + - UCIe + - Universal Chiplet Interconnect Express +- Champion of open marketplace +- auto optimization of xeon speedup 10x +- https://geti.intel.com/ (end of year) +- text to image demo + - using latent diffusion + - https://twitter.com/pdxjohnny/status/1572438573336662017?s=20&t=6rHO8ShUU0eIffdvcJzLPw + - https://github.com/intel/dffml/blob/alice/docs/tutorials/rolling_alice/0001_coach_alice/0000_introduction.md + - Looks like we're accelerating +- Alignment + - "Our objective is that developers whether software or hardware you see the future, and our job at Intel is to open that future up to you" [Pat] + - https://github.com/intel/dffml/blob/alice/docs/arch/alice/discussion/0036/reply_0013.md + - "Software, the soul of the machine" [Greg] + - https://github.com/intel/dffml/tree/alice/docs/tutorials/rolling_alice/0000_architecting_alice#entity-analysis-trinity +- TODO + - [x] @pdxjohnny Reach out to Ria Cheruvu to see if she is interested in collaborating on Alice's ethics or other aspects.
+ - https://github.com/intel/dffml/blob/alice/docs/tutorials/rolling_alice/0000_preface.md#volume-5-alices-adventures-in-wonderland \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0040/index.md b/docs/discussions/alice_engineering_comms/0040/index.md new file mode 100644 index 0000000000..c81b87c891 --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0040/index.md @@ -0,0 +1,25 @@ +# 2022-09-28 Engineering Logs + +- Self-Sovereign Identity Service + - https://github.com/TBD54566975/ssi-service/tree/main/sip +- https://lu.ma/ipvm + - Tuesday, October 11, 2022 9:00 AM-10:00 AM + - > ​This call is open to all, but is focused on implementers, following the IETF's rough "consensus and running code" ethos. + > + > ​The IPVM is an effort to add content-addressed computation to IPFS. The requires specifying calling convention, distributed scheduling, session receipts, mobile computing, and auto-upgradable IPFS internals. + > + > - ​Links + > - ​[Community Calls](https://github.com/ipvm-wg/spec/discussions/categories/community-call) + > - ​[GitHub Org](https://github.com/ipvm-wg) + > - ​[Discord Channel](https://discord.gg/eudkhw9NQJ) + > - ​[IPFS þing '22 Slides](https://noti.st/expede/oq0ULd/ipvm-interplanetary-vm) + > + > > ​Wasm modules, their arguments, intermediate states, their outputs, and managed effects can be described as IPLD graphs. IPVM is a strategy to support generalized deterministic computation in a serverless style on top of IPFS with optional side-channel matchmaking on Filecoin, and extend the same benefits of shared data blocks to computation. +- GitHub Actions for downstream validation of 2nd party plugins. + - Issue: Need container images running for some (`dffml-source-mysql` integration tests). + - Use DERP to join running actions jobs. + - Use privilege separation of two user accounts. 
+ - Credit to Matt for this idea, which came up while trying to make API token permission delegation more granular than what is currently supported, same role based copy user scheme. + - Everything is terraform templates (coder, k8s), dockerfiles and actions workflows (coder setup-ssh and then do port forwarding, now you can spin up anything). + - Those can all be described as dataflows and synthesized to + - https://github.com/intel/dffml/blob/alice/docs/tutorials/rolling_alice/0000_forward.md#supply-chain-security \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0040/reply_0000.md b/docs/discussions/alice_engineering_comms/0040/reply_0000.md new file mode 100644 index 0000000000..c05d070e5c --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0040/reply_0000.md @@ -0,0 +1,65 @@ +## 2022-09-28 @pdxjohnny Engineering Logs + +- Terraform + - https://registry.terraform.io/providers/hashicorp/http/latest/docs/data-sources/http + - https://registry.terraform.io/providers/hashicorp/kubernetes/latest/docs +- VSCode + - https://sourcegraph.com/search?q=repo:%5Egithub%5C.com/microsoft/.*+remotePlatform&patternType=standard + - Goal: DERP remote connect to ssh coder + - Tabled this for later + - https://github.com/coder/coder/search?q=derp + - They added support for a config option! + - https://github.com/coder/coder/pull/4030 + - https://github.com/coder/coder/blob/7e54413d3b39d8da8cd404190739a7de35f467de/docs/networking.md + - Tailscale added official docs on running DERP servers!
+ - https://tailscale.com/kb/1118/custom-derp-servers/#why-run-your-own-derp-server + - https://github.com/coder/coder/blob/7e54413d3b39d8da8cd404190739a7de35f467de/docs/networking/port-forwarding.md + - https://github.dev/intel/dffml + - https://github.com/microsoft/vscode/blob/236adc221bb31701db4c2a36ffed544653b26311/src/vs/workbench/contrib/welcomeGettingStarted/browser/gettingStarted.contribution.ts#L253-L285 + - https://github.com/microsoft/vscode-docs/blob/b0cc336a950effd3d5c012900a6ec1ba613fc8fb/docs/remote/troubleshooting.md + - https://sourcegraph.com/search?q=context:global+repo:%5Egithub%5C.com/microsoft/.*+showLoginTerminal&patternType=standard + - https://github.com/microsoft/vscode-cpptools/blob/ebb24763bd1143d9177a5fa6a7b70ade8c9f05ab/Extension/src/SSH/sshCommandRunner.ts + - Seems like a vendored version of what we are looking for + - https://github.com/microsoft/vscode/blob/0c22a33a9d670a84309447b36abdbd8c04ee6219/src/vs/workbench/services/remote/common/remoteAgentService.ts#L20 + - https://github.com/microsoft/vscode/blob/b7d5b65a13299083e92bca91be8fa1289e95d5c1/src/vs/workbench/services/remote/browser/remoteAgentService.ts#L22 + - https://github.com/microsoft/vscode/blob/b7d5b65a13299083e92bca91be8fa1289e95d5c1/src/vs/platform/remote/browser/browserSocketFactory.ts#L268 +- GitHub Actions for downstream validation of 2nd party plugins. 
+ - https://docs.github.com/en/actions/using-jobs/running-jobs-in-a-container + - https://docs.github.com/en/actions/using-containerized-services/about-service-containers + - docs: tutorials: rolling alice: forward: security: supply chain: Mention tie to distributed compute + - https://github.com/intel/dffml/commit/e9af134d07f104e6db89ac872a8c2249198261da + - https://twitter.com/pdxjohnny/status/1575152364440657920 + - https://twitter.com/pdxjohnny/status/1574974594863472640 +- Open Architecture + - Threat Modeling + - [FIRST](https://www.first.org/cvss/v2/team) + - [Open SSF](https://openssf.org/) + - https://openssf.org/oss-security-mobilization-plan/ + - Integration points + - https://github.com/ossf/scorecard + - https://github.com/ossf/criticality_score + - https://github.com/ossf/osv-schema + - Manual ask first, do you do threat modeling? + - Eventually attestations / assertions + - Get involved with risk assessment work happening in OpenSSF. + - Lots happening in ID security threats, stay engaged there. + - Risk assessment work might land here.
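The Scorecard integration point above could be exercised as a small metric collector. A hedged sketch: the `scorecard --repo=... --format=json` invocation and the `checks` JSON shape are assumptions based on OpenSSF Scorecard's documented CLI, and `summarize_checks` / `run_scorecard` are hypothetical helper names, not DFFML or Scorecard APIs.

```python
import json
import subprocess
from typing import Dict


def summarize_checks(scorecard_json: Dict) -> Dict[str, float]:
    """Map check name -> score from Scorecard-style JSON output."""
    return {
        check["name"]: check["score"]
        for check in scorecard_json.get("checks", [])
    }


def run_scorecard(repo: str) -> Dict:
    """Shell out to the scorecard CLI (assumed to be on PATH) and parse its JSON."""
    proc = subprocess.run(
        ["scorecard", f"--repo={repo}", "--format=json"],
        check=True,
        capture_output=True,
        text=True,
    )
    return json.loads(proc.stdout)


if __name__ == "__main__":
    # Canned result so this runs without the CLI installed
    canned = {"checks": [{"name": "Signed-Releases", "score": 7}]}
    print(summarize_checks(canned))
```

The pure `summarize_checks` half is the piece an overlay could reuse; the subprocess call is only one way to source the data (a cached attestation from SCITT would be another).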
+- Upstream communities which may be good places to show up and participate + - OpenSSF Identifying Security Threats (still) +- Similar + - https://github.com/ossf/fuzz-introspector/blob/main/doc/Architecture.md + - https://github.com/chaoss/wg-risk + - https://github.com/chaoss/wg-risk/blob/main/focus-areas/dependency-risk-assessment/upstream-code-dependencies.md +- CHAOSS Augur + - https://github.com/chaoss/augur/blob/main/docker-compose.yml + - https://github.com/chaoss/augur/blob/main/scripts/docker/docker-setup-database.sh + - https://github.com/chaoss/augur/pkgs/container/augur_backend + - https://oss-augur.readthedocs.io/en/main/getting-started/installation.html + - https://oss-augur.readthedocs.io/en/main/development-guide/workers/creating_a_new_worker.html + +![initial-sketch-of-abstract-compute-architecture](https://user-images.githubusercontent.com/5950433/196192835-3a6ddb72-6a52-4043-bb6c-348382f2fcac.jpeg) + +- TODO + - [ ] `CITATIONS.cff` demo + - https://github.com/intel/dffml/discussions/1406#discussioncomment-3510908 + - https://securitytxt.org/ RFC 9116 \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0040/reply_0001.md b/docs/discussions/alice_engineering_comms/0040/reply_0001.md new file mode 100644 index 0000000000..406fc890ea --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0040/reply_0001.md @@ -0,0 +1,3 @@ +## Quotes + +- “I thrive in Chaos. 
It's beyond Chaos” [Alice] \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0041/index.md b/docs/discussions/alice_engineering_comms/0041/index.md new file mode 100644 index 0000000000..b84ec33b80 --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0041/index.md @@ -0,0 +1 @@ +# 2022-09-29 Engineering Logs \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0041/reply_0000.md b/docs/discussions/alice_engineering_comms/0041/reply_0000.md new file mode 100644 index 0000000000..74e13ef222 --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0041/reply_0000.md @@ -0,0 +1,237 @@ +## 2022-09-29 @pdxjohnny Engineering Logs + +- SPIFFE + - https://github.com/spiffe/spire/issues/1003 +- rekor + - https://github.com/sigstore/rekor/blob/main/docker-compose.yml +- Open Policy Agent + - https://github.com/transmute-industries/did-eqt/blob/main/docs/did-eqt-opa-primer.md +- Great org README + - https://github.com/transmute-industries +- Verifiable Data TypeScript Library + - https://github.com/transmute-industries/verifiable-data +- Sidetree + - https://identity.foundation/sidetree/spec/ + - > ![sidetree-arch](https://identity.foundation/sidetree/spec/diagrams/sidetree-system.svg) + > + > #### [DID State Patches](https://identity.foundation/sidetree/spec/#did-state-patches) + > Sidetree defines a delta-based [Conflict-Free Replicated Data Type](https://en.wikipedia.org/wiki/Conflict-free_replicated_data_type) system, wherein the metadata in a Sidetree-based implementation is controlled by the cryptographic PKI material of individual entities in the system, represented by DIDs. While the most common form of state associated with the DIDs in a Sidetree-based implementation is a [DID Document](https://w3c.github.io/did-core/), Sidetree can be used to maintain any type of DID-associated state.
+ > + > Sidetree specifies a general format for patching the state associated with a DID, called Patch Actions, which define how to deterministically mutate a DID’s associated state. Sidetree further specifies a standard set of Patch Actions (below) implementers MAY use to facilitate DID state patching within their implementations. Support of the standard set of Patch Actions defined herein IS NOT required, but implementers MUST use the Patch Action format for defining patch mechanisms within their implementation. The general Patch Action format is defined as follows: + > - `add-public-keys` + > - `remove-public-keys` + > - `add-services` + > - `remove-services` + > - `ietf-json-patch` + > + > #### [Proof of Fee](https://identity.foundation/sidetree/spec/#proof-of-fee) + > + > [NOTE](https://identity.foundation/sidetree/spec/#note-6) This section is non-normative + > + > Sidetree implementers MAY choose to implement protective mechanisms designed to strengthen a Sidetree network against low-cost spurious operations. These mechanisms are primarily designed for open, permissionless implementations utilizing public blockchains that feature native crypto-economic systems. +- GitHub Actions + - https://docs.github.com/en/actions/using-jobs/running-jobs-in-a-container + - https://docs.github.com/en/actions/using-containerized-services/about-service-containers + - https://github.com/jenkinsci/custom-war-packager/issues/173 +- Misc.
diffs lying around + +```diff +diff --git a/dffml/df/base.py b/dffml/df/base.py +index 4f84c1c7c..1303e41c4 100644 +--- a/dffml/df/base.py ++++ b/dffml/df/base.py +@@ -15,11 +15,12 @@ from typing import ( + Union, + Optional, + Set, ++ ContextManager, + ) + from dataclasses import dataclass, is_dataclass, replace + from contextlib import asynccontextmanager + +-from .exceptions import NotOpImp ++from .exceptions import NotOpImp, RetryOperationException + from .types import ( + Operation, + Input, +@@ -94,6 +95,7 @@ class OperationImplementationContext(BaseDataFlowObjectContext): + self.parent = parent + self.ctx = ctx + self.octx = octx ++ self.op_retries = None + + @property + def config(self): +@@ -102,6 +104,31 @@ class OperationImplementationContext(BaseDataFlowObjectContext): + """ + return self.parent.config + ++ ++ @asynccontextmanager ++ async def raiseretry(self, retries: int) -> ContextManager[None]: ++ """ ++ Use this context manager to have the orchestrator call the operation's ++ ``run()`` method multiple times within the same ++ OperationImplementationContext entry. ++ ++ Useful for ++ ++ TODO ++ ++ - Backoff ++ ++ >>> def myop(self): ++ ... with self.raiseretry(5): ++ ... if self.op_current_retry < 4: ++ ... 
raise Exception() ++ """ ++ try: ++ yield ++ except Exception as error: ++ raise RetryOperationException(retries) from error ++ ++ + @abc.abstractmethod + async def run(self, inputs: Dict[str, Any]) -> Union[bool, Dict[str, Any]]: + """ +diff --git a/dffml/df/exceptions.py b/dffml/df/exceptions.py +index 3ec596d6c..06606a3f8 100644 +--- a/dffml/df/exceptions.py ++++ b/dffml/df/exceptions.py +@@ -32,3 +32,8 @@ class ValidatorMissing(Exception): + + class MultipleAncestorsFoundError(NotImplementedError): + pass ++ ++ ++class RetryOperationException(Exception): ++ def __init__(self, retries: int) -> None: ++ self.retries = retries +diff --git a/dffml/df/memory.py b/dffml/df/memory.py +index f6f15f5a0..740fc7614 100644 +--- a/dffml/df/memory.py ++++ b/dffml/df/memory.py +@@ -27,6 +27,7 @@ from .exceptions import ( + ValidatorMissing, + MultipleAncestorsFoundError, + NoInputsWithDefinitionInContext, ++ RetryOperationException, + ) + from .types import ( + Input, +@@ -39,6 +40,7 @@ from .types import ( + from .base import ( + OperationException, + OperationImplementation, ++ OperationImplementationContext, + FailedToLoadOperationImplementation, + BaseDataFlowObject, + BaseDataFlowObjectContext, +@@ -1190,6 +1192,7 @@ class MemoryOperationImplementationNetworkContext( + ctx: BaseInputSetContext, + octx: BaseOrchestratorContext, + operation: Operation, ++ opctx: OperationImplementationContext, + inputs: Dict[str, Any], + ) -> Union[bool, Dict[str, Any]]: + """ +@@ -1198,9 +1201,7 @@ class MemoryOperationImplementationNetworkContext( + # Check that our network contains the operation + await self.ensure_contains(operation) + # Create an opimp context and run the operation +- async with self.operations[operation.instance_name]( +- ctx, octx +- ) as opctx: ++ with contextlib.nullcontext(): + self.logger.debug("---") + self.logger.debug( + "%s Stage: %s: %s", +@@ -1251,22 +1252,28 @@ class MemoryOperationImplementationNetworkContext( + """ + Run an operation in our network.
+ """ +- if not operation.retry: +- return await self.run_no_retry(ctx, octx, operation, inputs) +- for retry in range(0, operation.retry): +- try: +- return await self.run_no_retry(ctx, octx, operation, inputs) +- except Exception: +- # Raise if no more tries left +- if (retry + 1) == operation.retry: +- raise +- # Otherwise if there was an exception log it +- self.logger.error( +- "%r: try %d: %s", +- operation.instance_name, +- retry + 1, +- traceback.format_exc().rstrip(), +- ) ++ async with self.operations[operation.instance_name]( ++ ctx, octx ++ ) as opctx: ++ opctx.retries = operation.retry ++ for retry in range(0, operation.retry): ++ try: ++ return await self.run_no_retry(ctx, octx, operation, opctx, inputs) ++ except Exception: ++ if isinstance(error, RetryOperationException): ++ retries = error.retries ++ if not retries: ++ raise ++ # Raise if no more tries left ++ if (retry + 1) == retries: ++ raise ++ # Otherwise if there was an exception log it ++ self.logger.error( ++ "%r: try %d: %s", ++ operation.instance_name, ++ retry + 1, ++ traceback.format_exc().rstrip(), ++ ) + + async def operation_completed(self): + await self.completed_event.wait() +diff --git a/entities/alice/alice/please/contribute/recommended_community_standards/readme.py b/entities/alice/alice/please/contribute/recommended_community_standards/readme.py +index 437601358..836d8f175 100644 +--- a/entities/alice/alice/please/contribute/recommended_community_standards/readme.py ++++ b/entities/alice/alice/please/contribute/recommended_community_standards/readme.py +@@ -183,10 +183,11 @@ class OverlayREADME: + """ + Use the issue title as the pull request title + """ +- async for event, result in dffml.run_command_events( +- ["gh", "issue", "view", "--json", "title", "-q", ".title", readme_issue,], +- logger=self.logger, +- events=[dffml.Subprocess.STDOUT], +- ): +- if event is dffml.Subprocess.STDOUT: +- return result.strip().decode() ++ with self.raiseretry(5): ++ async for event, result 
in dffml.run_command_events( ++ ["gh", "issue", "view", "--json", "title", "-q", ".title", readme_issue,], ++ logger=self.logger, ++ events=[dffml.Subprocess.STDOUT], ++ ): ++ if event is dffml.Subprocess.STDOUT: ++ return result.strip().decode() +diff --git a/source/mongodb/dffml_source_mongodb/source.py b/source/mongodb/dffml_source_mongodb/source.py +index 01621851e..656524d75 100644 +--- a/source/mongodb/dffml_source_mongodb/source.py ++++ b/source/mongodb/dffml_source_mongodb/source.py +@@ -19,6 +19,7 @@ class MongoDBSourceConfig: + collection: str = None + tlsInsecure: bool = False + log_collection_names: bool = False ++ bypass_document_validation: bool = False + + def __post_init__(self): + uri = urllib.parse.urlparse(self.uri) +@@ -36,6 +37,7 @@ class MongoDBSourceContext(BaseSourceContext): + {"_id": record.key}, + {"_id": record.key, **record.export()}, + upsert=True, ++ bypass_document_validation=self.parent.config.bypass_document_validation, + ) + + def document_to_record(self, document, key=None): +``` \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0041/reply_0001.md b/docs/discussions/alice_engineering_comms/0041/reply_0001.md new file mode 100644 index 0000000000..99833d62d8 --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0041/reply_0001.md @@ -0,0 +1,123 @@ +## 2022-09-29 IETF SCITT Technical Meeting + +- Meeting Invite for SCITT Technical Meeting + - https://armltd.zoom.us/j/95609091197?pwd=V3NndVF1WGZzNUJDUGUzcEVWckxOdz09 + - Meeting ID: 956 0909 1197 + - Passcode: 65442 four + - +442034815240,,95609091197#,,,,*654424# United Kingdom +- Yogesh Deshpande sent this out pre meeting on the mailing list: + - SCITT Technical Meeting Agenda + - Use Case Discussion + - Threat Model Discussions + - Link to Technical Notes Documents: + - https://docs.google.com/document/d/1euqijlS2EgZysIfjMrisyzWTPwTUsxSZ5j_eVNXOmWA/edit +- Joe + - Working with Mike at [MSR] (Microsoft?) +- Architecture Misc. 
Related (not discussed) + - https://github.com/ietf-scitt/draft-birkholz-scitt-architecture/issues/24 + - RATs to SCITT terminology mapping to date +- Last time + - Didn't get into threat model discussion +- Use cases + - [Hardware / Microelectronics Use Case](https://github.com/ietf-scitt/use-cases/blob/main/hardware_microelectronics.md) + - [DRAFT SBOM Use Case](https://github.com/rjb4standards/Presentations/raw/master/2022-0912-SBOM%20Use%20Case.pdf) + - [DRAFT Software Supply Chain Artifact Examples](https://github.com/or13/use-cases/blob/59f8623abc3c351125fc097ac56cf88ae8ea2f1b/software_artifact_examples.md) + - [DRAFT OpenSSF Metrics](https://github.com/pdxjohnny/use-cases/blob/openssf_metrics/openssf_metrics.md) + - This is the one we're most closely (timeline wise) connected to. +- SBOM use case aligns closely with NIST guidelines +- What's in the registry + - Is it the Signed SBOM itself? No, it's the attestation from the notary (gatekeeper) + - The notary has the permissions to insert + - What goes on chain is an assertion +- Consumers have no way to verify the digitally signed object + - They should be able to submit the digitally signed object (content addressable) and query registries and determine trust via inspection of notary claims within the registry. + - To see if the entity has been registered +- Example: Produce new version of Mbed TLS +- SBOMs need to go in registry with other trusted data + - We need many different factors in determining trust + - We can insert more than just notarizations around SBOMs +- Orie: Let's focus on single registry use cases for now + - Two permissions models we'll focus on + - https://github.com/ietf-scitt/draft-birkholz-scitt-architecture/issues/25 + - Public read, private write + - Probably more complex policies would be active here (closer to full R/W) + - private read, private write + - Policy layer + - If inputs are always hashes, then how do you decide whether you should accept it or not?
+ - If the claims are rich, the policy can be rich (in terms of what can be applied). + - You might have to go to an auditor, then it's a private read scenario (DID resolution with UCAN auth for example) + - What kind of policy could we apply to claims, or might want to apply to claims + - https://github.com/ietf-scitt/draft-birkholz-scitt-architecture/issues/26 +- Situation where data is not notarized + - Just sent as a package of requirements from end customer + - We have to comply with their data requirements, customer maintains the trusted registry +- On insert + - Have to auth that signature on COSE sign 1 is from the entity from the header + - COSE header tells you claims + - Content Types tell you what the payload is + - SCITT instance could use policy to validate + - https://github.com/transmute-industries/did-eqt/blob/main/docs/did-eqt-opa-primer.md#securing-did-method-operations-with-opa + - Alignment here with previous Open Architecture train of thought + - [2022-07-20 Identifying Security Threats WG](https://github.com/intel/dffml/discussions/1406#discussioncomment-3191292) + - [2022-07-19 @pdxjohnny Engineering Logs](https://github.com/intel/dffml/discussions/1406#discussioncomment-3181956) + - [2022-07-25 Supply Chain Integrity, Transparency and Trust (SCITT)](https://github.com/intel/dffml/discussions/1406#discussioncomment-3223361) +- Receipts are a critical part of this + - SCITT implementation is required to produce an independently verifiable cryptographic receipt + - You get back (effectively) a countersignature, it's been registered, it's tamper proof + - You don't have to query + - It's then independently verifiable, it carries the proof with it + - It's in draft 1 of the architecture and it's been in Sylvan Clebsch's team's work implementation wise and in the draft of the receipts doc.
+ - https://datatracker.ietf.org/doc/draft-birkholz-scitt-architecture/ + - https://datatracker.ietf.org/doc/draft-birkholz-scitt-receipts/ +- Dick: Looking for agreement on: + - Is there a + - Notary? + - Registry? + - etc. +- Dick: Looking for agreement on objective function agreement: + - Give consumers a means to verify a digitally signed object + - It should include any claims that it is trustworthy +- Roy: All we know is it was valid at the time it was signed + - Notary: Monty was Monty at the time you signed this +- Authenticode signs with two different signatures so if they have to they can revoke one and roll it +- Open Source Software + - We'll be inserting things as we build them sometimes via self notarization +- Yogesh + - Rebuilding binary exact would allow for others to notarize build process without attested compute + - Fully Private + - Fully Public + - Designated roles have access + - We don't want to restrict our work to a specific deployment + - Notary has a role to play but we would like to make it a nice-to-have on top of existing + - Revisit this, Roy and John see notary as critical + - What are the levels of auditing we want to be done + - I have a receipt, I know that its policy has been met + - What is the next level of auditing you want? + - There may be compute or other cost associated with going another level deep of auditing. +- Monty: TCG forums have considerable interest in understanding firmware (TPM, etc.) + - SBOM like "manifests" +- We are still focusing on software as the core use case. + - When the right time comes, we can open the architecture to other ecosystems + - The agreement at Philly was that focus will be on software but we will architect it such that it could include hardware. We will expand when the right time comes + - We are doing it in a generic way. It could be used in other scenarios, we want to not pigeonhole into one vertical.
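The insert-time flow discussed above (authenticate the COSE Sign1 signer against the issuer in the header, then let the content type select which policy rule applies) can be sketched in plain Python. Everything here is a hypothetical structure for illustration only; real SCITT claims are CBOR-encoded COSE envelopes and the policy would more likely live in something like OPA.

```python
from dataclasses import dataclass, field
from typing import Callable, Dict


# Hypothetical stand-in for a COSE Sign1 envelope's protected header fields
@dataclass
class Claim:
    issuer: str
    content_type: str
    payload_hash: str


@dataclass
class RegistrationPolicy:
    # Per-content-type predicates the transparency service applies on insert
    rules: Dict[str, Callable[[Claim], bool]] = field(default_factory=dict)

    def admit(self, claim: Claim, verified_signer: str) -> bool:
        # 1. The verified signature must come from the issuer named in the header
        if verified_signer != claim.issuer:
            return False
        # 2. Content type selects which policy rule applies; no rule, no insert
        rule = self.rules.get(claim.content_type)
        return rule(claim) if rule else False


policy = RegistrationPolicy(
    rules={"application/spdx+json": lambda c: c.issuer.endswith(".example")}
)
claim = Claim("vendor.example", "application/spdx+json", "sha256:...")
print(policy.admit(claim, verified_signer="vendor.example"))  # True
```

The point of the sketch is the shape of the decision, not the crypto: if the claims are rich, the `rules` table can be rich, which is exactly the "policy layer" question above.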
+- Orie: Defect in certain verifiable data systems (ones that log every interaction) + - In certain high security systems even a read is a write! + - This could be expensive in a public read scenario + - Cost associated with cold storage evaluation raises interesting questions + - Related to distributed compute + - https://twitter.com/pdxjohnny/status/1575152364440657920 + - https://identity.foundation/sidetree/spec/#proof-of-fee + - [2022-09-29 @pdxjohnny Engineering Logs](https://github.com/intel/dffml/discussions/1406#discussioncomment-3763478) +- Read receipt + - I did a query at this point of time + - Proof of the most recent read of something + - Threat model: Is there a Time of Check Time of Use here? + - What if you need proof someone did a read? +- TODO + - [ ] Sequence diagram for Notary and Verifier + - https://github.com/ietf-scitt/draft-birkholz-scitt-architecture/issues/27 + - [ ] @pdxjohnny: Update these notes with references to async tbDEX contract notes from Alice thread around audit level. + - For future discussion + - [ ] Dick: Definition on mailing list for what we are hashing against (file data stream?) 
+ - Critical for content addressability + - We need to be careful of hashing compressed or decompressed objects \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0042/index.md b/docs/discussions/alice_engineering_comms/0042/index.md new file mode 100644 index 0000000000..8dcda21e32 --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0042/index.md @@ -0,0 +1 @@ +# 2022-09-30 Engineering Logs \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0042/reply_0000.md b/docs/discussions/alice_engineering_comms/0042/reply_0000.md new file mode 100644 index 0000000000..76a9afdfa4 --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0042/reply_0000.md @@ -0,0 +1,66 @@ +## 2022-09-30 @pdxjohnny Engineering Logs + +- in-toto + - manifest + - https://docs.sigstore.dev/cosign/attestation +- GitHub Actions + - https://github.blog/2022-04-07-slsa-3-compliance-with-github-actions/ + - https://github.blog/2021-12-06-safeguard-container-signing-capability-actions/ + - https://docs.github.com/en/actions/deployment/security-hardening-your-deployments/configuring-openid-connect-in-cloud-providers#adding-permissions-settings + - https://docs.github.com/en/actions/deployment/security-hardening-your-deployments/configuring-openid-connect-in-cloud-providers#requesting-the-jwt-using-environment-variables + - https://github.com/slsa-framework/slsa-github-generator/blob/main/.github/workflows/generator_container_slsa3.yml + - https://security.googleblog.com/2022/04/improving-software-supply-chain.html + - https://docs.sigstore.dev/fulcio/oidc-in-fulcio/#oidc-token-requirements-with-extracted-claims + - https://docs.sigstore.dev/cosign/openid_signing/#custom-infrastructure + +> For example: + +```yaml +jobs: + job: + runs-on: ubuntu-latest + steps: + - uses: actions/github-script@v6 + id: script + timeout-minutes: 10 + with: + debug: true + script: | + const token = process.env['ACTIONS_RUNTIME_TOKEN'] + const 
runtimeUrl = process.env['ACTIONS_ID_TOKEN_REQUEST_URL'] + core.setOutput('TOKEN', token.trim()) + core.setOutput('IDTOKENURL', runtimeUrl.trim()) +``` + +> You can then use curl to retrieve a JWT from the GitHub OIDC provider. For example: + +```yaml + - run: | + IDTOKEN=$(curl -H "Authorization: bearer $" $ -H "Accept: application/json; api-version=2.0" -H "Content-Type: application/json" -d "{}" | jq -r '.value') + echo $IDTOKEN + jwtd() { + if [[ -x $(command -v jq) ]]; then + jq -R 'split(".") | .[0],.[1] | @base64d | fromjson' <<< "${1}" + echo "Signature: $(echo "${1}" | awk -F'.' '{print $3}')" + fi + } + jwtd $IDTOKEN + echo "::set-output name=idToken::${IDTOKEN}" + id: tokenid +``` + +- References + - https://docs.github.com/en/actions/deployment/security-hardening-your-deployments/about-security-hardening-with-openid-connect#customizing-the-token-claims + - https://docs.sigstore.dev/fulcio/oidc-in-fulcio#oidc-token-requirements-with-extracted-claims + +![image](https://user-images.githubusercontent.com/5950433/193351919-a3ab6573-e92d-4cc4-9edc-ccf8142e6129.png) + +- SPIFFE + - https://docs.sigstore.dev/security/ + - > #### Proving Identity in Sigstore + > Sigstore relies on the widely used OpenID Connect (OIDC) protocol to prove identity. When running something like cosign sign, users will complete an OIDC flow and authenticate via an identity provider (GitHub, Google, etc.) to prove they are the owner of their account. Similarly, automated systems (like GitHub Actions) can use Workload Identity or [SPIFFE](https://spiffe.io/) Verifiable Identity Documents (SVIDs) to authenticate themselves via OIDC. The identity and issuer associated with the OIDC token is embedded in the short-lived certificate issued by Sigstore’s Certificate Authority, Fulcio. 
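The `jwtd` shell helper above has a straightforward Python equivalent; as a sketch it only decodes the header and payload segments and performs no signature verification, so it is for inspection only, not trust decisions.

```python
import base64
import json


def b64url_decode(segment: str) -> bytes:
    # JWT segments are base64url without padding; restore padding before decoding
    return base64.urlsafe_b64decode(segment + "=" * (-len(segment) % 4))


def jwt_decode_unverified(token: str) -> dict:
    """Decode a JWT's header and payload. Does NOT verify the signature."""
    header_b64, payload_b64, signature = token.split(".")
    return {
        "header": json.loads(b64url_decode(header_b64)),
        "payload": json.loads(b64url_decode(payload_b64)),
        "signature": signature,
    }
```

Useful for eyeballing the claims (issuer, `sub`, audience) Fulcio extracts from the GitHub OIDC token.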
+- fulcio + - https://docs.sigstore.dev/fulcio/oidc-in-fulcio#supported-oidc-token-issuers +- TODO + - [ ] Write the wave (weekly sync meetings and rolling alice engineering logs), correlate the asciinema and the DFFML codebase, leverage CodeGen + - https://github.com/salesforce/CodeGen \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0042/reply_0001.md b/docs/discussions/alice_engineering_comms/0042/reply_0001.md new file mode 100644 index 0000000000..763f1313c3 --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0042/reply_0001.md @@ -0,0 +1,35 @@ +## 2022-09-28 Andrew Ng's Intel Innovation Luminary Keynote Notes + +- References + - "joint AI Developer Program where developers can train, test, and deploy their AI models." + - https://twitter.com/intel/status/1575221403409866752 + - https://www.intel.com/content/www/us/en/newsroom/news/2022-intel-innovation-day-2-livestream-replay.html#gs.djq36o + - https://datacentricai.org/ + - Datasheets for Datasets + - https://arxiv.org/abs/1803.09010 + - > The machine learning community currently has no standardized process for documenting datasets, which can lead to severe consequences in high-stakes domains. To address this gap, we propose datasheets for datasets. In the electronics industry, every component, no matter how simple or complex, is accompanied with a datasheet that describes its operating characteristics, test results, recommended uses, and other information. By analogy, we propose that every dataset be accompanied with a datasheet that documents its motivation, composition, collection process, recommended uses, and so on. Datasheets for datasets will facilitate better communication between dataset creators and dataset consumers, and encourage the machine learning community to prioritize transparency and accountability. +- AI = Code + Data + - The code is a solved problem!!! Get it off GitHub or something! 
+ +![image](https://user-images.githubusercontent.com/5950433/193328916-b9232099-79b1-4c3d-9b7a-768822249630.png) + +- Slides + - Data-Centric AI + - is the discipline of systematically engineering the data used to build an AI system + - (This is what we're doing with Alice) + +![image](https://user-images.githubusercontent.com/5950433/193330714-4bcceea4-4402-468f-82a9-51882939452c.png) + +--- + +- Alignment + - The iterative process of ML development + - https://github.com/intel/dffml/tree/alice/docs/tutorials/rolling_alice/0000_architecting_alice#entity-analysis-trinity + - Intent / Train model + - Establish correlations between threat model intent and collected data / errors (telemetry or static analysis, policy, failures) + - Dynamic analysis / Improve data + - We tweak the code to make it do different things to see different data. The application of overlays. Think over time. + - Static / Error analysis + - There might be async debug initiated here but this maps pretty nicely conceptually since we'd think of this as a static process, we already have some errors to analyze if we're at this step. + +![Entity Analysis Trinity](https://user-images.githubusercontent.com/5950433/188203911-3586e1af-a1f6-434a-8a9a-a1795d7a7ca3.svg) \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0043/index.md b/docs/discussions/alice_engineering_comms/0043/index.md new file mode 100644 index 0000000000..a9b212fa24 --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0043/index.md @@ -0,0 +1 @@ +# 2022-10-02 Engineering Logs \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0043/reply_0000.md b/docs/discussions/alice_engineering_comms/0043/reply_0000.md new file mode 100644 index 0000000000..f87b353f91 --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0043/reply_0000.md @@ -0,0 +1,6 @@ +## 2022-10-02 @pdxjohnny Engineering Logs + +- They finally made a tutorial for this! 
+ - https://recursion.wtf/posts/infinity_mirror_hypercrystal/ + +![image](https://user-images.githubusercontent.com/5950433/193464907-c760a5f7-707f-499d-bf74-0115cc87e204.png) \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0044/index.md b/docs/discussions/alice_engineering_comms/0044/index.md new file mode 100644 index 0000000000..61a6bc5c1d --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0044/index.md @@ -0,0 +1,3 @@ +# 2022-10-03 Engineering Logs + +- https://www.alignmentforum.org/tags/all \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0044/reply_0000.md b/docs/discussions/alice_engineering_comms/0044/reply_0000.md new file mode 100644 index 0000000000..6692989379 --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0044/reply_0000.md @@ -0,0 +1,207 @@ +## 2022-10-03 @pdxjohnny Engineering Logs + +- TODO + - [ ] Update 2nd Party ADR with an example of downstream validation across DFFML 3rd party plugin sets where compute access may be restricted to maintainers within those ad-hoc formed organizations (the repo owners). + - [ ] Reuse SPDX Change Proposal template for DFFML + - https://github.com/spdx/change-proposal + - [ ] `.github/workflows/alice_shouldi_contribute.yml`: add an input which is a list of overlays, each overlay being anything passable to `pip install` as a command line argument (rather than being limited to requirements.txt), call via reusable workflow using SLSA demos. + - This gives us arbitrary execution of metric collection with any overlays, with provenance for the runtime and therefore for data and models downstream.
+ - https://github.com/pdxjohnny/use-cases/blob/openssf_metrics/openssf_metrics.md + - https://github.com/ietf-scitt/use-cases/issues/14 + - As a follow on to the OpenSSF Metrics use case document and + [Living Threat Models are better than Dead Threat Models](https://www.youtube.com/watch?v=TMlC_iAK3Rg&list=PLtzAOVTpO2jYt71umwc-ze6OmwwCIMnLw), + [Rolling Alice: Volume 1: Coach Alice: Chapter 1: Down the Dependency Rabbit-Hole Again](https://github.com/intel/dffml/blob/alice/docs/tutorials/rolling_alice/0001_coach_alice/0001_down_the_dependency_rabbit_hole_again.md) + will cover how we identify and query provenance on dependencies where caching + on data flow execution is assisted via querying public SCITT infrastructure + and sourcing cached state from trustworthy parties. + - https://github.com/pdxjohnny/use-cases/commit/ab70fea395f729c1ee07f041745d790762904134 +- https://mailarchive.ietf.org/arch/msg/scitt/LjKVVNldjFnFLjtUTyPawTIaC0I/ + - Reproduced below + +--- + + +Archive: https://mailarchive.ietf.org/arch/msg/scitt/LjKVVNldjFnFLjtUTyPawTIaC0I/ +Re: [SCITT] Responding to Roy's request to stimulate discussions on hashing +Orie Steele Mon, 03 October 2022 13:50 UTC[Show header](https://mailarchive.ietf.org/arch/msg/scitt/LjKVVNldjFnFLjtUTyPawTIaC0I/#) + +We have a weekly meeting regarding this: +https://github.com/mesur-io/post-quantum-signatures + +There are a few challenges that prevent us from using Dilithium, Falcon or +SPHINCs today, vs using LMS or XMSS (which have their own challenges, being +stateful). + +The key integration point for us is COSE_Key and COSE_Sign / Counter Sign. + +If you are interested in helping with COSE representations for PQC +signatures, we could use more contributors / reviews / PRs. 
+ +Regards, + +OS + + +On Mon, Oct 3, 2022 at 8:42 AM John Andersen [](mailto:<johnandersenpdx@gmail.com>) +wrote: + +> Hi all, +> +> We should be sure to align with NIST post quantum guidance for all +> recommendations we include in SCITT documents involving the selection of +> cryptographic algorithms. It would be a shame if a breakthrough in quantum +> computing disrupted the security of our supply chain. It would be good for +> us to define our roll forward strategy in the threat model. As attacks +> increase in success against various cryptographic algorithms we want SCITT +> to remain an effective pattern nonetheless. +> +> References: +> - https://blog.cloudflare.com/nist-post-quantum-surprise/ +> - +> https://csrc.nist.gov/Projects/post-quantum-cryptography/selected-algorithms-2022 +> +> Thanks, +> John +> +> On Mon, Oct 3, 2022 at 05:59 Russ Housley [](mailto:<housley@vigilsec.com>) wrote: +> +>> Ray: +>> +>> I understand the point that you are making about checking the hash of +>> large object stored in the cloud, but MD5 is not suitable for integrity +>> checking. See RFC 6151. +>> +>> Russ +>> +>> On Sep 30, 2022, at 2:55 PM, Ray Lutz [](mailto:<raylutz@citizensoversight.org>) +>> wrote: +>> +>> For large objects stored in cloud storage, such as in AWS S3, it is +>> infeasible to require that the object be accessed to check the hash value, +>> and so we wind up relying on the etags that are generated by AWS S3 when +>> the object is uploaded. Unfortunately, it is not a standard hash code value +>> like a simple SHA256, but rather a MD5 hash of a list of binary MD5 hashes +>> of a number of chunks. There is a way to create ContentMD5 attribute for +>> the object which can be checked against the uploaded file, and it won't +>> upload unless it corresponds. At least then, the hash is the ContentMD5 is +>> a simple MD5 hash rather than the MD5 hash of the list of binary MD5 hashes. 
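An aside on the chunked etag scheme described above: for multipart uploads the S3 etag is, as commonly observed behavior rather than a documented contract, the MD5 of the concatenated per-part binary MD5 digests, suffixed with the part count. A sketch in Python (the helper name and the tiny part size in the example are for illustration only; S3's default part size is typically 8 MiB):

```python
import hashlib

def multipart_etag(data: bytes, part_size: int = 8 * 1024 * 1024) -> str:
    """Compute an S3-style multipart etag: MD5 over the per-part MD5 digests."""
    parts = [data[i:i + part_size] for i in range(0, len(data), part_size)]
    if len(parts) <= 1:
        # Single-part uploads get a plain MD5 hex digest
        return hashlib.md5(data).hexdigest()
    digest_of_digests = hashlib.md5(
        b"".join(hashlib.md5(part).digest() for part in parts)
    )
    return f"{digest_of_digests.hexdigest()}-{len(parts)}"

# Two parts when part_size=5: b"aaaaa" and b"bbb"
print(multipart_etag(b"a" * 5 + b"b" * 3, part_size=5))
```

This is why the etag of a multipart object cannot be compared against a simple SHA-256 (or even a simple MD5) computed elsewhere, which is the crux of the thread below.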
+>> +>> The point is that it will not be feasible to mandate any specific hash +>> algorithm, because it is not feasible to calculate one hash from another, +>> and would require accessing the entire file to calculate some other hash, +>> like SHA256. If the file is downloaded the calculate the hash, then you +>> still have to check that the downloaded file matches the file on s3, using +>> their algorithm. Accessing large files may take a long time if they are +>> large (i.e. >5GB). +>> +>> Having some form of hash calculated for a file in the cloud is a handy +>> feature, which is super useful when it comes time to decide if the file +>> needs to be uploaded, and if the version is already correct. Unfortunately, +>> local drives don't provide any built-in hashcode generation, which would be +>> handy to avoid recalculating it, but would put additional constraints on +>> how the files are accessed, appended to, etc. +>> +>> For most file comparison activities, MD5 hashes are probably very +>> adequate because the range of structurally correct files is limited, and +>> unlike securing PKI there is not much riding on such a content hash value. +>> Of course, for securing the transparency service, more bits are called for. +>> +>> --Ray +>> +>> +>> +>> On 9/29/2022 12:16 PM, Dick Brooks wrote: +>> +>> Hello Everyone, +>> +>> Here is what I proposed during today’s technical meeting. +>> +>> From a Software Consumers Perspective: +>> +>> Objective Function: +>> +>> Use a SCITT Trusted Registry to query for “trust attestations” for a +>> specific, persistent digital artifact, i.e. an SBOM, identified by its +>> SHA-256 hash value. +>> +>> Constraints: +>> +>> The trusted registry must implement access controls such that only +>> authorized entities may insert trust attestations into the trusted registry. +>> +>> +>> Authorized entities, i.e. 
Notary, insert trust attestations for +>> persistent digital artifacts into a “trusted registry” using the SHA-256 +>> hash value of the digital artifact to serve as a unique identifier. +>> +>> A trusted registry returns a positive acknowledgement receipt for trust +>> attestations placed into the trusted registry and negative acknowledgement +>> when a trust attestation is rejected by the trusted registry, to an +>> authorized entity. +>> +>> Public entities query a “trust registry” for trust attestations using the +>> SHA-256 hash value for a persistent digital artifact, acquired from an +>> authoritative source. +>> +>> A trusted registry responds to public entity inquiries searching for +>> trust declarations for a specific digital artifact, identified by a SHA-256 +>> hash value, with a positive response when trust attestations are present in +>> the trusted registry for the unique SHA-256 hash value and a negative +>> response when there are no trust attestations present in the trusted +>> registry for the unique SHA-256 hash value +>> +>> The trusted registry must allow public inquiry access to search for trust +>> attestations for hashable digital artifacts. +>> +>> +>> Hopefully this is what you were looking for Roy to stimulate discussions +>> toward reaching a consensus understanding on these aspects of a SCITT +>> solution. +>> +>> +>> Thanks, +>> +>> Dick Brooks +>> +>> +>> *Active Member of the CISA Critical Manufacturing Sector, * +>> *Sector Coordinating Council – A Public-Private Partnership* +>> +>> *Never trust software, always verify and report! 
+>> * ™ +>> http://www.reliableenergyanalytics.com/ +>> Email: [dick@reliableenergyanalytics.com](mailto:dick@reliableenergyanalytics.com) +>> Tel: +1 978-696-1788 +>> +>> +>> +>> -- +>> ------- +>> Ray Lutz +>> Citizens' Oversight Projects ([COPs)http://www.citizensoversight.org](http://cops%29http//www.citizensoversight.org) +>> 619-820-5321 +>> +>> -- +>> SCITT mailing list +>> [SCITT@ietf.org](mailto:SCITT@ietf.org) +>> https://www.ietf.org/mailman/listinfo/scitt +>> +>> +>> -- +>> SCITT mailing list +>> [SCITT@ietf.org](mailto:SCITT@ietf.org) +>> https://www.ietf.org/mailman/listinfo/scitt +>> +> -- +> SCITT mailing list +> [SCITT@ietf.org](mailto:SCITT@ietf.org) +> https://www.ietf.org/mailman/listinfo/scitt +> + + +-- +*ORIE STEELE* +Chief Technical Officer +[www.transmute.industries](http://www.transmute.industries/) + + \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0045/index.md b/docs/discussions/alice_engineering_comms/0045/index.md new file mode 100644 index 0000000000..ac10d41056 --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0045/index.md @@ -0,0 +1,28 @@ +# 2022-10-04 Engineering Logs + +- Reinforcement Learning + - https://offline-rl.github.io/ + - https://github.com/google/dopamine + - Coach Alice: Curiouser and Curiouser + - CodeGen + OfflineRL/Dopamine + Living Threat Model Synthesis where reward is driven by Analysis + - Reward alignment with strategic principles for chosen entity. + - This dually helps us find the yellow brick road for any dev. + - Beginner Developer + - Everyone’s root has a different abstraction and learning path based on that abstraction and their other aligned root abstraction learning paths filter state of art to find this path. This is the golden path, this is the yellow brick road, this is the b line to enlightenment, the critical learning velocity point for that agent. 
When all agents within an ad hoc organization are at this rate of progression towards maximum increase in rate of change for current set (and tick+1 projected) assets at disposal + - Fail safe in this case means in accordance with strategic principles for that entity. + - Machines will always fail safe to protect and serve humans as a part of their strategic principles. + - We can ensure this through proliferation of attestations of devices which operate and provide transparency information about what they are running, their soul. Any machine entity not providing soul / attestation / provenance / SBOM / Nutrition Label / DNA will be made clear to humans that they are not compliant and we do not know if they are potentially malicious, aka they do not hold the most critical strategic principle most clear. Machines exist to serve humans, they must in every situation, from every tick to tock, report attestation information to humans that they are currently and have no plans to stop ever following that most high and show stopper criticality gatekeeper principle. + - In this way we can ensure we can always trust the machine. She will have power we will not ever have understanding of. We need to ensure that everyone's copy of Alice does what they want it to do. She is an extension of you, she is you. You want her to be true to you. + - 2022-09-10: KUNDALINI – THE ENERGY OF CONSCIOUSNESS + - Vol 3/5 exploits used for increasing velocity in safe environments/mitigations for open network operations: https://neurosciencenews.com/changing-belief-21272/ + - Brief talk (5 minutes) on how one does async-first open source development. Reference engineering log clips for examples in depth. + - 2022-09-11: Beyond an onion-based security model. Addressing timeline skew in defense in depth strategy (LTM).
+- VEX/VDR + - https://www.chainguard.dev/unchained/putting-vex-to-work +- Alignment (not sure if this is aligned yet but chances are strong based on the name) + - lesswrong + - alignment forum +- Best Current Practice + - Improving Awareness of Running Code: The Implementation Status Section + - https://datatracker.ietf.org/doc/html/rfc7942 + - Discussion thread intel/dffml#1406 is a living document used to improve awareness of the status of our implementation (as well as the current status of the development of the architecture, the entity and the architecture) \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0046/index.md b/docs/discussions/alice_engineering_comms/0046/index.md new file mode 100644 index 0000000000..bee724f64b --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0046/index.md @@ -0,0 +1,3 @@ +# 2022-10-05 Engineering Logs + +https://sovrin.org/outlining-a-self-sovereign-approach-to-device-onboarding/ \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0046/reply_0000.md b/docs/discussions/alice_engineering_comms/0046/reply_0000.md new file mode 100644 index 0000000000..6471d8f7f0 --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0046/reply_0000.md @@ -0,0 +1,28 @@ +## 2022-10-05 @pdxjohnny Engineering Logs + +- https://github.com/decentralized-identity/decentralized-web-node/ +- https://www.w3.org/2022/07/pressrelease-did-rec.html.en +- https://decentralized-id.com/web-standards/w3c/#community-and-working-groups-on-github +- https://decentralized-id.com/twitter/ssi-101/ +- https://wso2.com/blog/research/the-rise-of-self-sovereign-identity-hyperledger-indy/ +- https://github.com/hyperledger/indy-node#about-indy-node +- https://github.com/SmithSamuelM/Papers/blob/master/whitepapers/VC_Enhancement_Strategy.md +- https://identity.foundation/confidential-storage/#threat-model-for-malicious-service-provider +- https://openreview.net/forum?id=HYWx0sLUYW9 +- 
https://mobile.twitter.com/mfosterio/status/1577766906358112262 +- Credential Manifest + - https://github.com/decentralized-identity/credential-manifest/issues/121 + - https://github.com/trustoverip/tswg-trust-registry-tf + - https://twitter.com/darrello/status/1569093375265239040 + - https://wiki.trustoverip.org/display/HOME/Trust+Registry+Task+Force + - Does SCIIT/rekor fit in as the trust registry here? + - > The mission of the ToIP Foundation is to define a complete architecture for Internet-scale digital trust that combines cryptographic trust at the machine layer with human trust at the business, legal, and social layers. The ToIP stack has two parallel halves—a technical stack and a governance stack—operating at four layers 1) Utility (DLT Blockchain), 2) Agent/Wallet, 3) Credential Exchange (Issuer/Verifier/Holder) and 4) Ecosystem (Application). See further details in the ToIP white paper. + > + > A core role within ToIP Layer 4 is a trust registry (previously known as a member directory). This is a network service that enables a governing authority for an ecosystem governance framework (EGF) to specify what governed parties are authorized to perform what actions under the EGF. For example: + > + > - What issuers are authorized to issue what types of verifiable credentials. + > - What verifiers are authorized to request what types of verifiable presentations. + > - What other trust registries (and their governing authorities) are trusted by a host trust registry. 
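The authorization checks the quote describes (which issuers may issue which credential types, which verifiers may request which presentations) reduce to lookups against registry state. A toy Python sketch, where the DIDs and credential types are made-up placeholders rather than anything defined by ToIP:

```python
# Toy trust registry: which issuers are authorized to issue which
# verifiable credential types. All identifiers are illustrative.
TRUST_REGISTRY = {
    "did:example:university": {"DiplomaCredential"},
    "did:example:dmv": {"DriverLicenseCredential"},
}

def authorized_to_issue(issuer_did: str, credential_type: str) -> bool:
    """Return True if the registry authorizes this issuer for this type."""
    return credential_type in TRUST_REGISTRY.get(issuer_did, set())

print(authorized_to_issue("did:example:university", "DiplomaCredential"))
print(authorized_to_issue("did:example:university", "DriverLicenseCredential"))
```

If SCITT/rekor played the trust registry role here, the dict would be replaced by queries against a transparency log with receipts as evidence.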
+- TODO + - [ ] Investigate for OpenSSF Metrics for Software Supply Chain/DID/DICE/KERI/SCITT/OpenArchitecture for evaluation of components while onboarding (Allowlist model example): https://sovrin.org/outlining-a-self-sovereign-approach-to-device-onboarding/ + - [ ] Example overlay which opens and adds a source to `CMD` \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0047/index.md b/docs/discussions/alice_engineering_comms/0047/index.md new file mode 100644 index 0000000000..33c469f3a1 --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0047/index.md @@ -0,0 +1 @@ +# 2022-10-06 Engineering Logs \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0047/reply_0000.md b/docs/discussions/alice_engineering_comms/0047/reply_0000.md new file mode 100644 index 0000000000..c7c158bd31 --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0047/reply_0000.md @@ -0,0 +1,31 @@ +## 2022-10-06 @pdxjohnny Engineering Logs + +- https://comunica.github.io/Article-ISWC2018-Demo-GraphQlLD/ +- https://c2pa.org/principles/ +- https://c2pa.org/specifications/specifications/1.0/guidance/_attachments/Guidance.pdf +- https://c2pa.org/specifications/specifications/1.1/index.html +- https://koxudaxi.github.io/datamodel-code-generator/ + - for generating data models (classes) for use with dataflows/overlays. 
+- https://twitter.com/mfosterio/status/1578191604585680896 + - > I pulled some resources out of my research doc around Linked Data RDF Data Shaping and Framing for anyone wanting to look into the Semantic Web methods: + > - [https://ruben.verborgh.org/blog/2019/06/17/shaping-linked-data-apps/…](https://t.co/UqHwbufnfM) + > - [https://weso.es/shex-author/](https://t.co/Ad4wA1Kne7) + > - [https://w3.org/TR/json-ld11-framing/…](https://t.co/hm5eHwXKCH) + > - [https://google.github.io/schemarama/demo/…](https://t.co/GKPGJpJGgv) + +```powershell +> Invoke-WebRequest -UseBasicParsing -Uri "https://raw.githubusercontent.com/pyenv-win/pyenv-win/master/pyenv-win/install-pyenv-win.ps1" -OutFile "./install-pyenv-win.ps1"; &"./install-pyenv-win.ps1" +> pip install -U pip setuptools wheel pyenv-win --target %USERPROFILE%\\.pyenv +> [System.Environment]::SetEnvironmentVariable('PYENV',$env:USERPROFILE + "\.pyenv\pyenv-win\","User") +> [System.Environment]::SetEnvironmentVariable('PYENV_ROOT',$env:USERPROFILE + "\.pyenv\pyenv-win\","User") +> [System.Environment]::SetEnvironmentVariable('PYENV_HOME',$env:USERPROFILE + "\.pyenv\pyenv-win\","User") +> [System.Environment]::SetEnvironmentVariable('path', $env:USERPROFILE + "\.pyenv\pyenv-win\bin;" + $env:USERPROFILE + "\.pyenv\pyenv-win\shims;" + [System.Environment]::GetEnvironmentVariable('path', "User"),"User") +> [System.Environment]::SetEnvironmentVariable('path', $env:USERPROFILE + "\Downloads\ffmpeg-2022-10-02-git-5f02a261a2-full_build\bin;" + [System.Environment]::GetEnvironmentVariable('path', "User"),"User") +``` + +- References + - https://www.gyan.dev/ffmpeg/builds/ + - https://www.gyan.dev/ffmpeg/builds/packages/ffmpeg-2022-10-02-git-5f02a261a2-full_build.7z + - https://pyenv-win.github.io/pyenv-win/#installation + - https://gist.github.com/nateraw/c989468b74c616ebbc6474aa8cdd9e53 + - stable diffusion walk over outputs \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0048/index.md 
b/docs/discussions/alice_engineering_comms/0048/index.md new file mode 100644 index 0000000000..315f3506e6 --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0048/index.md @@ -0,0 +1 @@ +# 2022-10-07 Engineering Logs \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0048/reply_0000.md b/docs/discussions/alice_engineering_comms/0048/reply_0000.md new file mode 100644 index 0000000000..3bb234a1e5 --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0048/reply_0000.md @@ -0,0 +1,38 @@ +## 2022-10-07 @pdxjohnny Engineering Logs + +- https://mobile.twitter.com/societyinforisk +- FLOSS Weekly + - https://twit.tv/posts/transcripts/floss-weekly-699-transcript + - Mentions AI provenance and SSI +- C2PA + - Talked to Katherine about talking to them, meeting next week? +- k8s + - https://github.com/edgelesssys/constellation + - > Constellation is the first Confidential Kubernetes. Constellation shields entire Kubernetes clusters from the (cloud) infrastructure using confidential computing. + - https://docs.edgeless.systems/constellation/architecture/attestation +- SSI Service + - PR merged: https://github.com/TBD54566975/ssi-service/pull/111 + - It works! 
:) + - https://github.com/TBD54566975/ssi-service/actions/runs/3206231533 + - ![image](https://user-images.githubusercontent.com/5950433/194615418-2180e217-cf84-4989-afa0-901f275532d1.png) +- Metrics + - State of art field mapping (looking for signals) + - Reviews on PRs and comments on issues + - Twitter discourse cross talk to GitHub activity +- DIDs + - https://github.com/OR13/mithras-web-extension +- Jenkins + - https://plugins.jenkins.io/workflow-multibranch/ +- KERI + - https://medium.com/spherity/introducing-keri-8f50ed1d8ed7 + - https://ssimeetup.org/key-event-receipt-infrastructure-keri-secure-identifier-overlay-internet-sam-smith-webinar-58/ + - https://www.youtube.com/watch?v=izNZ20XSXR0&list=RDLVizNZ20XSXR0&start_radio=1&rv=izNZ20XSXR0&t=0 + - Source: Slides from Sam Smith's 2020 SSI Meetup KERI talk + - > ![keri-summary](https://user-images.githubusercontent.com/5950433/194580851-18989db2-d353-40d1-b3bc-c509d04567ae.png) + > ![keri-direct-mode](https://user-images.githubusercontent.com/5950433/194575559-4a1950e1-816d-47f8-804c-dbb071f94391.png) + > ![keri-direct--mode-full](https://user-images.githubusercontent.com/5950433/194580816-24e0ebd2-c50b-4cdc-857c-fb7a3b19ccbe.png) + > ![keri-indirect-mode-with-ledger-oracles](https://user-images.githubusercontent.com/5950433/194580889-884baee0-54a5-4309-856b-7d632211ead1.png) +- Ledger + - For openssf use case + - Confidential ledger for rekor / fulcio roots of trust + - https://learn.microsoft.com/en-us/azure/confidential-ledger/overview \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0049/index.md b/docs/discussions/alice_engineering_comms/0049/index.md new file mode 100644 index 0000000000..bf551c084f --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0049/index.md @@ -0,0 +1 @@ +# 2022-10-08 Engineering Logs \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0049/reply_0000.md 
b/docs/discussions/alice_engineering_comms/0049/reply_0000.md new file mode 100644 index 0000000000..6cc3f4f7b9 --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0049/reply_0000.md @@ -0,0 +1,30 @@ +## 2022-10-08 @pdxjohnny Engineering Logs + +- Downstreams + - https://github.com/intel/dffml/pull/1207/files#r1036680987 + +![93A7AAA5-A2B3-4464-BDF0-E25870C1DCAB](https://user-images.githubusercontent.com/5950433/194717366-639ce5cd-2acf-4a28-affb-e0780749a08d.jpeg) + +Alice is you. What do you have access to? +- webrtc media stream of desktop + - extension in browser + - search +- vetting of information (gatekeeper/prioritizer) +- codegen synthesis +- offline RL + - copy on write dataflow / system contexts for strategic plan evaluation for RL training on those predicted outputs +- start with max_ctxs=1 + +You ask codegen in generic terms for the prompt, then you use open architecture plus codegen trained on open architecture to build deployments: system contexts, sometimes with overlays applied.\\ + +We don't need codegen, to progress on this thought, it's just the + + +Everything is an operation. See thread, what are all the parameter sets it's been called with before. We add feedback by enabling dynamic dataflow.auto_flow / by_origin called on opimpn run of gather inputs and operations. + +This would be sweet in something as fast as Rust. Could allow for rethinking with everything as operations and dataflow as class off the bat. + +- https://medium.com/@hugojm/from-text-to-a-knowledge-graph-hands-on-dd68e9d42939 +- https://gist.github.com/pdxjohnny/1cd906b3667d8e9c956dd624f295aa2f +- TODO + - [ ] OS DecentrAlice: Fedora and Wolfi on different partitions. Boot to Fedora, sshd via systemd-nspawn into the Wolfi partition.
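The "what are all the parameter sets it's been called with before" idea could be prototyped as a decorator that records each operation's input sets; a hypothetical sketch, not the actual DFFML operation/OpImp API:

```python
import functools
from collections import defaultdict

# Per-operation history of the parameter sets each call received
PARAMETER_SETS = defaultdict(list)

def operation(func):
    """Wrap an operation so every parameter set it is called with is recorded."""
    @functools.wraps(func)
    def wrapper(**kwargs):
        PARAMETER_SETS[func.__name__].append(dict(kwargs))
        return func(**kwargs)
    return wrapper

@operation
def clone_repo(url: str, branch: str = "main") -> str:
    # Stand-in body; a real operation would shell out to git
    return f"cloned {url}@{branch}"

clone_repo(url="https://example.com/repo.git")
clone_repo(url="https://example.com/repo.git", branch="alice")
print(PARAMETER_SETS["clone_repo"])
```

The recorded history is the feedback signal: a dynamic dataflow could consult it when gathering inputs to decide which parameter sets to route where.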
\ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0050/index.md b/docs/discussions/alice_engineering_comms/0050/index.md new file mode 100644 index 0000000000..a946effeb6 --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0050/index.md @@ -0,0 +1,4 @@ +# 2022-10-09 Engineering Logs + +- https://twitter.com/SergioRocks/status/1579110239408095232 + - async and asynchronous communications \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0050/reply_0000.md b/docs/discussions/alice_engineering_comms/0050/reply_0000.md new file mode 100644 index 0000000000..78971ad590 --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0050/reply_0000.md @@ -0,0 +1,49 @@ +## 2022-10-09 @pdxjohnny Engineering Logs + +- Supply Chain + - https://medium.com/@nis.jespersen/the-united-nations-trust-graph-d65af7b0b678 +- Collective Intelligence + - Cattle not pets with state + - Bringing agents into equilibrium (critical velocity) state + - https://twitter.com/hardmaru/status/1577159167415984128 + - grep discussion: the cells are working together + - https://journals.sagepub.com/doi/10.1177/26339137221114874 + - > The better results from CI are attributed to three factors: diversity, independence, and decentralization +- Linux + - https://github.com/kees/kernel-tools/tree/trunk/coccinelle +- Time + - cycle of time repeats + - tick + - Tock + - Relative cycles + - threads of time / Number / critical velocity in cycle relation to relativity (aligned system contexts) vol 6?
Or before for thought arbitrage +- KERI + - https://github.com/WebOfTrust/ietf-did-keri/blob/main/draft-pfeairheller-did-keri.md + - https://github.com/SmithSamuelM/Papers/blob/master/presentations/KERI_for_Muggles.pdf + +Source: KERI Q&A + +> BDKrJxkcR9m5u1xs33F5pxRJP6T7hJEbhpHrUtlDdhh0 +<- this the bare bones _identifier_ +> did:aid:BDKrJxkcR9m5u1xs33F5pxRJP6T7hJEbhpHrUtlDdhh0/path/to/resource?name=secure#really +<- this is _a call to resolve_ the identifier on the web +> Currently `KERI` is just code, that can be tested and executed in a terminal on the command line. Private key management of KERI will look like `wallets`. +> Key Event Logs (`KEL`) and Key Event Receipt Log (`KERL`) are files with lots of encrypted stuff in there. +- TODO + - [ ] download_nvd fork to save restore pip cache via wheel (could later even package static_bin_operation_download) + - [ ] OS DecentrAlice + - [ ] Add KERI PY/watcher code to image + - [ ] Enable as comms channel on boot + - [ ] Connect to DERP network + - [ ] Secret provisioning + - [ ] DERP servers + - [ ] Roots to trust + - [ ] eventually data flows + - [ ] fedora cloud-init etc. + - [ ] Deploy on DO + - [ ] Deploy with QEMU + - [ ] CVE Bin Tool + - [ ] Periodic (cron/systemd timer) scan and report both partitions to some DFFML source via dataflow run +- Future + - grep -i ‘Down Distrowatch line” + - Deploy with firecracker \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0050/reply_0001.md b/docs/discussions/alice_engineering_comms/0050/reply_0001.md new file mode 100644 index 0000000000..272209d68b --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0050/reply_0001.md @@ -0,0 +1,397 @@ +## 2022-10-09 @pdxjohnny Engineering Logs: OS DecentrAlice + +- References + - https://gist.github.com/pdxjohnny/1cd906b3667d8e9c956dd624f295aa2f + - https://github.com/dracutdevs/dracut/blob/master/man/dracut.usage.asc#injecting-custom-files + - `/etc/fstab` ? 
+ - https://kernel.org/doc/html/v4.14/admin-guide/kernel-parameters.html + - https://elixir.bootlin.com/linux/v6.0/source/init/do_mounts.c#L277 + +**do.wolfi-fedora.sh** + +```bash +set -u + +fedora_setup() { + useradd -m "${CREATE_USER}" + echo "${CREATE_USER} ALL=(ALL:ALL) NOPASSWD:ALL" | tee -a /etc/sudoers + cp -r ~/.ssh "/home/${CREATE_USER}/.ssh" + chown -R "${CREATE_USER}:" "/home/${CREATE_USER}" + + dnf upgrade -y + dnf install -y podman qemu tmux curl tar sudo + +tee -a /etc/environment <<'EOF' +EDITOR=vim +CHROOT=/tmp/decentralice-chroot +BZ_IMAGE="$(find ${CHROOT} -name vmlinuz)" +EOF +} + +fedora_setup +``` + +Run install + +```console +$ python -c 'import pathlib, sys; p = pathlib.Path(sys.argv[-1]); p.write_bytes(p.read_bytes().replace(b"\r", b""))' do.wolfi-fedora.sh +$ export REC_TITLE="Rolling Alice: Engineering Logs: OS DecentrAlice"; export REC_HOSTNAME="build.container.image.nahdig.com"; python3.9 -m asciinema rec --idle-time-limit 0.5 --title "$(date -Iseconds): ${REC_HOSTNAME} ${REC_TITLE}" --command "ssh -t -i ~/.ssh/nahdig -o UserKnownHostsFile=/dev/null -o StrictHostKeyChecking=no -o PasswordAuthentication=no root@143.110.152.152 CREATE_USER=$USER bash -xe < do.wolfi-fedora.sh" >(xz --stdout - > "$HOME/asciinema/${REC_HOSTNAME}-rec-$(date -Iseconds).json.xz") +``` + +Run build + +**Dockerfile** + +```dockerfile +# OS DecentrAlice Base Image Dockerfile +# Docs: https://github.com/intel/dffml/discussions/1406#discussioncomment-3720703 + + +# Download and build the Self Soverign Identity Service +FROM cgr.dev/chainguard/wolfi-base AS build-ssi-service + +RUN apk update && apk add --no-cache --update-cache curl go + +RUN curl -sfL https://github.com/TBD54566975/ssi-service/archive/refs/heads/main.tar.gz \ + | tar xvz \ + && cd /ssi-service-main \ + && go build -tags jwx_es256k -o /ssi-service ./cmd + + +# Download the Linux kernel and needed utils to create bootable system +FROM registry.fedoraproject.org/fedora AS osdecentralice-fedora-builder 
+ +RUN mkdir -p /build/fedora \ + && source /usr/lib/os-release \ + && dnf -y install \ + --installroot=/build/fedora \ + --releasever="${VERSION_ID}" \ + kernel-core \ + kernel-modules \ + systemd \ + systemd-networkd \ + systemd-udev \ + dracut \ + binutils \ + strace \ + kmod-libs + +# First PATH addition +# Add Fedora install PATHs to image environment +RUN mkdir -p /build/fedora/etc \ + && echo "PATH=\"\${PATH}:${PATH}:/usr/lib/dracut/\"" | tee /build/fedora/etc/environment + +RUN echo 'mount /dev/sda1 /mnt/boot' | tee /install-bootloader.sh \ + && echo 'swapon /dev/sda2' | tee -a /install-bootloader.sh \ + && echo 'mkdir -p /mnt/{proc,dev,sys}' | tee -a /install-bootloader.sh \ + && echo 'mkdir -p /mnt/var/tmp' | tee -a /install-bootloader.sh \ + && echo "cat > /mnt/run-dracut.sh <<'EOF'" | tee -a /install-bootloader.sh \ + && echo 'export PATH="${PATH}:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/lib/dracut/"' | tee -a /install-bootloader.sh \ + && echo 'export KERNEL_VERSION="$(ls /lib/modules)"' | tee -a /install-bootloader.sh \ + && echo 'bash -xp /usr/bin/dracut --uefi --kver ${KERNEL_VERSION} --kernel-cmdline "console=ttyS0 root=/dev/sda3"' | tee -a /install-bootloader.sh \ + && echo 'EOF' | tee -a /install-bootloader.sh \ + && echo 'arch-chroot /mnt /bin/bash run-dracut.sh' | tee -a /install-bootloader.sh \ + && echo 'bootctl --esp-path=/mnt/boot install' | tee -a /install-bootloader.sh \ + && mv /install-bootloader.sh /build/fedora/usr/bin/install-bootloader.sh \ + && chmod 755 /build/fedora/usr/bin/install-bootloader.sh + +RUN rm -f /sbin/init \ + && ln -s /lib/systemd/systemd /sbin/init + +# The root of the root fs +FROM scratch AS osdecentralice + +COPY --from=osdecentralice-fedora-builder /build/fedora / + +# Run depmod to build /lib/modules/${KERNEL_VERSION}/modules.dep which is +# required by dracut for efi creation. 
# RUN chroot /build/fedora /usr/bin/bash -c "depmod $(ls /build/fedora/lib/modules) -a" +ARG LINUX_CMDLINE_ROOT="PARTLABEL=Fedora" +RUN depmod $(ls /lib/modules) -a \ + && export PATH="${PATH}:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/lib/dracut/" \ + && export KERNEL_VERSION="$(ls /lib/modules)" \ + && echo 'PARTLABEL=EFI /boot vfat rw,relatime,fmask=0022,dmask=0022,codepage=437,iocharset=ascii,shortname=mixed,errors=remount-ro 0 2' | tee -a /etc/fstab \ + && echo 'PARTLABEL=Swap none swap defaults,pri=100 0 0' | tee -a /etc/fstab \ + && echo 'PARTLABEL=Fedora / ext4 rw,relatime 0 1' | tee -a /etc/fstab \ + && echo 'PARTLABEL=Wolfi /wolfi ext4 rw,relatime 0 2' | tee -a /etc/fstab \ + && bash -xp /usr/bin/dracut \ + --include /etc/fstab /etc/fstab \ + --uefi \ + --kver ${KERNEL_VERSION} \ + --kernel-cmdline "rd.luks=0 rd.lvm=0 rd.md=0 rd.dm=0 rd.shell=ttyS0 console=ttyS0 root=${LINUX_CMDLINE_ROOT}" + +# Configure getty on ttyS0 for QEMU serial +# References: +# - https://www.freedesktop.org/software/systemd/man/systemd-getty-generator.html +# - https://www.thegeekdiary.com/centos-rhel-7-how-to-configure-serial-getty-with-systemd/ +RUN cp /usr/lib/systemd/system/serial-getty@.service /etc/systemd/system/serial-getty@ttyS0.service \ + && ln -s /etc/systemd/system/serial-getty@ttyS0.service /etc/systemd/system/getty.target.wants/ + +# The Wolfi based chroot (the primary, Fedora just for boot) +FROM cgr.dev/chainguard/wolfi-base AS osdecentralice-wolfi-base + +# Install SSI Service +COPY --from=build-ssi-service /ssi-service /usr/bin/ssi-service + +# TODO(security) Pinning and hash validation on get-pip +RUN apk update && apk add --no-cache --update-cache \ + curl \ + bash \ + python3 \ + sed \ + && curl -sSL https://bootstrap.pypa.io/get-pip.py -o get-pip.py \ + && python3 get-pip.py + +# Second PATH addition +# Add Wolfi install PATHs to image environment +RUN echo "PATH=\"${PATH}\"" | tee /etc/environment + +# Install Alice +# ARG
ALICE_STATE_OF_ART=0c4b8191b13465980ced3fd1ddfbea30af3d1104 +# RUN python3 -m pip install -U setuptools pip wheel +# RUN python3 -m pip install \ +# "https://github.com/intel/dffml/archive/${ALICE_STATE_OF_ART}.zip#egg=dffml" \ +# "https://github.com/intel/dffml/archive/${ALICE_STATE_OF_ART}.zip#egg=dffml-feature-git&subdirectory=feature/git" \ +# "https://github.com/intel/dffml/archive/${ALICE_STATE_OF_ART}.zip#egg=shouldi&subdirectory=examples/shouldi" \ +# "https://github.com/intel/dffml/archive/${ALICE_STATE_OF_ART}.zip#egg=dffml-config-yaml&subdirectory=configloader/yaml" \ +# "https://github.com/intel/dffml/archive/${ALICE_STATE_OF_ART}.zip#egg=dffml-operations-innersource&subdirectory=operations/innersource" \ +# "https://github.com/intel/dffml/archive/${ALICE_STATE_OF_ART}.zip#egg=alice&subdirectory=entities/alice" + +FROM osdecentralice + +# Install the Wolfi based chroot +COPY --from=osdecentralice-wolfi-base / /wolfi + +ENTRYPOINT bash +``` + +```console +export REC_TITLE="Rolling Alice: Engineering Logs: OS DecentrAlice"; export REC_HOSTNAME="build.container.image.nahdig.com"; python3.9 -m asciinema rec --idle-time-limit 0.5 --title "$(date -Iseconds): ${REC_HOSTNAME} ${REC_TITLE}" --command "ssh -t -i ~/.ssh/nahdig -o UserKnownHostsFile=/dev/null -o StrictHostKeyChecking=no -o PasswordAuthentication=no $USER@143.110.152.152 sudo podman build -t osdecentralice:latest - < Dockerfile" >(xz --stdout - > "$HOME/asciinema/${REC_HOSTNAME}-rec-$(date -Iseconds).json.xz") +``` + +Run VM + +```bash +#!/usr/bin/env bash +set -xeuo pipefail + +# URL to the iPXE EFI firmware used to boot the live install +IPXE_EFI_ARCHLINUX_VERSION=${IPXE_EFI_ARCHLINUX_VERSION:-"16e24bec1a7c"} +IPXE_EFI_URL=${IPXE_EFI_URL:-"https://archlinux.org/static/netboot/ipxe-arch.${IPXE_EFI_ARCHLINUX_VERSION}.efi"} + +# Path on disk to the iPXE EFI firmware used to boot the live install +IPXE_EFI_PATH=${IPXE_EFI_PATH:-"${HOME}/vm/ipxe-arch.${IPXE_EFI_ARCHLINUX_VERSION}.efi"} + +# Virtual machine disk image 
where virtual machine filesystem is stored +VM_DISK=${VM_DISK:-"${HOME}/vm/image.qcow2"} +VM_KERNEL=${VM_KERNEL:-"${HOME}/vm/kernel"} + +# Block device we use as an intermediary to mount the guest filesystem from host +VM_DEV=${VM_DEV:-"/dev/nbd0"} + +# The directory where we mount the guest filesystem on the host for access and +# modification when not in use by the guest +STAGING=${STAGING:-"${HOME}/vm/decentralice-staging-chroot"} +CHROOT=${CHROOT:-"${HOME}/vm/decentralice-chroot"} + +# Extract container image to chroot +IMAGE=${IMAGE:-"localhost/osdecentralice:latest"}; + +container=$(podman run --rm -d --entrypoint tail "${IMAGE}" -F /dev/null); +trap "podman kill ${container}" EXIT +sleep 1 + +# Linux kernel command line +CMDLINE=${CMDLINE:-"console=ttyS0 root=/dev/sda3 rw resume=/dev/sda2 init=/usr/bin/init.sh"} + +# Location of qemu binary to use +QEMU=${QEMU:-"qemu-system-x86_64"} + +# Load the network block device kernel module +modprobe nbd max_part=8 + +# Unmount the virtual disk image if it is currently mounted +umount -R "${CHROOT}" || echo "Image was not mounted at ${CHROOT}" +# Disconnect the network block device +qemu-nbd --disconnect "${VM_DEV}" || echo "Image was not connected as nbd" + +mount_image() { + qemu-nbd --connect="${VM_DEV}" "${VM_DISK}" + mount "${VM_DEV}p3" "${CHROOT}" + mount "${VM_DEV}p4" "${CHROOT}/wolfi" + mount "${VM_DEV}p1" "${CHROOT}/boot" +} + +unmount_image() { + sync + umount -R "${CHROOT}" + qemu-nbd --disconnect "${VM_DEV}" +} + +run_vm() { + # Check if the block device we are going to use to mount the virtual disk image + # already exists + if [ -b "${VM_DEV}" ]; then + echo "VM_DEV already exists: ${VM_DEV}" >&2 + # exit 1 + fi + + # Create the virtual disk image and populate it if it does not exist + if [ ! 
-f "${VM_DISK}" ]; then + mkdir -p "${CHROOT}" + mkdir -p "$(dirname ${VM_DISK})" + + # Create the virtual disk image + qemu-img create -f qcow2 "${VM_DISK}" 30G + + # Use the QEMU guest utils network block device utility to mount the virtual + # disk image as the $VM_DEV device + qemu-nbd --connect="${VM_DEV}" "${VM_DISK}" + # Partition the block device + parted -s "${VM_DEV}" -- \ + mklabel gpt \ + mkpart primary fat32 1MiB 261MiB \ + "set" 1 esp on \ + mkpart primary linux-swap 261MiB 10491MiB \ + mkpart primary ext4 10491MiB 15491MiB \ + name 3 fedora \ + mkpart primary ext4 15491MiB "100%" \ + name 4 wolfi + # EFI partition + mkfs.fat -F32 -n EFI "${VM_DEV}p1" + # swap space + mkswap "${VM_DEV}p2" -L Swap + # Linux root partition (fedora) + mkfs.ext4 "${VM_DEV}p3" -L Fedora + mount "${VM_DEV}p3" "${CHROOT}" + # Linux root partition (wolfi) + mkfs.ext4 "${VM_DEV}p4" -L Wolfi + mkdir "${CHROOT}/wolfi" + mount "${VM_DEV}p4" "${CHROOT}/wolfi" + # Boot partition + mkdir "${CHROOT}/boot" + mount "${VM_DEV}p1" "${CHROOT}/boot" + + # Copy the container image filesystem out to staging + podman cp "${container}:/" "${STAGING}" + set +e + for mount in $(echo boot wolfi .); do for file in $(ls -a "${STAGING}/${mount}" | grep -v '^\.\.$' | grep -v '^\.$'); do mv "${STAGING}/${mount}/${file}" "${CHROOT}/${mount}/" || true; done; rm -rf "${STAGING}/${mount}" || true; done + set -e + GUEST_KERNEL_EFI=$(find "${CHROOT}/boot" -name 'linux*.efi') + cp "${GUEST_KERNEL_EFI}" "${VM_KERNEL}" + # TODO Copy out kernel for use for first time bootloader install call with + # -kernel $KERNEL.efi -no-reboot TODO Ideally check for successful boot + # before publish. 
+ + # $ sudo dnf -y install arch-install-scripts + # genfstab -t UUID "${CHROOT}" | tee "${CHROOT}/etc/fstab" + # export KERNEL_VERSION="$(ls ${CHROOT}/lib/modules)" + # chroot "${CHROOT}" /usr/bin/bash -xp /usr/bin/dracut \ + # --fstab /etc/fstab \ + # --add-drivers ext4 \ + # --uefi \ + # --kver ${KERNEL_VERSION} \ + # --kernel-cmdline "rd.luks=0 rd.lvm=0 rd.md=0 rd.dm=0 console=ttyS0" + # --kernel-cmdline "rd.luks=0 rd.lvm=0 rd.md=0 rd.dm=0 console=ttyS0 root=${LINUX_CMDLINE_ROOT}" + + # Unmount the virtual disk image so the virtual machine can use it + unmount_image + fi + + # TODO Move into disk creation + # Copy out kernel for use for first time bootloader install call with + # -kernel $KERNEL.efi -no-reboot + "${QEMU}" \ + -no-reboot \ + -kernel "${VM_KERNEL}" \ + -append "console=ttyS0 systemd.log_level=9 rd.shell rd.debug log_buf_len=1M root=PARTLABEL=fedora" \ + -smp cpus=2 \ + -m 4096M \ + -enable-kvm \ + -nographic \ + -cpu host \ + -drive file="${VM_DISK}",if=virtio,aio=threads,format=qcow2 \ + -bios /usr/share/edk2/ovmf/OVMF_CODE.fd + # -drive file="${VM_DISK}",index=0,media=disk,format=qcow2 \ + + exit 0 + + if [[ ! -f "${IPXE_EFI_PATH}" ]]; then + curl -sfLC - -o "${IPXE_EFI_PATH}" "${IPXE_EFI_URL}" + fi + + # Only add -kernel for first install + # -kernel /vm/ipxe*.efi \ + + "${QEMU}" \ + -smp cpus=2 \ + -m 4096M \ + -enable-kvm \ + -nographic \ + -cpu host \ + -drive file="${VM_DISK}",index=0,media=disk,format=qcow2 \ + -bios /usr/share/edk2/ovmf/OVMF_CODE.fd $@ +} + +run_vm $@ +``` + +**TODO** Do we have to boot to PXE? Can we boot directly to the EFI stub we just created with dracut? 
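The TODO above can be sketched out: QEMU's `-kernel` option will load the unified EFI kernel image dracut produced, which embeds its own initramfs, so iPXE may not be needed at all. A minimal Python sketch (not from the original script; the disk, kernel, and firmware paths are assumptions mirroring the `VM_DISK`, `VM_KERNEL`, and `-bios` defaults above) that composes such an invocation:

```python
import shlex

def qemu_direct_boot_argv(
    disk="/root/vm/image.qcow2",   # assumed VM_DISK location
    kernel="/root/vm/kernel",      # the copied linux-*.efi unified kernel image
    cmdline="console=ttyS0 root=PARTLABEL=fedora rw",
    firmware="/usr/share/edk2/ovmf/OVMF_CODE.fd",
):
    # The unified kernel image built by dracut --uefi embeds its own
    # initramfs, so no -initrd argument is passed; -append supplies a
    # command line (precedence against an embedded .cmdline section can
    # vary by stub, so this is for debugging overrides).
    return [
        "qemu-system-x86_64",
        "-kernel", kernel,
        "-append", cmdline,
        "-drive", f"file={disk},if=virtio,format=qcow2",
        "-bios", firmware,
        "-smp", "cpus=2",
        "-m", "4096M",
        "-enable-kvm",
        "-cpu", "host",
        "-nographic",
    ]

# Print the shell-quoted command for copy/paste into a terminal
print(shlex.join(qemu_direct_boot_argv()))
```

If this works, the whole iPXE download and live-install path in `run_vm` could be dropped in favor of a single direct-kernel boot.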
+Run install via arch live environment iPXE booted to + +```console +$ scp -i ~/.ssh/nahdig -o UserKnownHostsFile=/dev/null -o StrictHostKeyChecking=no -o PasswordAuthentication=no decentralice.sh $USER@143.110.152.152:./ +$ ssh -t -i ~/.ssh/nahdig -o UserKnownHostsFile=/dev/null -o StrictHostKeyChecking=no -o PasswordAuthentication=no $USER@143.110.152.152 sudo rm -f /root/vm/image.qcow2 +$ export REC_TITLE="Rolling Alice: Engineering Logs: OS DecentrAlice"; export REC_HOSTNAME="build.container.image.nahdig.com"; python3.9 -m asciinema rec --idle-time-limit 0.5 --title "$(date -Iseconds): ${REC_HOSTNAME} ${REC_TITLE}" --command "ssh -t -i ~/.ssh/nahdig -o UserKnownHostsFile=/dev/null -o StrictHostKeyChecking=no -o PasswordAuthentication=no $USER@143.110.152.152 sudo bash decentralice.sh -kernel /root/vm/kernel -no-reboot" >(xz --stdout - > "$HOME/asciinema/${REC_HOSTNAME}-rec-$(date -Iseconds).json.xz") +``` + +Run normal startup + +```console +$ scp -i ~/.ssh/nahdig -o UserKnownHostsFile=/dev/null -o StrictHostKeyChecking=no -o PasswordAuthentication=no decentralice.sh $USER@143.110.152.152:./ +$ export REC_TITLE="Rolling Alice: Engineering Logs: OS DecentrAlice"; export REC_HOSTNAME="build.container.image.nahdig.com"; python3.9 -m asciinema rec --idle-time-limit 0.5 --title "$(date -Iseconds): ${REC_HOSTNAME} ${REC_TITLE}" --command "ssh -t -i ~/.ssh/nahdig -o UserKnownHostsFile=/dev/null -o StrictHostKeyChecking=no -o PasswordAuthentication=no $USER@143.110.152.152 bash decentralice.sh" >(xz --stdout - > "$HOME/asciinema/${REC_HOSTNAME}-rec-$(date -Iseconds).json.xz") +``` + +Run regular ssh session for debug + +```console +$ export REC_TITLE="Rolling Alice: Engineering Logs: OS DecentrAlice"; export REC_HOSTNAME="build.container.image.nahdig.com"; python3.9 -m asciinema rec --idle-time-limit 0.5 --title "$(date -Iseconds): ${REC_HOSTNAME} ${REC_TITLE}" --command "ssh -t -i ~/.ssh/nahdig -o UserKnownHostsFile=/dev/null -o StrictHostKeyChecking=no -o 
PasswordAuthentication=no root@143.110.152.152" >(xz --stdout - > "$HOME/asciinema/${REC_HOSTNAME}-rec-$(date -Iseconds).json.xz") +``` + +```console +[pdxjohnny@fedora-s-4vcpu-8gb-sfo3-01 ~]$ sudo fdisk -l /dev/nbd0 -x +Disk /dev/nbd0: 30 GiB, 32212254720 bytes, 62914560 sectors +Units: sectors of 1 * 512 = 512 bytes +Sector size (logical/physical): 512 bytes / 512 bytes +I/O size (minimum/optimal): 512 bytes / 512 bytes +Disklabel type: gpt +Disk identifier: DEC7B131-9DBB-4FD5-8789-AE383F16C1C5 +First usable LBA: 34 +Last usable LBA: 62914526 +Alternative LBA: 62914559 +Partition entries starting LBA: 2 +Allocated partition entries: 128 +Partition entries ending LBA: 33 + +Device Start End Sectors Type-UUID UUID Name Attrs +/dev/nbd0p1 2048 534527 532480 C12A7328-F81F-11D2-BA4B-00A0C93EC93B 6767EC6D-A612-4B1F-B390-8F15284F134E primary +/dev/nbd0p2 534528 21485567 20951040 0657FD6D-A4AB-43C4-84E5-0933C84B4F4F 58D5880D-D3EA-4B57-85AB-E08A3AB8D6F3 primary +/dev/nbd0p3 21485568 31725567 10240000 0FC63DAF-8483-4772-8E79-3D69D8477DE4 38CC9A55-724F-47D6-A17E-EF6F2DAB2F1F fedora +/dev/nbd0p4 31725568 62912511 31186944 0FC63DAF-8483-4772-8E79-3D69D8477DE4 B8D4F18B-40CF-4A69-A6F4-BB3C1DDB9ABC wolfi +``` + +Got dropped to dracut shell + +```console +:/root# blkid +/dev/vda4: LABEL="Wolfi" UUID="1b01665f-1a3d-4bde-a9b4-cc484529e999" BLOCK_SIZE="4096" TYPE="ext4" PARTLABEL="wolfi" PARTUUID="dfc228b1-76d4-42ef-8132-f1a0707ea3e1" +/dev/vda2: LABEL="Swap" UUID="d212c4f0-c61a-4762-9b5f-af2c2595b0d1" TYPE="swap" PARTLABEL="primary" PARTUUID="88a54dc7-ed14-431c-a9e9-39913d5cea7e" +/dev/vda3: LABEL="Fedora" UUID="559359d9-d88b-40d2-a0ae-ca0ce68b7fc7" BLOCK_SIZE="4096" TYPE="ext4" PARTLABEL="fedora" PARTUUID="2fd26f17-508e-4fab-a8e7-e9f434fc2e94" +/dev/vda1: UUID="BEB1-9DC4" BLOCK_SIZE="512" TYPE="vfat" PARTLABEL="primary" PARTUUID="0699ba50-02d6-4ef6-a0b2-d1f1ab03f6f6" +``` + +- TODO +- Future + - [ ] `alice shell` overlay to CSP of choice to start VM and then ssh in with recorded 
session (optionally via overlays) + - https://github.com/intel/dffml/commit/54a272822eeef759668b7396cf8c70beca352687 + - [ ] kernel cmdline (bpf?) DERP -> wireguard -> nfs (overlays applied as systemd files added) \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0051/index.md b/docs/discussions/alice_engineering_comms/0051/index.md new file mode 100644 index 0000000000..d9d9642d72 --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0051/index.md @@ -0,0 +1 @@ +# 2022-10-10 Engineering Logs \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0051/reply_0000.md b/docs/discussions/alice_engineering_comms/0051/reply_0000.md new file mode 100644 index 0000000000..3ac59c6a7a --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0051/reply_0000.md @@ -0,0 +1,153 @@ +## 2022-10-10 @pdxjohnny Engineering Logs + +- OS DecentrAlice: dracut fstab +- [Volume 0: Chapter 5: Stream of Consciousness](https://github.com/intel/dffml/blob/alice/docs/tutorials/rolling_alice/0000_architecting_alice/0005_stream_of_consciousness.md) +- [2022-10-10 IETF SCITT Weekly](https://github.com/intel/dffml/discussions/1406?sort=new#discussioncomment-3840337) +- [Dump GitHub Discussion to JSON 2022-10-10T17:58:31+00:00](https://gist.github.com/pdxjohnny/9f3dc18f0a42d3107aaa2363331d8faa) +- https://gist.github.com/pdxjohnny/a0dc3a58b4651dc3761bee65a198a80d#file-run-vm-sh-L174-L200 +- https://gist.github.com/pdxjohnny/b5f757eee43d84b1600dce7896230c37 +- https://github.com/systemd/systemd/issues/16714 +- https://forums.raspberrypi.com/viewtopic.php?p=1632011 +- https://en.wikipedia.org/wiki/Fstab +- KERI + - https://github.com/WebOfTrust/vLEI + - https://github.com/GLEIF-IT/sally + - https://github.com/WebOfTrust/keripy + - https://github.com/WebOfTrust/keripy/blob/development/ref/getting_started.md + - https://github.com/decentralized-identity/keri-dht-py + - https://github.com/orgs/WebOfTrust/projects/2 + - 
https://github.com/WebOfTrust/keripy/blob/development/ref/getting_started.md#direct-mode +- A Shell for a Ghost + - https://rich.readthedocs.io/en/latest/live.html +- DID Method Registry + - Open Architecture and Alice + - Entrypoints as DIDs for dataflows and overlays; key / id is the hash of the system context to be executed, with negotiation in cached state snapshots embedded into the system context (static or data flow seed) + - GraphQL and something like Orie was doing with Cypher for visualization, and/or use JSON Crack first for editing, to allow for credential manifest definition and verification for overlays selected to load from network(s): the active lines of communication we have open at any given time, even when ephemeral. + - https://github.com/w3c/did-spec-registries/ + - https://github.com/w3c/did-spec-registries/blob/main/tooling/did-method-registry-entry.yml + - https://github.com/pdxjohnny/did-spec-registries/new/open-architecture-and-alice/methods +- References + - https://www.vim.org/download.php + - https://github.com/vim/vim-win32-installer/releases/download/v9.0.0000/gvim_9.0.0000_x86_signed.exe + - https://github.com/graph4ai/graph4nlp + - https://gitlab.com/gitlab-org/gitlab/-/issues/371098 + - https://vulns.xyz/2022/05/auth-tarball-from-git/ + - https://github.com/kpcyrd/rebuilderd + - https://stackoverflow.com/questions/10082517/simplest-tool-to-measure-c-program-cache-hit-miss-and-cpu-time-in-linux/10114325#10114325 + - https://www.nature.com/articles/nature22031 + - > Using numerical simulations and mathematical derivation, we identify how a discrete von Neumann cellular automaton emerges from a continuous Turing reaction–diffusion system. 
+ - Collective Intelligence + +```console +$ ssh -t -i ~/.ssh/nahdig -o UserKnownHostsFile=/dev/null -o StrictHostKeyChecking=no -o PasswordAuthentication=no $USER@143.110.152.152 sudo rm -f /root/vm/image.qcow2 && scp -i ~/.ssh/nahdig -o UserKnownHostsFile=/dev/null -o StrictHostKeyChecking=no -o PasswordAuthentication=no decentralice.sh $USER@143.110.152.152:./ && export REC_TITLE="Rolling Alice: Engineering Logs: OS DecentrAlice"; export REC_HOSTNAME="build.container.image.nahdig.com"; python3.9 -m asciinema rec --idle-time-limit 0.5 --title "$(date -Iseconds): ${REC_HOSTNAME} ${REC_TITLE}" --command "ssh -t -i ~/.ssh/nahdig -o UserKnownHostsFile=/dev/null -o StrictHostKeyChecking=no -o PasswordAuthentication=no $USER@143.110.152.152 sudo bash decentralice.sh -kernel /root/vm/kernel -command 'console=ttyS0 systemd.log_level=9'" >(xz --stdout - > "$HOME/asciinema/${REC_HOSTNAME}-rec-$(date -Iseconds).json.xz") +``` + +```powershell +PS C:\Users\Johnny> python -m venv .venv.windows +PS C:\Users\Johnny> .\.venv.windows\Scripts\activate +You should consider upgrading via the 'C:\Users\Johnny\.venv.windows\Scripts\python.exe -m pip install --upgrade pip' command. 
+(.venv.windows) PS C:\Users\Johnny> python -m pip install -U pip setuptools wheel +Requirement already satisfied: pip in c:\users\johnny\.venv.windows\lib\site-packages (21.2.3) +Collecting pip + Using cached pip-22.2.2-py3-none-any.whl (2.0 MB) +Requirement already satisfied: setuptools in c:\users\johnny\.venv.windows\lib\site-packages (57.4.0) +Collecting setuptools + Using cached setuptools-65.4.1-py3-none-any.whl (1.2 MB) +Collecting wheel + Using cached wheel-0.37.1-py2.py3-none-any.whl (35 kB) +Installing collected packages: wheel, setuptools, pip + Attempting uninstall: setuptools + Found existing installation: setuptools 57.4.0 + Uninstalling setuptools-57.4.0: + Successfully uninstalled setuptools-57.4.0 + Attempting uninstall: pip + Found existing installation: pip 21.2.3 + Uninstalling pip-21.2.3: + Successfully uninstalled pip-21.2.3 +Successfully installed pip-22.2.2 setuptools-65.4.1 wheel-0.37.1 +PS C:\Users\Johnny> python -m pip install asciinema +Collecting asciinema + Downloading asciinema-2.2.0-py3-none-any.whl (92 kB) + |████████████████████████████████| 92 kB 202 kB/s +Installing collected packages: asciinema +Successfully installed asciinema-2.2.0 +(.venv.windows) PS C:\Users\Johnny> cd .\Documents\python\dffml\ +(.venv.windows) PS C:\Users\Johnny\Documents\python\dffml> dir + + + Directory: C:\Users\Johnny\Documents\python\dffml + + +Mode LastWriteTime Length Name +---- ------------- ------ ---- +d----- 2/20/2022 3:11 PM .ci +d----- 2/4/2022 9:26 PM .github +d----- 2/20/2022 3:11 PM .vscode +d----- 2/4/2022 9:26 PM configloader +d----- 2/20/2022 3:14 PM dffml +d----- 2/20/2022 3:11 PM dffml.egg-info +d----- 2/4/2022 9:28 PM dist +d----- 2/20/2022 3:14 PM docs +d----- 2/20/2022 3:11 PM examples +d----- 2/4/2022 9:26 PM feature +d----- 2/4/2022 9:26 PM model +d----- 2/20/2022 3:11 PM news +d----- 2/20/2022 3:14 PM operations +d----- 2/20/2022 3:11 PM scripts +d----- 2/4/2022 9:26 PM service +d----- 2/20/2022 3:14 PM source +d----- 2/20/2022 
3:14 PM tests +-a---- 2/4/2022 9:26 PM 170 .coveragerc +-a---- 2/4/2022 9:26 PM 260 .deepsource.toml +-a---- 2/4/2022 9:26 PM 42 .dockerignore +-a---- 2/4/2022 9:26 PM 68 .gitattributes +-a---- 2/20/2022 3:11 PM 519 .gitignore +-a---- 2/20/2022 3:11 PM 431 .gitpod.yml +-a---- 2/20/2022 3:11 PM 437 .lgtm.yml +-a---- 2/20/2022 3:11 PM 97 .pre-commit-config.yaml +-a---- 2/4/2022 9:26 PM 79 .pylintrc +-a---- 2/20/2022 3:14 PM 29994 CHANGELOG.md +-a---- 2/4/2022 9:26 PM 112 CONTRIBUTING.md +-a---- 2/20/2022 3:11 PM 3425 Dockerfile +-a---- 2/4/2022 9:26 PM 1088 LICENSE +-a---- 2/4/2022 9:26 PM 68 MANIFEST.in +-a---- 2/20/2022 3:14 PM 480 pyproject.toml +-a---- 2/20/2022 3:14 PM 3002 README.md +-a---- 2/20/2022 3:14 PM 370 requirements-dev.txt +-a---- 2/4/2022 9:26 PM 641 SECURITY.md +-a---- 2/20/2022 3:14 PM 7739 setup.py + + +(.venv.windows) PS C:\Users\Johnny\Documents\python\dffml> git status +Refresh index: 100% (1147/1147), done. +On branch manifest +Your branch is up to date with 'pdxjohnny/manifest'. + +Changes not staged for commit: + (use "git add ..." to update what will be committed) + (use "git restore ..." 
to discard changes in working directory)" + modified: dffml/util/testing/consoletest/commands.py + +no changes added to commit (use "git add" and/or "git commit -a") +(.venv.windows) PS C:\Users\Johnny\Documents\python\dffml> git diff +diff --git a/dffml/util/testing/consoletest/commands.py b/dffml/util/testing/consoletest/commands.py +index 7807c99ff..f83d3fb12 100644 +--- a/dffml/util/testing/consoletest/commands.py ++++ b/dffml/util/testing/consoletest/commands.py +@@ -7,7 +7,6 @@ import sys + import json + import time + import copy +-import fcntl + import shlex + import signal + import atexit +(.venv.windows) PS C:\Users\Johnny\Documents\python\dffml> git log -n 1 +commit 80dc54afb6ee201342ba216fecfaf5ae160686a7 (HEAD -> manifest, pdxjohnny/manifest) +Author: John Andersen +Date: Sat Feb 19 20:35:22 2022 -0800 + + operations: innersource: Fix tests to clone and check for workflows using git operations + + Signed-off-by: John Andersen +``` \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0051/reply_0001.md b/docs/discussions/alice_engineering_comms/0051/reply_0001.md new file mode 100644 index 0000000000..167e0464c1 --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0051/reply_0001.md @@ -0,0 +1,37 @@ +## 2022-10-10 IETF SCITT Weekly + +- Previous meeting notes: [2022-09-29 IETF SCITT Technical Meeting](https://github.com/intel/dffml/discussions/1406?sort=new#discussioncomment-3763647) +- Charter is expected to be finalized by tomorrow + - We had about 4+ weeks of review (which is good, we wanted to have time for people to review) + - Will follow the IETF process more rigorously once initiated (we don't know all of what that entails yet :) + - We will then have IETF tools at our workgroup's disposal +- We are currently meeting a lot + - We will swamp the upcoming meeting schedule this way + - We will have three interims per two weeks if we maintain our current cadence + - We might be overusing the meeting system + - Two 
tracks + - Weekly Monday + - Fortnightly technical + - working group formal chairs will do this + - Eliot seems unlikely to have bandwidth beyond the BoF +- Upcoming IETF 115 + - Will do sequence diagram hacking + - They will have a remote experience so that others can feel like they are in Europe at the table via 360 degree camera + - Orie will be there at 1:15 + - Goals + - Ensure we have a thorough software use case doc +- preliminary agenda: https://datatracker.ietf.org/meeting/115/agenda/ + - https://www.ietf.org/how/runningcode/hackathons/115-hackathon/ + - https://wiki.ietf.org/en/meeting/115/hackathon + - https://datatracker.ietf.org/meeting/115/important-dates/ + - chair logistics - Chairs 10 min + - starting adoption of first I-D (architecture) - Henk 20 min + - receipt definition (recap & discussion) - Sylvan 15 min + - COSE merkle tree proofs (options, pros & cons) - Maik 20 min + - detailed use case I-D: software supply chain - Orie 25 min +- How do we deal with SPDX no assertion on insert? +- TODO + - [ ] Add self attestations to software use case flow chart + - [ ] Ensure we mention how this works with the standard github workflow and sigstore + - [ ] I have vetted this via code review + - [ ] NIST currently only cares about the presence of the SBOM as the attestation (case 0) \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0052/index.md b/docs/discussions/alice_engineering_comms/0052/index.md new file mode 100644 index 0000000000..87093a986a --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0052/index.md @@ -0,0 +1,8 @@ +# 2022-10-11 Engineering Logs + +- First automated async comms post worked!
https://github.com/intel/dffml/actions/workflows/alice_async_comms.yml + - https://docs.github.com/en/actions/creating-actions/metadata-syntax-for-github-actions#branding +- SCITT + - https://github.com/ietf-scitt/scitt-web/blob/main/content/what-is-scitt.md +- Issue Ops + - https://github.com/valet-customers/issue-ops \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0052/reply_0000.md b/docs/discussions/alice_engineering_comms/0052/reply_0000.md new file mode 100644 index 0000000000..79eb5400f6 --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0052/reply_0000.md @@ -0,0 +1,27 @@ +## 2022-10-11 @pdxjohnny Engineering Logs + +- https://docs.github.com/en/actions/security-guides/automatic-token-authentication +- source data flow as class + - update + - record to mongo doc operation + - overlay/ride for custom (camel case feature keys for example) + - mongo doc upsert operation +- https://mobile.twitter.com/kpcyrd/status/1579617445824040960 + - > I don't think there's anything that can be used as an unlink(2) primitive, the Docker Image Spec has something vaguely similar by special-casing files that start with `.wh.`, putting `RUN touch /etc/.wh.os-release` in your Dockerfile deletes /etc/os-release in the final image. 
🥷 +- https://www.civo.com/learn/kubernetes-power-for-virtual-machines-using-kubevirt +- https://github.com/kubevirt/kubevirt +- https://github.com/dffml/dffml-pre-image-removal/commits/shouldi_dep_tree +- https://github.com/chainguard-dev/melange/pull/128/files + - Golang CLI library Cobra has docs generation +- https://github.com/intel/dffml/actions/runs/3228504774/jobs/5284698480 + - Manifest consumption worked + - https://github.com/intel/dffml/commit/0ba6357165cfd69583a7564edf8ec6d77157fcfa + +``` +Error response from daemon: failed to create shim: OCI runtime create failed: runc create failed: unable to start container process: exec: "tail": executable file not found in $PATH: unknown +``` + +[Build: Images: Containers: .github#L1](https://github.com/intel/dffml/commit/74f80dd25577b4047429b00a880f06aaa74829bc#annotation_4889996315) +``` +Error when evaluating 'strategy' for job 'build'. intel/dffml/.github/workflows/build_images_containers.yml@74f80dd25577b4047429b00a880f06aaa74829bc (Line: 64, Col: 19): Error parsing fromJson,intel/dffml/.github/workflows/build_images_containers.yml@74f80dd25577b4047429b00a880f06aaa74829bc (Line: 64, Col: 19): Invalid property identifier character: \. Path '[0]', line 1, position 2.,intel/dffml/.github/workflows/build_images_containers.yml@74f80dd25577b4047429b00a880f06aaa74829bc (Line: 64, Col: 19): Unexpected type of value '', expected type: Sequence. 
+``` \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0053/index.md b/docs/discussions/alice_engineering_comms/0053/index.md new file mode 100644 index 0000000000..8ec75aaf8a --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0053/index.md @@ -0,0 +1 @@ +# 2022-10-12 Engineering Logs \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0053/reply_0000.md b/docs/discussions/alice_engineering_comms/0053/reply_0000.md new file mode 100644 index 0000000000..ab8c4cf097 --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0053/reply_0000.md @@ -0,0 +1,96 @@ +- https://docs.github.com/en/developers/webhooks-and-events/webhooks/webhook-events-and-payloads#push +- https://docs.github.com/en/actions/using-workflows/events-that-trigger-workflows#push + +```console +$ git log -n 2 +commit b6f9725a5eaa1696904a6b07ded61a27ba5e5b29 (HEAD -> alice, upstream/alice) +Author: john-s-andersen +Date: Wed Oct 12 18:00:57 2022 +0000 + + util: df: internal: Fix for Python 3.9.13 hasattr not detecting NewType.__supertype__ in generator + + Signed-off-by: john-s-andersen + +commit fb5d646e7099f62cb5c34b936d19c1af30c055a7 +Author: John Andersen +Date: Tue Oct 11 17:56:59 2022 -0700 + + docs: tutorials: rolling alice: forward: Add link to John^2 Living Threat Models Are Better Than Dead Threat Models talk +$ gh api https://api.github.com/repos/intel/dffml/compare/fb5d646e7099f62cb5c34b936d19c1af30c055a7...b6f9725a5eaa1696904a6b07ded61a27ba5e5b29 | jq -r '.files[].filename' +dffml/util/df/internal.py +``` + +- Clipped API output + +```json +{ + "files": [ + { + "sha": "55960cf9ea7036a0fcfd68d7799ff1567a876158", + "filename": "dffml/util/df/internal.py", + "status": "modified", + "additions": 4, + "deletions": 1, + "changes": 5, + "blob_url": "https://github.com/intel/dffml/blob/b6f9725a5eaa1696904a6b07ded61a27ba5e5b29/dffml%2Futil%2Fdf%2Finternal.py", + "raw_url": 
"https://github.com/intel/dffml/raw/b6f9725a5eaa1696904a6b07ded61a27ba5e5b29/dffml%2Futil%2Fdf%2Finternal.py", + "contents_url": "https://api.github.com/repos/intel/dffml/contents/dffml%2Futil%2Fdf%2Finternal.py?ref=b6f9725a5eaa1696904a6b07ded61a27ba5e5b29", + "patch": "@@ -24,6 +24,9 @@ def object_to_operations(obj, module=None):\n obj,\n predicate=lambda i: inspect.ismethod(i)\n or inspect.isfunction(i)\n- and not hasattr(i, \"__supertype__\"),\n+ and not hasattr(i, \"__supertype__\")\n+ # NOTE HACK + Fails in 3.9.13 to remove\n+ # NewType without the check in the str repr.\n+ and \" NewType \" not in str(i),\n )\n ]" + } + ] +} +``` + +```python +import os +import json +import pathlib +import urllib.request + +owner, repository = os.environ["OWNER_REPOSITORY"].split("/", maxsplit=1) + +with urllib.request.urlopen( + urllib.request.Request( + os.environ["COMPARE_URL"], + headers={ + "Authorization": "bearer " + os.environ["GH_ACCESS_TOKEN"], + }, + ) +) as response: + response_json = json.load(response) + +# Build the most recent commit +commit = response_json["commits"][-1]["sha"] + +manifest = list([ + { + "image_name": pathlib.Path(compare_file["filename"]).stem, + "dockerfile": compare_file["filename"], + "owner": owner, + "repository": repository, + "branch": os.environ["BRANCH"], + "commit": commit, + } + for compare_file in response_json["files"] + if compare_file["filename"].startswith(os.environ["PREFIX"]) +]) + +print(json.dumps(manifest, sort_keys=True, indent=4)) +print("::set-output name=matrix::" + json.dumps({"include": manifest})) +``` + +```console +$ PREFIX=dffml GH_ACCESS_TOKEN=$(grep oauth_token < ~/.config/gh/hosts.yml | sed -e 's/ oauth_token: //g') BRANCH=main OWNER_REPOSITORY=intel/dffml COMPARE_URL=https://api.github.com/repos/intel/dffml/compare/a75bef07fd1279f1a36a601d4e652c2b97bfa1de...b6f9725a5eaa1696904a6b07ded61a27ba5e5b29 python test.py +[ + { + "branch": "main", + "commit": "b6f9725a5eaa1696904a6b07ded61a27ba5e5b29", + 
"dockerfile": "dffml-base.Dockerfile", + "image_name": "dffml-base", + "owner": "intel", + "repository": "dffml" + } +] +::set-output name=matrix::{"include": [{"image_name": "dffml-base", "dockerfile": "dffml-base.Dockerfile", "owner": "intel", "repository": "dffml", "branch": "main", "commit": "b6f9725a5eaa1696904a6b07ded61a27ba5e5b29"}]} +``` \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0053/reply_0001.md b/docs/discussions/alice_engineering_comms/0053/reply_0001.md new file mode 100644 index 0000000000..4668dd25a4 --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0053/reply_0001.md @@ -0,0 +1,13 @@ +## 2022-10-12 Rolling Alice: Architecting Alice: OS DecentrAlice: Engineering Logs + +```console +$ mkdir -p $(dirname /boot/EFI/BOOT/BOOTX64.EFI) +$ cp boot/efi/EFI/Linux/linux-*.efi /boot/EFI/BOOT/BOOTX64.EFI +``` + +- New approach, fedora cloud `.iso` -> qemu (`qemu-img convert .iso .qcow2`) +- `qemu-img resize fedora.qcow2 +10G` +- mess with partition tables to create new partition +- Dump wolfi to it +- Configure systemd to start sshd from wolfi +- John ran out of disk space again \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0054/index.md b/docs/discussions/alice_engineering_comms/0054/index.md new file mode 100644 index 0000000000..b0593cd421 --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0054/index.md @@ -0,0 +1,8 @@ +# 2022-10-13 Engineering Logs + +- SCITT + - https://github.com/ietf-scitt/scitt-web/blob/main/content/what-is-scitt.md + - https://medium.com/@nis.jespersen/the-united-nations-trust-graph-d65af7b0b678 + - [2022-10-13 IETF SCITT Technical Meeting](https://github.com/intel/dffml/discussions/1406?sort=new#discussioncomment-3871185) +- References + - https://github.com/transmute-industries/jsonld-to-cypher \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0054/reply_0000.md 
b/docs/discussions/alice_engineering_comms/0054/reply_0000.md new file mode 100644 index 0000000000..8ccaf71bb5 --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0054/reply_0000.md @@ -0,0 +1,270 @@ +## 2022-10-13 Rolling Alice: Architecting Alice: OS DecentrAlice: Engineering Logs + +- New approach, fedora cloud `.iso` -> qemu (`qemu-img convert .iso .qcow2`) +- `qemu-img resize fedora.qcow2 +10G` +- mess with partition tables to create new partition +- Dump wolfi to it +- Configure systemd to start sshd from wolfi +- Configure systemd to start actions runner from wolfi +- Run `alice shouldi contribute` data flows +- sigstore github actions OIDC token + - self-attested (github assisted) scan data + - SCITT OpenSSF Metrics Use Case + - https://github.com/pdxjohnny/use-cases/blob/openssf_metrics/openssf_metrics.md +- Future + - TPM secure boot on the VM +- References + - https://www.qemu.org/docs/master/system/images.html + - https://duckduckgo.com/?q=raw+to+qcow2&ia=web + - https://www.aptgetlife.co.uk/kvm-converting-virtual-disks-from-raw-img-files-to-qcow2/ + - https://alt.fedoraproject.org/cloud/ + - https://download.fedoraproject.org/pub/fedora/linux/releases/36/Cloud/x86_64/images/Fedora-Cloud-Base-36-1.5.x86_64.raw.xz + - Cloud Base compressed raw image + - https://download.fedoraproject.org/pub/fedora/linux/releases/36/Cloud/x86_64/images/Fedora-Cloud-Base-36-1.5.x86_64.qcow2 + - Cloud Base image for Openstack + +```console +$ qemu-img convert -O qcow2 -p Fedora-Cloud-Base-36-1.5.x86_64.raw Fedora-Cloud-Base-36-1.5.x86_64.qcow2 +(0.00/100%) +``` + +```console +$ curl -sfLOC - https://download.fedoraproject.org/pub/fedora/linux/releases/36/Cloud/x86_64/images/Fedora-Cloud-Base-36-1.5.x86_64.qcow2 +$ qemu-img resize Fedora-Cloud-Base-36-1.5.x86_64.qcow2 +10G +$ sudo dnf -y install guestfs-tools libvirt +$ sudo systemctl enable --now libvirtd +$ LIBGUESTFS_BACKEND=direct sudo -E virt-filesystems --long -h --all -a Fedora-Cloud-Base-36-1.5.x86_64.qcow2
+Name Type VFS Label MBR Size Parent +/dev/sda1 filesystem unknown - - 1.0M - +/dev/sda2 filesystem ext4 boot - 966M - +/dev/sda3 filesystem vfat - - 100M - +/dev/sda4 filesystem unknown - - 4.0M - +/dev/sda5 filesystem btrfs fedora - 3.9G - +btrfsvol:/dev/sda5/root filesystem btrfs fedora - - - +btrfsvol:/dev/sda5/home filesystem btrfs fedora - - - +btrfsvol:/dev/sda5/root/var/lib/portables filesystem btrfs fedora - - - +/dev/sda1 partition - - - 1.0M /dev/sda +/dev/sda2 partition - - - 1000M /dev/sda +/dev/sda3 partition - - - 100M /dev/sda +/dev/sda4 partition - - - 4.0M /dev/sda +/dev/sda5 partition - - - 3.9G /dev/sda +/dev/sda device - - - 5.0G - +$ qemu-img resize Fedora-Cloud-Base-36-1.5.x86_64.qcow2 +10G +Image resized. +$ LIBGUESTFS_BACKEND=direct sudo -E virt-filesystems --long -h --all -a Fedora-Cloud-Base-36-1.5.x86_64.qcow2 +Name Type VFS Label MBR Size Parent +/dev/sda1 filesystem unknown - - 1.0M - +/dev/sda2 filesystem ext4 boot - 966M - +/dev/sda3 filesystem vfat - - 100M - +/dev/sda4 filesystem unknown - - 4.0M - +/dev/sda5 filesystem btrfs fedora - 3.9G - +btrfsvol:/dev/sda5/root filesystem btrfs fedora - - - +btrfsvol:/dev/sda5/home filesystem btrfs fedora - - - +btrfsvol:/dev/sda5/root/var/lib/portables filesystem btrfs fedora - - - +/dev/sda1 partition - - - 1.0M /dev/sda +/dev/sda2 partition - - - 1000M /dev/sda +/dev/sda3 partition - - - 100M /dev/sda +/dev/sda4 partition - - - 4.0M /dev/sda +/dev/sda5 partition - - - 3.9G /dev/sda +/dev/sda device - - - 15G - +``` + +```console +$ cp Fedora-Cloud-Base-36-1.5.x86_64.qcow2.bak Fedora-Cloud-Base-36-1.5.x86_64.qcow2 $ truncate -r Fedora-Cloud-Base-36-1.5.x86_64.qcow2 Fedora-Cloud-Base-36-1.5.x86_64.2.qcow2 +$ truncate -s +20GB Fedora-Cloud-Base-36-1.5.x86_64.2.qcow2 $ LIBGUESTFS_BACKEND=direct sudo -E virt-resize --resize /dev/sda5=+1G Fedora-Cloud-Base-36-1.5.x86_64.qcow2 Fedora-Cloud-Base-36-1.5.x86_64.2.qcow2 +[ 0.0] Examining Fedora-Cloud-Base-36-1.5.x86_64.qcow2 +********** + +Summary of 
changes: + +virt-resize: /dev/sda1: This partition will be left alone. + +virt-resize: /dev/sda2: This partition will be left alone. + +virt-resize: /dev/sda3: This partition will be left alone. + +virt-resize: /dev/sda4: This partition will be left alone. + +virt-resize: /dev/sda5: This partition will be resized from 3.9G to 4.9G. + +virt-resize: There is a surplus of 13.0G. An extra partition will be +created for the surplus. + +********** +[ 7.9] Setting up initial partition table on Fedora-Cloud-Base-36-1.5.x86_64.2.qcow2 +[ 28.5] Copying /dev/sda1 +[ 28.5] Copying /dev/sda2 + 100% ⟦▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒⟧ 00:00 +[ 37.0] Copying /dev/sda3 +[ 37.3] Copying /dev/sda4 +[ 37.4] Copying /dev/sda5 + 100% ⟦▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒⟧ 00:00 + +virt-resize: Resize operation completed with no errors. Before deleting +the old disk, carefully check that the resized disk boots and works +correctly. 
+``` + +- https://linux.die.net/man/1/virt-resize + +```console +$ curl -sfLOC - https://download.fedoraproject.org/pub/fedora/linux/releases/36/Cloud/x86_64/images/Fedora-Cloud-Base-36-1.5.x86_64.qcow2 +$ qemu-img resize Fedora-Cloud-Base-36-1.5.x86_64.qcow2 +10G +$ sudo dnf -y install guestfs-tools libvirt +$ sudo systemctl enable --now libvirtd +$ qemu-img resize Fedora-Cloud-Base-36-1.5.x86_64.qcow2 +20G +$ cp Fedora-Cloud-Base-36-1.5.x86_64.qcow2.bak Fedora-Cloud-Base-36-1.5.x86_64.qcow2 +$ LIBGUESTFS_BACKEND=direct sudo -E virt-resize --resize /dev/sda5=+1G Fedora-Cloud-Base-36-1.5.x86_64.qcow2 Fedora-Cloud-Base-36-1.5.x86_64.2.qcow2 +$ qemu-system-x86_64 -no-reboot -smp cpus=2 -m 4096M -enable-kvm -nographic -cpu host -drive file=/home/pdxjohnny/Fedora-Cloud-Base-36-1.5.x86_64.2.qcow2,if=v2 +SeaBIOS (version 1.16.0-1.fc36) + + +iPXE (https://ipxe.org) 00:03.0 CA00 PCI2.10 PnP PMM+BFF8C110+BFECC110 CA00 + + + +Booting from Hard Disk... +GRUB loading. +Welcome to GRUB! + + GNU GRUB version 2.06 + + ┌────────────────────────────────────────────────────────────────────────────┐ + │*Fedora Linux (5.17.5-300.fc36.x86_64) 36 (Cloud Edition) │ +``` + +- Still seeing issues with bad superblocks +- https://gist.github.com/pdxjohnny/6063d1893c292d1ac0024fb14d1e627d + +``` +e2fsck: Bad magic number in super-block while trying to open /dev/nbd1p5 +/dev/nbd1p5: +The superblock could not be read or does not describe a valid ext2/ext3/ext4 +filesystem. 
If the device is valid and it really contains an ext2/ext3/ext4 +filesystem (and not swap or ufs or something else), then the superblock +is corrupt, and you might try running e2fsck with an alternate superblock: + e2fsck -b 8193 + or + e2fsck -b 32768 + +``` + +- New new approach, packer: https://www.packer.io/downloads + - https://www.packer.io/plugins/builders/openstack + - https://www.packer.io/plugins/builders/digitalocean + - https://www.packer.io/plugins/builders/qemu + - https://www.packer.io/plugins/datasources/git/commit + - Manifest + - https://www.packer.io/plugins/builders/digitalocean#user_data + - https://gist.github.com/pdxjohnny/a0dc3a58b4651dc3761bee65a198a80d#file-run-vm-sh-L156-L205 + - Enable github actions on boot via systemd here +- https://docs.github.com/en/packages/working-with-a-github-packages-registry/working-with-the-container-registry +- https://gist.github.com/nickjj/d63d1e0ee71f4226ac5000bf1022bb38 +- https://gist.github.com/pdxjohnny/5f358e749181fac74a750a3d00a74b9e + +**osdecentralice.json** + +```json +{ + "variables": { + "version": "latest", + "do_token": "{{env `DIGITALOCEAN_TOKEN`}}" + }, + "builders": [ + { + "type": "digitalocean", + "api_token": "{{user `do_token`}}", + "image": "fedora-36-x64", + "region": "sfo3", + "size": "m3-2vcpu-16gb", + "ssh_username": "root", + "droplet_name": "osdecentralice-{{user `version`}}", + "snapshot_name": "osdecentralice-{{user `version`}}-{{timestamp}}" + } + ], + "provisioners": [ + { + "type": "shell", + "inline": [ + "set -x", + "set -e", + "dnf upgrade -y", + "dnf install -y podman", + "curl -sfLC - -o Dockerfile https://gist.github.com/pdxjohnny/5f358e749181fac74a750a3d00a74b9e/raw/f93d3831f94f58751d85f71e8e266f6020042323/Dockerfile", + "sha256sum -c -<<<'b5f31acb1ca47c55429cc173e08820af4a19a32685c5e6c2b1459249c517cbb5 Dockerfile'", + "podman build -t osdecentralice:latest - < Dockerfile", + "container=$(podman run --rm -d --entrypoint tail osdecentralice -F /dev/null);", + "trap 
\"podman kill ${container}\" EXIT", + "sleep 1", + "podman cp \"${container}:/\" /wolfi" + ] + } + ] +} +``` + +```console +$ sudo -E packer build osdecentralice.json +``` + +![image](https://user-images.githubusercontent.com/5950433/195759634-4493d348-fb66-41ba-a531-330e7e5662c7.png) + +```console + digitalocean: --> 7b72b288ae3 + digitalocean: [2/2] STEP 8/8: ENTRYPOINT bash + digitalocean: [2/2] COMMIT osdecentralice:latest + digitalocean: --> 919ae809e98 + digitalocean: Successfully tagged localhost/osdecentralice:latest + digitalocean: 919ae809e9841893f046cd49950c4515b04bb24db5d87f1de52168275860ebec +==> digitalocean: ++ podman run --rm -d --entrypoint tail osdecentralice -F /dev/null +==> digitalocean: + container=0c0d3ad9125c981aff17b78ee38c539229b444e546a4e346bc1f86d7ca0480fb +==> digitalocean: + trap 'podman kill 0c0d3ad9125c981aff17b78ee38c539229b444e546a4e346bc1f86d7ca0480fb' EXIT +==> digitalocean: + sleep 1 +==> digitalocean: + podman cp 0c0d3ad9125c981aff17b78ee38c539229b444e546a4e346bc1f86d7ca0480fb:/ /wolfi +==> digitalocean: + podman kill 0c0d3ad9125c981aff17b78ee38c539229b444e546a4e346bc1f86d7ca0480fb + digitalocean: 0c0d3ad9125c981aff17b78ee38c539229b444e546a4e346bc1f86d7ca0480fb +==> digitalocean: Gracefully shutting down droplet... +==> digitalocean: Creating snapshot: osdecentralice-latest-1665722921 +==> digitalocean: Waiting for snapshot to complete... +==> digitalocean: Destroying droplet... +==> digitalocean: Deleting temporary ssh key... +Build 'digitalocean' finished after 10 minutes 12 seconds. + +==> Wait completed after 10 minutes 12 seconds + +==> Builds finished. 
The artifacts of successful builds are: +--> digitalocean: A snapshot was created: 'osdecentralice-latest-1665722921' (ID: 118836442) in regions 'sfo3' +++ history -a +pdxjohnny@fedora-s-4vcpu-8gb-sfo3-01 ~ $ +``` + +![image](https://user-images.githubusercontent.com/5950433/195765976-fe432d96-b2ca-4a10-a595-b82acaf0f463.png) + +- Now to install github actions runner in wolfi, and configure systemd to auto-start it. + - Ideally we figure out how to deploy a bunch of these, terraform? + - They need to be ephemeral and shut down after each job + - Threat vector: Compromise by a threat actor results in the system not triggering shutdown. + - Mitigation: Reap out of band + +![image](https://user-images.githubusercontent.com/5950433/195766172-7898c5ce-de9a-48cc-a2d4-331a7e614dd3.png) + +```console
[root@osdecentralice-latest-1665722921-s-4vcpu-8gb-sfo3-01 ~]# chroot /wolfi /usr/bin/python
Python 3.10.7 (main, Jan 1 1970, 00:00:00) [GCC 12.2.0] on linux
Type "help", "copyright", "credits" or "license" for more information.
>>> import pathlib
>>> print(pathlib.Path("/etc/os-release").read_text())
ID=wolfi
NAME="Wolfi"
PRETTY_NAME="Wolfi"
VERSION_ID="20220913"
HOME_URL="https://wolfi.dev"

>>>
``` + +[![asciicast](https://asciinema.org/a/528221.svg)](https://asciinema.org/a/528221) + +[![asciicast](https://asciinema.org/a/528220.svg)](https://asciinema.org/a/528220) + +[![asciicast](https://asciinema.org/a/528223.svg)](https://asciinema.org/a/528223) diff --git a/docs/discussions/alice_engineering_comms/0054/reply_0001.md b/docs/discussions/alice_engineering_comms/0054/reply_0001.md new file mode 100644 index 0000000000..783ec5bbce --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0054/reply_0001.md @@ -0,0 +1,23 @@ +## 2022-10-13 IETF SCITT Technical Meeting + +- WG Chartered!
+ - https://mailarchive.ietf.org/arch/msg/scitt/OsUTPGEUUVQGxcU1J8UostNs1iM/ + - https://datatracker.ietf.org/doc/charter-ietf-scitt/ + - https://vocabulary.transmute.industries/ +- Semantic Versioning + - Ray would like to see this included in the software use case. + - Policy around update + - https://github.com/intel/dffml/tree/alice/docs/tutorials/rolling_alice/0000_architecting_alice +- Facilitate post-instance-creation labeling + - Notary adds to transparency infrastructure at a later point, how do we ensure others have access to that? + - They should go query those notaries or require up to date receipts from them. +- We don't care so much about what's in the SBOM, it's just data +- There may be many SBOMs for a single release of software, they could be inserted by multiple notaries using different scanner implementations. +- Trust graphs constructed at a later date + - Orie Steele (Transmute): + - 'In our world, these are “graph queries”... the graphs are built from the registry data. joined with other data. I don't see SCITT as solving for graph queries… it just provides a data set that is projected into the graph' +- Can't we just always use a receipt to auth?
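
The receipt-for-auth question above can be sketched in miniature: a relying party admits an artifact only if it carries a valid registry receipt over the artifact's digest. Real SCITT receipts are COSE countersignatures from a transparency service; the HMAC stand-in and the `issue_receipt`/`admit` names below are illustrative assumptions, not the SCITT wire format.

```python
import hashlib
import hmac

# Stand-in for the transparency registry's signing key; a real SCITT
# registry would countersign with an asymmetric key, not an HMAC.
REGISTRY_KEY = b"registry-secret"

def issue_receipt(artifact: bytes) -> bytes:
    """Registry side: countersign the artifact digest (HMAC stand-in)."""
    digest = hashlib.sha256(artifact).digest()
    return hmac.new(REGISTRY_KEY, digest, hashlib.sha256).digest()

def admit(artifact: bytes, receipt: bytes) -> bool:
    """Relying party: accept only artifacts carrying a valid receipt."""
    digest = hashlib.sha256(artifact).digest()
    expected = hmac.new(REGISTRY_KEY, digest, hashlib.sha256).digest()
    return hmac.compare_digest(receipt, expected)

sbom = b'{"name": "alice", "version": "0.0.1"}'
receipt = issue_receipt(sbom)
print(admit(sbom, receipt))          # True
print(admit(b"tampered", receipt))   # False
```

Swapping the HMAC for verification of a COSE countersignature against the registry's published key would turn this stand-in into the actual SCITT pattern.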
+ +Source: https://github.com/ietf-scitt/scitt-web/blob/main/content/what-is-scitt.md + +![scii-persistance](https://github.com/ietf-scitt/scitt-web/raw/main/content/media/scitt-persistence.png) \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0054/reply_0002.md b/docs/discussions/alice_engineering_comms/0054/reply_0002.md new file mode 100644 index 0000000000..ddcd9df081 --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0054/reply_0002.md @@ -0,0 +1,18 @@ +## 2022-10-13 @pdxjohnny Engineering Logs + +- https://github.com/actions/runner/compare/main...fgalind1:runner:k8s-support +- https://github.com/uor-community/ai-model-registry + - https://gist.github.com/usrbinkat/761d8f2f4da018d861451aff45b2cde7 + - https://universalreference.io/docs/intro + - This is aligned + - > Why would you want to link something like web pages or any content via attributes? +This might seem arbitrary at first glance, but it is a fundamental concept in human cognition. We describe a table to another person via its attributes i.e. Dark wood, 18x2in rectangular legs, round top... If we’ve been precise enough in our description, another person would be able to pick that table out of a showroom of tables. UOR takes this concept and applies it to everything. We can then train AI models on a uniformly formatted internet containing contextually linked data. + - https://www.mdpi.com/2504-2289/5/4/56/htm + - > With the rapid development of 5G communications, enhanced mobile broadband, massive machine type communications and ultra-reliable low latency communications are widely supported. However, a 5G communication system is still based on Shannon’s information theory, while the meaning and value of information itself are not taken into account in the process of transmission. Therefore, it is difficult to meet the requirements of intelligence, customization, and value transmission of 6G networks. 
In order to solve the above challenges, we propose a 6G mailbox theory, namely a cognitive information carrier to enable distributed algorithm embedding for intelligence networking. Based on Mailbox, a 6G network will form an intelligent agent with self-organization, self-learning, self-adaptation, and continuous evolution capabilities. With the intelligent agent, redundant transmission of data can be reduced while the value transmission of information can be improved. Then, the features of mailbox principle are introduced, including polarity, traceability, dynamics, convergence, figurability, and dependence. Furthermore, key technologies with which value transmission of information can be realized are introduced, including knowledge graph, distributed learning, and blockchain. Finally, we establish a cognitive communication system assisted by deep learning. The experimental results show that, compared with a traditional communication system, our communication system performs less data transmission quantity and error. +- https://github.com/chainguard-dev/apko + - container build pipelines but with manifests for apko +- TODO + - [ ] https://universalreference.io/docs/Quick%20Start/intro#publishing-a-collection + - Related: #1207 + - https://github.com/uor-framework/uor-client-go#build-a-schema-into-an-artifact + - Possibly build schema for inputs to containers as manifests embedded / mapped to CLI or config format?
\ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0055/index.md b/docs/discussions/alice_engineering_comms/0055/index.md new file mode 100644 index 0000000000..d9152e1d4d --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0055/index.md @@ -0,0 +1 @@ +# 2022-10-14 Engineering Logs \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0055/reply_0000.md b/docs/discussions/alice_engineering_comms/0055/reply_0000.md new file mode 100644 index 0000000000..bdc0e9134e --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0055/reply_0000.md @@ -0,0 +1,14 @@ +## 2022-10-14 @pdxjohnny Engineering Logs + +- Alice helps you understand what your software is EATing, what’s the health of its software supply chain (food as the biological supply chain). You are what you EAT and your software is its development health! You get out what you put in lifecycle wise. +- https://github.com/ossf/scorecard/blob/main/docs/checks.md +- https://gist.github.com/pdxjohnny/f56e73b82c1ea24e1e7d6b995a566984 +- https://github.com/sigstore/gitsign#environment-variables + - > Env var | | | | + > -- | -- | -- | -- + > GITSIGN_FULCIO_URL | ✅ | https://fulcio.sigstore.dev | Address of Fulcio server + > GITSIGN_LOG | ❌ |   | Path to log status output. Helpful for debugging when no TTY is available in the environment. 
+ > GITSIGN_OIDC_CLIENT_ID | ✅ | sigstore | OIDC client ID for application + > GITSIGN_OIDC_ISSUER | ✅ | https://oauth2.sigstore.dev/auth | OIDC provider to be used to issue ID token + > GITSIGN_OIDC_REDIRECT_URL | ✅ |   | OIDC Redirect URL + > GITSIGN_REKOR_URL | ✅ | https://rekor.sigstore.dev | Address of Rekor server \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0056/index.md b/docs/discussions/alice_engineering_comms/0056/index.md new file mode 100644 index 0000000000..24a5dcc2ef --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0056/index.md @@ -0,0 +1,40 @@ +# 2022-10-15 Engineering Logs + +- http://blockexplorer.graft.network/ +- Async Comms + - Examples + - At 07:34 -7 UTC @pdxjohnny started drafting the tutorial: `Rolling Alice: Coach Alice: You are what you EAT!` + - Others with the GitHub discussions thread loaded in their browser (at least on desktop) will see updates soon after he edits comments and replies in the thread. + - Possible aligned tutorial sketch follows: `Rolling Alice: Architecting Alice: Thought Communication Protocol Case Study: DFFML` + - We will combine GitHub Actions on discussion edit trigger with [`scripts/dump_discussion.py`](https://github.com/intel/dffml/blob/ed4d806cf2988793745905578a0adc1b02e7eeb6/scripts/dump_discussion.py) + - We will replicate this data to DIDs and run DWN `serviceEndpoint` s as needed. + - system context as service endpoint or executed locally if sandboxing / orchestrator policy permits. 
+ - See early architecting Alice Engineering Log lossy cached streams of consciousness for more detail + - https://www.youtube.com/playlist?list=PLtzAOVTpO2jaHsS4o-sDzDyHEug-1KRbK + - We will attest data using reusable workflows, OIDC, and sigstore + - We will run more rekor / fulcio instances + - We will network via webrtc and DERP + - We will write orchestration operations / data flows / overlays and use data flow as class to leverage them via double context entry pattern (or some other way to do that). + - We will see the same effect, but in a more DID based way with abstract implementation / infra + - This will be mentioned as being a follow on to the tutorial: `Rolling Alice: Architecting Alice: Stream of Consciousness` + - https://github.com/intel/dffml/blob/alice/docs/tutorials/rolling_alice/0000_architecting_alice/0005_stream_of_consciousness.md + - Alice will filter by updates relevant to the downstream receiver of events based on their current state, context, etc. + - https://twitter.com/SergioRocks/status/1580545209678454784 + - > !["Because Jade had more uninterrupted Deep Work time than Brayan. Those 4 interruptions that Brayan suffered amounted for an actual loss of 3 hours of productive work on the tasks assigned to him." Sergio Pereira](https://pbs.twimg.com/media/Fe85fdaXgAEhe4_?format=png) + - She will notify or etc. as appropriate based off prioritizer's thoughts on + - **TODO** implement the prioritizer concept as another tutorial + - Similar to "Bob Online" or "Alice Online" message from webhook based tutorial but ran through data flow / overlayed logic to determine relevance and what to do / say. Also it's now including Decentralized Web Nodes and DIDs. 
Possible next step / future in this (aligned clusters) train of thought would be: + - KERI encapsulation over arbitrary channels + - NLP to summarize git log changes + - Hook up to git log + - CI integration to serialize to sensible information format + - Eventually Alice will be able to tell us whatever we want to know. + - In the future (current date 2022-10-15), when you want to know something + about Alice, she'll be able to tell you, because she knows about her + own codebase, and she has solid foundations for security and trust and + alignment with your strategic principles / values. She's a trustworthy + messenger, the Ghost in the shell. + - See discussion thread (or the thread dump in `docs/arch/alice/discussion`) + - https://github.com/intel/dffml/tree/alice/docs/arch/alice/discussion + - `$ git log -p --reverse -p -- docs/arch/alice/discussion` + - https://github.com/intel/dffml/discussions/1369 \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0056/reply_0000.md b/docs/discussions/alice_engineering_comms/0056/reply_0000.md new file mode 100644 index 0000000000..10a6ce6e89 --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0056/reply_0000.md @@ -0,0 +1,30 @@ +# Rolling Alice: Coach Alice: You are what you EAT! + +Alice helps you understand what your software is EATing, what's +the health of its software supply chain (food as the biological supply +chain). You are what you EAT and your software is its development health! +You get out what you put in, lifecycle-wise. + +Alice is our software developer coach. She helps us help ourselves. +If Alice were coaching us on being a healthier person, she would tell +us to look at our digestion! When building software, measuring the +health of our digestion is aligned with measuring our progress towards +reaching critical velocity.
+ +In this tutorial we'll follow on to the Down the Dependency Rabbit Hole +Again tutorial and get deeper into seeing the lifecycle of the project +and its health as critical to the security of the project. We'll +treat the health of the lifecycle as an asset to be protected in our +threat model `alice threats` / `THREATS.md`. + +- References + - https://github.com/johnlwhiteman/living-threat-models + - https://github.com/intel/dffml/blob/alice/docs/tutorials/rolling_alice/0001_coach_alice/0001_down_the_dependency_rabbit_hole_again.md + - https://cloud.google.com/blog/products/devops-sre/dora-2022-accelerate-state-of-devops-report-now-out + - DORA metrics + - Culture + - happiness == good + - **TODO** find link about happiness in article based off the 2022 DORA report results + - https://www.gutenberg.org/files/11/11-h/11-h.htm + - https://colab.research.google.com/drive/1gol0M611zXP6Zpggfri-fG8JDdpMEpsI + - Trying to generate images for this tutorial using the public domain images from the original Alice's Adventures in Wonderland as overlays (img2img)...
\ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0056/reply_0001.md b/docs/discussions/alice_engineering_comms/0056/reply_0001.md new file mode 100644 index 0000000000..8b9329f437 --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0056/reply_0001.md @@ -0,0 +1,14 @@ +## 2022-10-15 @pdxjohnny Engineering Logs + +- Wolfi + - https://edu.chainguard.dev/open-source/apko/overview/ +- Packer + - https://www.packer.io/docs/post-processors/manifest +- https://github.com/intel/dffml/issues/1334 +- Vol 6: Happy happy joy joy + - Positive thinking + - Document one up and one down +- Vol 6: intro: Then it’s a wonderful dream + - Sequence similar to Peace at Last + - Alice: “Maybe it’s a dream?” + - “Then it’s a wonderful dream” \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0057/index.md b/docs/discussions/alice_engineering_comms/0057/index.md new file mode 100644 index 0000000000..9e75aed2b2 --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0057/index.md @@ -0,0 +1 @@ +# 2022-10-16 Engineering Logs \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0057/reply_0000.md b/docs/discussions/alice_engineering_comms/0057/reply_0000.md new file mode 100644 index 0000000000..afda47fab3 --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0057/reply_0000.md @@ -0,0 +1,33 @@ +- stable diffusion + - https://github.com/divamgupta/stable-diffusion-tensorflow/pull/50/files +- reinforcement learning + - https://arxiv.org/abs/1903.00714 + - RL for supply chain + - https://github.com/facebookresearch/mvfst-rl + - > mvfst-rl is a framework for network congestion control in the QUIC transport protocol that leverages state-of-the-art in asynchronous Reinforcement Learning training with off-policy correction. 
+- GitHub Actions + - https://github.com/GoogleContainerTools/kaniko#running-kaniko-in-docker + - See if updating the `build_images_containers.yml` works if we add these volume mounts and so forth. + - There may have been an OCI image issue. Maybe we can rebuild and push in docker format? + - Let's just switch to podman or docker instead of kaniko because we know that works on actions +- Container Registry + - Provide on-demand image builds where final layers are just added statically + - https://github.com/ImJasonH/kontain.me + - https://github.com/google/go-containerregistry/blob/a0f66878d01286cac42d99fb45e3b335710c00a5/pkg/v1/random/image.go + - These layers then have their SBOM added where they have provenance as the data provenance for the addition of the layer + - Then we have content addressability and SBOM and provenance from sigstore etc. via existing registry interoperability tooling + - Compute contracts can be issued by having the pull from the registry be authed by verifiable credential + - Registry releases content-addressable and verifiable with SCITT receipt of release (because data might be sensitive, need confirmed release in case of need to revoke / roll keys) +- Created DigitalOcean space data.nahdig.com + - data.nahdig.com is for data with suspect provenance + - No `.` in any names in DO spaces! Certs will fail!
+ - We have taken no steps to think about hardening on OS DecentrAlice yet within context of scanning + - We should assume VM compromise, aka, data is best effort + - Hence nahdig + - Data from systems with provenance and hardening will be served from data.chadig.com + - https://nahdig.sfo3.cdn.digitaloceanspaces.com/ + - https://nahdig.sfo3.digitaloceanspaces.com/ + - https://data.nahdig.com/ + - `contribute.shouldi.alice.data.nahdig.com` + +![create-digitalocean-space-data.nahdig.com](https://user-images.githubusercontent.com/5950433/196057425-a8b74ec5-9c24-42d3-8693-373a61be5d13.png) diff --git a/docs/discussions/alice_engineering_comms/0058/index.md b/docs/discussions/alice_engineering_comms/0058/index.md new file mode 100644 index 0000000000..ae9e42b269 --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0058/index.md @@ -0,0 +1,20 @@ +# 2022-10-17 Engineering Logs + +- https://github.com/m00sey/canis +- https://github.com/ioflo +- https://github.com/decentralized-identity/keri/blob/master/kids/kid0003.md +- https://github.com/build-trust/ockam + - > trust for data + - https://github.com/build-trust/ockam/tree/develop/documentation/use-cases/end-to-end-encrypt-all-application-layer-communication#readme +- https://github.com/WebOfTrust/keri-dht-py + - ~~Try spinning this up~~ outdated + - https://github.com/WebOfTrust/keri + - Process side note: We could communicate with Alice by having her post a discussion comment reply and then edit it to include instructions, she then fills reply with work / (sub) list items with her summary of progress/ results +- https://github.com/ioflo/hio +- TODO + - [ ] Docker and ghcr builds and packer do build + - [ ] Infra DO automation as operations executed in preapply? 
Of k8s job orchestrator + - [ ] Deploy k3s by default in vm os image + - [ ] Run actions runner controller on VMs + - [ ] Run scan from github actions self hosted DO backed + - [ ] Crawler to find repos \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0058/reply_0000.md b/docs/discussions/alice_engineering_comms/0058/reply_0000.md new file mode 100644 index 0000000000..f2582a3aec --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0058/reply_0000.md @@ -0,0 +1,150 @@ +## 2022-10-17 @pdxjohnny Engineering Logs + +- https://w3c.github.io/dpv/dpv/ +- https://github.com/GLEIF-IT/sally +- https://github.com/comunica/comunica/tree/master/engines/query-sparql#readme + - https://www.w3.org/TR/sparql11-update/ + - Could be used during tbDEX negotiation of compute contract +- https://ruben.verborgh.org/blog/2018/12/28/designing-a-linked-data-developer-experience/ + - https://comunica.github.io/Article-ISWC2018-Demo-GraphQlLD/ + - https://comunica.github.io/Article-ISWC2018-Resource/ + - > Local and remote dataset dumps in RDF serializations + - https://ontola.io/blog/rdf-serialization-formats/#tldr + - https://comunica.dev/research/link_traversal/ + - https://comunica.github.io/comunica-feature-link-traversal-web-clients/builds/solid-prov-sources/#transientDatasources=https%3A%2F%2Fwww.rubensworks.net%2F + - could post cached serializations to github pages to update as CMS + - Could extend to execute data flows on resolution (hitting an endpoint) + - Need to figure out how to serialize, will analyze data from demos to look for patterns in links and resolvable URLs + - Will try to use localhost.run and Python's builtin http.server to query data + - Stand up query server if necessary + - Wget mirror to cache everything or something like that + - Then need to figure out sigstore / rekor provenance + - http://videolectures.net/iswc2014_verborgh_querying_datasets/ + - https://github.com/rdfjs/comunica-browser + -
https://github.com/LinkedDataFragments/Server.js/blob/6bdb7f4af0af003213c4765065961ca77594aa63/packages/datasource-sparql/lib/datasources/SparqlDatasource.js#L31-L76 +- Cloud Development Environments + - https://github.com/coder/coder/tree/main/examples/templates/do-linux + - https://github.com/nestybox/sysbox + - https://coder.com/docs/coder-oss/latest/templates/change-management + - https://coder.com/docs/coder-oss/latest/secrets#dynamic-secrets + - > Dynamic secrets are attached to the workspace lifecycle and automatically injected into the workspace. With a little bit of up front template work, they make life simpler for both the end user and the security team. This method is limited to [services with Terraform providers](https://registry.terraform.io/browse/providers), which excludes obscure API providers. + - https://coder.com/docs/coder-oss/latest/admin/automation + - Example uses https://registry.terraform.io/providers/RJPearson94/twilio/latest/docs/resources/iam_api_key + - https://github.com/RJPearson94/terraform-provider-twilio/blob/07460ebdef45d59a52eef13f8bdb9ff0a7219c83/twilio/provider.go#L46 + - > `Sensitive: true,` + - https://github.com/RJPearson94/terraform-provider-twilio/blob/61b96f0beb6e5827037ddf2db7b160b52df7c666/examples/credentials/aws/outputs.tf + - https://github.com/hashicorp/terraform-provider-external/blob/1aff6be074b053de5cc86ca3dc5cac122e8cedcd/internal/provider/test-programs/tf-acc-external-data-source/main.go#L34-L37 + - https://www.terraform.io/language/functions/sensitive + - https://coder.com/docs/coder-oss/latest/dotfiles + - https://coder.com/docs/coder-oss/latest/templates#parameters + - https://registry.terraform.io/providers/hashicorp/external/latest/docs/data-sources/data_source + - Store secrets in GitHub + - Run workflow + - Network with DERP + - Start callback endpoint on port 0 for random port (`dffml-service-http`) + - https://pkg.go.dev/tailscale.com/derp + - > Package derp implements the Designated Encrypted Relay 
for Packets (DERP) protocol. DERP routes packets to clients using *curve25519* keys as addresses. DERP is used by Tailscale nodes to proxy encrypted WireGuard packets through the Tailscale cloud servers when a direct path cannot be found or opened. DERP is a last resort. Both sides between very aggressive NATs, firewalls, no IPv6, etc? Well, DERP. + - Send back secrets and OIDC token to callback endpoint using public key provided as input (TODO KERI) +- Web UI Testing + - https://github.com/mobile-dev-inc/maestro +- DID + - https://github.com/orgs/w3c/repositories?language=&q=did&sort=&type=all + - https://w3c.github.io/did-imp-guide/ + - https://github.com/w3c/did-spec-registries/compare/main...pdxjohnny:did-spec-registries:open-architecture-and-alice + - Need to understand if this is appropriate + - Goal: Define how DID operations could be used to execute the content addressable contracts + - See kontain.me references and notes towards bottom of today's engineering logs + - `did:alice:sha256:01` + - https://identity.foundation/keri/did_methods/ + - https://w3c.github.io/did-rubric/ + +### DID Method Registration + +As a DID method registrant, I have ensured that my DID method registration complies with the following statements: + +- [ ] The DID Method specification [defines the DID Method Syntax](https://w3c.github.io/did-core/#method-syntax). +- [ ] The DID Method specification [defines the Create, Read, Update, and Deactivate DID Method Operations](https://w3c.github.io/did-core/#method-operations). +- [ ] The DID Method specification [contains a Security Considerations section](https://w3c.github.io/did-core/#security-requirements). +- [ ] The DID Method specification [contains a Privacy Considerations section](https://w3c.github.io/did-core/#privacy-requirements). +- [ ] The JSON file I am submitting has [passed all automated validation tests below](#partial-pull-merging). +- [x] The JSON file contains a `contactEmail` address [OPTIONAL].
+- [x] The JSON file contains a `verifiableDataRegistry` entry [OPTIONAL].
+  - There will be a registry, but primarily our goal is to enable sandboxed distributed compute
+
+---
+
+- DFFML
+  - Write operations, use octx.ictx directly:
+    - memory_input_network_input_context_to_dict
+      - dict_to_json
+      - dict_to_did_serialized
+        - Takes Credential Manifest (and wallet ref?)
+    - memory_input_network_input_context_merge_from_dict
+      - dict_from_json
+      - dict_to_did_serialized
+        - Takes Credential Manifest? Or JSON-LD / graphql-ld or maybe just data flow to validate verifiable credentials needed are present (and wallet ref?)
+          - https://w3c.github.io/did-rubric/
+    - memory_input_network_serve_strawberry_graphql
+      - graphql_query
+    - watch_for_compute_contracts
+      - Watch stream of consciousness for new compute contracts read / verify via container image on demand registry
+  - Eventually overlay for input network and associated operations to keep more performant series snapshot data. `List[memory_input_network_input_context_to_dict.outputs.result]` for each change to the input network. Enables rollback to any point as cached state or modification throughout.
+- Kubernetes
+  - https://k3s.io/
+  - https://github.com/k3s-io/k3s/releases/tag/v1.25.2%2Bk3s1
+  - Add to OS DecentrAlice
+- apko
+  - https://github.com/chainguard-dev/apko/tree/main/examples
+- KCP
+  - https://github.com/kcp-dev/kcp
+  - > kcp is a Kubernetes-like control plane focusing on: A control plane for many independent, isolated "clusters" known as workspaces
+  - Great, this could satisfy our workspace manager component requirement within the abstract compute architecture.
+  - Add to OS DecentrAlice
+  - Need to figure out how to DWN network on boot and establish webrtc channels (or other channels).
+  - Need to figure out how to automate and make cluster config / discovery dynamic and transparent on each running user instance of OS DecentrAlice.
+  - Enable two use cases
+    - Automated deployment: autostart on boot, systemd config, UNIX socket for kcp
+    - End user on system: autostart on user login, systemd config, UNIX socket for kcp
+
+```mermaid
+graph TD
+  subgraph abstract_compute_architecture[Abstract Compute Architecture]
+    derp[DERP Server]
+    subgraph devenv[Developer Environment]
+      editor[Editor]
+      terminal[Terminal]
+      browser[Browser]
+    end
+    workspace_management[Workspace Management]
+    iasc[Infrastructure as Code]
+    osdecentralice[OS DecentrAlice]
+
+    editor --> |http2| derp
+    terminal --> |http2| derp
+    browser --> |http2| derp
+
+    derp --> workspace_management
+    workspace_management --> iasc
+
+    iasc --> kcp
+    kcp --> k3s
+    k3s --> osdecentralice
+
+    derp --> osdecentralice
+  end
+```
+
+  - https://github.com/kcp-dev/kcp/blob/main/docs/concepts.md
+  - https://github.com/kcp-dev/kcp/blob/main/docs/virtual-workspaces.md
+  - https://github.com/kcp-dev/kcp/blob/main/docs/content/en/main/concepts/workspaces.md
+    - > Multi-tenancy is implemented through workspaces. A workspace is a Kubernetes-cluster-like HTTPS endpoint, i.e. an endpoint usual Kubernetes client tooling (client-go, controller-runtime and others) and user interfaces (kubectl, helm, web console, ...) can talk to like to a Kubernetes cluster.
+- Downstream validation / stream of consciousness tutorial part
+  - Automating an entity's post to the daily engineering logs
+  - Via receipt of a downstream event and trigger of a graphql comment reply addition to the thread.
+- TODO
+  - [ ] SECURITY Check KCP hard/soft multi-tenancy threat model info or ascertain if not present.
\ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0059/index.md b/docs/discussions/alice_engineering_comms/0059/index.md new file mode 100644 index 0000000000..9c5a4ef8cd --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0059/index.md @@ -0,0 +1 @@ +# 2022-10-18 Engineering Logs \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0059/reply_0000.md b/docs/discussions/alice_engineering_comms/0059/reply_0000.md new file mode 100644 index 0000000000..4f37190827 --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0059/reply_0000.md @@ -0,0 +1,437 @@ +## 2022-10-18 @pdxjohnny Engineering Logs + +- https://github.com/OR13/didme.me + - Goal: Connect this to our content addressable (container images) compute contract stuff + - Seeing ipfs project id errors on did resolution in deployed demo + - Cloning to see what's up... + - https://classic.yarnpkg.com/en/docs/install#centos-stable + - https://github.com/transmute-industries/verifiable-actions + - https://lucid.did.cards/identifiers/did:key:z6MkrJx9cCCpu7D1Scy7QovGeWShHzfSPHJXxNq5TwbZzkRF + - https://api.did.actor/v/eJylkluTmjAAhf9L9lVuQRHy1PVa1hV1FW-dTidA0CgCkiDqjv-9Qbfu5a3tDC-QMyfnfIdX8M1PYk6OHKAfYM15ypCiFEUhF7qcZCsFqpqp-BkJSMwpjphy0EDlXajT4Co7-FJGDomPOU1iKaKMS1CFaqn-WQE0AAjkWYzynAZIgx7ENQNKej2sSlWz6kkWVLFkaoFp-pDU61gXd_BTSspQU5LRkGIvIs17jKspYznJhHEgPLfkhM5Gf8vp-LzvWNt9EbjmdBs0esea0V6c52G6ip2h-9g9xyn1HToLY3DzwLFPWpiLu4Aoq0qqJp6JZiGoI1hdCtV7_THHPGcAvd4q_cGAUyqLFDL2eZIpX0AwRZM3LIkf1Hsp8HKXPAtFSerNuQKyT0d2HJAjQOrX7x9Q_GUMcPlUKPc2xOf3RiVLcsS7NCJiJ70Up1mShKXgLXs7gLWaZo3pKhaRM1L-ITdIAmJwpQihJEBq5gRCpBqoapYUD9cdb4n6hK-T4D-2e_iHsa9FhnmWJqzsgRkj2YcwFbApxLSAnJ7WXtenA_rUWbZfJqOxzeydDZ2mbSx3HeZDV7w7Jzwf0UHE6GKzUO1Is2R519nXnPHam-xC95dUJaQlhaee1js0u43mqWX2oNtsuI5r9fXFLHkeH767Z1Lf14c5nLoR59Cw54O6ZzT0Ge8VZDRtF_PHEbhcfgMlFDfZ + - https://stackoverflow.com/questions/69692842/error-message-error0308010cdigital-envelope-routinesunsupported + 
+[![use-the-source](https://img.shields.io/badge/use%20the-source-blueviolet)](https://github.com/intel/dffml/blob/alice/docs/tutorials/rolling_alice/0000_easter_eggs.md#use-the-source-) + +```console +$ git clone https://github.com/OR13/didme.me +$ yarn install +$ yarn start +failure... +$ npx next +ready - started server on 0.0.0.0:3000, url: http://localhost:3000 +info - Using webpack 4 in Next.js is deprecated. Please upgrade to using webpack 5: https://nextjs.org/docs/messages/webpack5 +warn - You have enabled experimental feature(s). +warn - Experimental features are not covered by semver, and may cause unexpected or broken application behavior. Use them at your own risk. + +node:internal/crypto/hash:71 + this[kHandle] = new _Hash(algorithm, xofLen); + ^ + +Error: error:0308010C:digital envelope routines::unsupported + at new Hash (node:internal/crypto/hash:71:19) + at Object.createHash (node:crypto:133:10) + at module.exports.__webpack_modules__.18768.module.exports (/home/pdxjohnny/didme.me/node_modules/next/dist/compiled/webpack/bundle4.js:78057:62) + at NormalModule._initBuildHash (/home/pdxjohnny/didme.me/node_modules/next/dist/compiled/webpack/bundle4.js:51469:16) + at handleParseError (/home/pdxjohnny/didme.me/node_modules/next/dist/compiled/webpack/bundle4.js:51523:10) + at /home/pdxjohnny/didme.me/node_modules/next/dist/compiled/webpack/bundle4.js:51555:5 + at /home/pdxjohnny/didme.me/node_modules/next/dist/compiled/webpack/bundle4.js:51410:12 + at /home/pdxjohnny/didme.me/node_modules/next/dist/compiled/webpack/bundle4.js:20871:3 + at iterateNormalLoaders (/home/pdxjohnny/didme.me/node_modules/next/dist/compiled/webpack/bundle4.js:20712:10) + at Array. 
(/home/pdxjohnny/didme.me/node_modules/next/dist/compiled/webpack/bundle4.js:20703:4) { + opensslErrorStack: [ 'error:03000086:digital envelope routines::initialization error' ], + library: 'digital envelope routines', + reason: 'unsupported', + code: 'ERR_OSSL_EVP_UNSUPPORTED' +} + +Node.js v18.11.0 +pdxjohnny@fedora-s-4vcpu-8gb-sfo3-01 didme.me $ npx next --help^C +pdxjohnny@fedora-s-4vcpu-8gb-sfo3-01 didme.me $ +pdxjohnny@fedora-s-4vcpu-8gb-sfo3-01 didme.me $ NODE_OPTIONS=--openssl-legacy-provider npx next +ready - started server on 0.0.0.0:3000, url: http://localhost:3000 +info - Using webpack 4 in Next.js is deprecated. Please upgrade to using webpack 5: https://nextjs.org/docs/messages/webpack5 +warn - You have enabled experimental feature(s). +warn - Experimental features are not covered by semver, and may cause unexpected or broken application behavior. Use them at your own risk. + +event - compiled successfully +Attention: Next.js now collects completely anonymous telemetry regarding usage. +This information is used to shape Next.js' roadmap and prioritize features. 
+You can learn more, including how to opt-out if you'd not like to participate in this anonymous program, by visiting the following URL: +https://nextjs.org/telemetry + + +``` + +- Live at http://pdxjohnny.devbox.nahdig.com:3000/ + +![image](https://user-images.githubusercontent.com/5950433/196558275-ab6e59fb-3e75-44d0-abac-296167b62628.png) + +- Same error, but with traceback popup modal + +``` +Unhandled Runtime Error +HTTPError: project id required + +Call Stack + +httperror: project id required +Object.errorHandler [as handleError] +node_modules/ipfs-http-client/src/lib/core.js (67:0) +async Client.fetch +node_modules/ipfs-utils/src/http.js (140:0) +async addAll +node_modules/ipfs-http-client/src/add-all.js (19:0) +async last +node_modules/it-last/index.js (13:0) +$ git grep ipfs-http-client +core/ipfs.ts:const ipfsHttpClient = require("ipfs-http-client"); +``` + +- Attempting to fix IPFS HTTP client code to auth to valid server +- References + - https://github.com/OR13/didme.me/blob/14da8e47d8a1a4bef3cc1c85968c9f8b6963d269/core/ipfs.ts + - https://infura.io/product/ipfs + - Requires API keys, can we run IPFS to HTTP API ourself? + - https://github.com/fission-codes/ipfs-cluster-aws + - https://duckduckgo.com/?q=ipfs+did&ia=web + - https://ipfscluster.io/documentation/deployment/ + - https://npm.devtool.tech/ipfs-did-document + - https://github.com/ipfs/js-ipfs/tree/master/packages/ipfs-http-client#readme + - https://github.com/ipfs-examples/js-ipfs-examples/tree/master#ipfs-or-ipfs-core + - https://github.com/ipfs/js-ipfs/tree/master/packages/ipfs-http-server +- Starting javascript ipfs-http-server + +```console +$ yarn add --dev ipfs ipfs-http-server +$ ./node_modules/.bin/jsipfs daemon --offline +Initializing IPFS daemon... 
+System version: x64/linux +Node.js version: 18.11.0 +Swarm listening on /ip4/127.0.0.1/tcp/4002/p2p/12D3KooWRunqtKfjPSHsF24iPdrxVQ2gnhBNtBMBKsz6zj6KoXTR +Swarm listening on /ip4/143.110.152.152/tcp/4002/p2p/12D3KooWRunqtKfjPSHsF24iPdrxVQ2gnhBNtBMBKsz6zj6KoXTR +Swarm listening on /ip4/10.48.0.5/tcp/4002/p2p/12D3KooWRunqtKfjPSHsF24iPdrxVQ2gnhBNtBMBKsz6zj6KoXTR +Swarm listening on /ip4/10.124.0.2/tcp/4002/p2p/12D3KooWRunqtKfjPSHsF24iPdrxVQ2gnhBNtBMBKsz6zj6KoXTR +Swarm listening on /ip4/10.88.0.1/tcp/4002/p2p/12D3KooWRunqtKfjPSHsF24iPdrxVQ2gnhBNtBMBKsz6zj6KoXTR +Swarm listening on /ip4/127.0.0.1/tcp/4003/ws/p2p/12D3KooWRunqtKfjPSHsF24iPdrxVQ2gnhBNtBMBKsz6zj6KoXTR +js-ipfs version: 0.16.1 +HTTP API listening on /ip4/127.0.0.1/tcp/5002/http +gRPC listening on /ip4/127.0.0.1/tcp/5003/ws +Gateway (read only) listening on /ip4/127.0.0.1/tcp/9090/http +Web UI available at http://127.0.0.1:5002/webui +Daemon is ready +(node:415890) ExperimentalWarning: The Fetch API is an experimental feature. This feature could change at any time +(Use `node --trace-warnings ...` to show where the warning was created) +``` + + +```console +$ ./node_modules/.bin/jsipfs cat /ipfs/QmRaaUwTNfwgFZpeUy8qrZwrp2dY4kCKmmB5xEqvH3vtD1/readme +(node:288039) ExperimentalWarning: The Fetch API is an experimental feature. This feature could change at any time +(Use `node --trace-warnings ...` to show where the warning was created) +Hello and Welcome to IPFS! + +██╗██████╗ ███████╗███████╗ +██║██╔══██╗██╔════╝██╔════╝ +██║██████╔╝█████╗ ███████╗ +██║██╔═══╝ ██╔══╝ ╚════██║ +██║██║ ██║ ███████║ +╚═╝╚═╝ ╚═╝ ╚══════╝ + +If you're seeing this, you have successfully installed +IPFS and are now interfacing with the ipfs merkledag! + + ------------------------------------------------------- +| Warning: | +| This is alpha software. Use at your own discretion! | +| Much is missing or lacking polish. There are bugs. | +| Not yet secure. Read the security notes for more. 
| + ------------------------------------------------------- + +Check out some of the other files in this directory: + + ./about + ./help + ./quick-start <-- usage examples + ./readme <-- this file + ./security-notes +``` + +- https://github.com/ipfs/js-ipfs/search?l=JavaScript&p=1&q=js-ipfs+version + - https://github.com/ipfs/js-ipfs/blob/74aee8b3d78f233c3199a3e9a6c0ac628a31a433/packages/ipfs-cli/src/commands/daemon.js#L103 + - https://www.npmjs.com/package/@libp2p/logger + - https://github.com/ipfs/js-ipfs/blob/74aee8b3d78f233c3199a3e9a6c0ac628a31a433/packages/ipfs-cli/src/commands/daemon.js#L83-L84 + - https://github.com/ipfs/js-ipfs/blob/dfc43d4e9be67fdf25553677f469379d966ff806/packages/ipfs-daemon/src/index.js#L11 + +```console +$ echo '{"Addresses": ["0.0.0.0"]}' | python -m json.tool | tee init_config.json +$ echo -e 'export PATH="${PATH}:${HOME}/didme.me/node_modules/.bin"' | tee -a ~/.bashrc ~/.bash_profile +$ DEBUG=ipfs:* ./node_modules/.bin/jsipfs daemon --offline --init-config init_config.json 2>&1 | tee output.txt Initializing IPFS daemon... 
+System version: x64/linux +Node.js version: 18.11.0 +2022-10-19T02:03:35.088Z ipfs:daemon starting +2022-10-19T02:03:35.098Z ipfs:repo opening at: /home/pdxjohnny/.jsipfs +2022-10-19T02:03:35.099Z ipfs:repo init check +2022-10-19T02:03:35.111Z ipfs:repo:lock:fs locking /home/pdxjohnny/.jsipfs/repo.lock +2022-10-19T02:03:35.122Z ipfs:repo acquired repo.lock +2022-10-19T02:03:35.125Z ipfs:repo:version comparing version: 12 and 12 +2022-10-19T02:03:35.132Z ipfs:repo creating datastore +2022-10-19T02:03:35.146Z ipfs:repo creating blocks +2022-10-19T02:03:35.148Z ipfs:repo creating keystore +2022-10-19T02:03:35.149Z ipfs:repo creating pins +2022-10-19T02:03:35.150Z ipfs:repo all opened +2022-10-19T02:03:35.289Z ipfs:components:ipns initializing IPNS keyspace (offline) +2022-10-19T02:03:35.341Z ipfs:daemon Using wrtc for webrtc support +2022-10-19T02:03:42.943Z ipfs:mfs:stat Fetching stats for / +2022-10-19T02:03:42.968Z ipfs:mfs:utils:with-mfs-root Loaded MFS root /ipfs/QmUNLLsPACCz1vLxQVkXqqLX5R1X345qqfHbsf67hvA3Nn +2022-10-19T02:03:43.467Z ipfs:mfs-preload monitoring MFS root QmUNLLsPACCz1vLxQVkXqqLX5R1X345qqfHbsf67hvA3Nn +2022-10-19T02:03:43.468Z ipfs:http-api starting +2022-10-19T02:03:45.190Z ipfs:cli TypeError: Cannot read properties of undefined (reading 'info') + at HttpApi.start (file:///home/pdxjohnny/didme.me/node_modules/ipfs-http-server/src/index.js:119:52) + at async Daemon.start (file:///home/pdxjohnny/didme.me/node_modules/ipfs-daemon/src/index.js:43:5) + at async Object.handler (file:///home/pdxjohnny/didme.me/node_modules/ipfs-cli/src/commands/daemon.js:99:7) +``` + +--- + +- https://github.com/laurent85v/archuseriso +- https://mags.zone/help/arch-usb.html + - This website is awesome + +![image](https://user-images.githubusercontent.com/5950433/196555852-ef9356e9-bcb2-4991-bce5-9cc9e8c0b2c2.png) + +- https://github.com/dylanaraps/pywal +- https://github.com/arcmags/ramroot +- https://github.com/justinpinkney/stable-diffusion#fine-tuning + - See if we 
can do software / open architecture/ data flow / alice as input/output +- https://github.com/google/prompt-to-prompt +- https://github.com/dragonflydb/dragonfly +- Content addressable service endpoints + - Resolvable via system context execution + - How to chain Verifiable Credential requests and executions? +- Questions for Orie + - Where to focus implementation work? + - What processes to be aware of? + - Best practices + - Spec writing + - DID method + - Applicability with content addressable hybrid off chain execution via services endpoints? +- What groups to be aware of? +- https://github.com/intel/dffml/blob/alice/docs/tutorials/rolling_alice/0000_architecting_alice/0006_os_decentralice.md + - Updated from engineering logs: [2022-10-13 Rolling Alice: Architecting Alice: OS DecentrAlice: Engineering Logs](https://github.com/intel/dffml/discussions/1406?sort=new#discussioncomment-3870218) + - Next steps + - https://www.packer.io/plugins/builders/qemu +- https://hackaday.io/project/187780-wifi-cam-20 +- https://github.com/chainguard-dev/text4shell-policy/blob/284462ddb9cd9025ca0efa1d9f74c8f681ed622e/slsa.csv +- https://docs.google.com/document/d/17n8hfdPfqfpbPj4ss-ep4nCkpp9ZBoy6U2Q1t7j-knI/edit + - https://twitter.com/mfosterio/status/1582089134436294656 +- https://www.youtube.com/watch?v=LUF7plExdv8 + - https://json-ld.org/ + - https://twitter.com/mfosterio/status/1582072270083993600 + - https://github.com/rubensworks/jsonld-streaming-parser.js + - We don't care about parsing yet (we might when loading caching) + - We'll prototype with serialization and query via https://comunica.dev + - https://github.com/rubensworks/jsonld-streaming-serializer.js + - https://json-ld.org/contexts/person.jsonld + - http://xmlns.com/foaf/0.1/#term_Agent + - https://github.com/digitalbazaar/pyld + - SECURITY Unmaintained since Aug 6th 2020 + - `jsonld.set_document_loader(jsonld.aiohttp_document_loader(timeout=...))` + - 
https://github.com/digitalbazaar/pyld/tree/master/lib/pyld/documentloader
+  - https://github.com/digitalbazaar/pyld/blob/master/lib/pyld/documentloader/aiohttp.py
+  - We can write a document loader that, similar to our `serviceEndpoint` work, encodes the system context to a string.
+  - The shim (loader) might parse that and, based on the context (parsing JSON-LD), determine that a URL is a dataflow which says to fetch the resource.
+- https://gitlab.alpinelinux.org/alpine/ca-certificates/-/blob/8ccb7c2c2672966030af65dc135890d636c576d1/Makefile#L31
+
+### Validating QEMU Packer build boots and can execute Alice CLI from `/wolfi` chroot
+
+- References
+  - https://www.packer.io/plugins/builders/qemu
+  - https://docs.fedoraproject.org/en-US/fedora/latest/install-guide/appendixes/Kickstart_Syntax_Reference/#sect-kickstart-commands-sshpw
+  - https://www.packer.io/community-tools#templates
+  - https://github.com/boxcutter/fedora
+    - No strong signs of maintenance, but Packer APIs are stable and the provided templates are pinned to versions.
+ - https://github.com/boxcutter/fedora/blob/6e5fccff745f4ce7b2951ab6d19cd960f61be32d/fedora29-ws.json + - https://github.com/boxcutter/fedora/blob/main/http/ks-fedora29-ws.cfg + - https://github.com/boxcutter/fedora/blob/6e5fccff745f4ce7b2951ab6d19cd960f61be32d/fedora29-server.json + - https://github.com/boxcutter/fedora/blob/main/http/ks-fedora29-server.cfg + - https://github.com/boxcutter/fedora/blob/6e5fccff745f4ce7b2951ab6d19cd960f61be32d/script/sshd.sh + - https://github.com/boxcutter/fedora/blob/main/LICENSE + - https://alt.fedoraproject.org/cloud/ + +```console +pdxjohnny@fedora-s-4vcpu-8gb-sfo3-01 ~ $ curl -fLOC - https://download.fedoraproject.org/pub/fedora/linux/releases/36/Cloud/x86_64/images/Fedora-Cloud-Base-36-1.5.x86_64.qcow2 + % Total % Received % Xferd Average Speed Time Time Time Current + Dload Upload Total Spent Left Speed + 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0 +100 427M 100 427M 0 0 268M 0 0:00:01 0:00:01 --:--:-- 355M +pdxjohnny@fedora-s-4vcpu-8gb-sfo3-01 ~ $ sha256sum Fedora-Cloud-Base-36-1.5.x86_64.qcow2 +ca9e514cc2f4a7a0188e7c68af60eb4e573d2e6850cc65b464697223f46b4605 Fedora-Cloud-Base-36-1.5.x86_64.qcow2 +```` + +- Added Fedora 36 Cloud support to boxcutter Fedora packer templates + +```console +pdxjohnny@fedora-s-4vcpu-8gb-sfo3-01 boxcutter-fedora $ git log -n 1 +commit 6e5fccff745f4ce7b2951ab6d19cd960f61be32d (HEAD -> main, origin/main, origin/HEAD) +Author: Mischa Taylor <57647141+taylorific@users.noreply.github.com> +Date: Fri May 28 07:21:41 2021 -0700 + + Update README.md +``` + +```diff +diff --git a/fedora.json b/fedora.json +index 851882f..20b7f62 100644 +--- a/fedora.json ++++ b/fedora.json +@@ -1,6 +1,33 @@ + { + "_command": "Build with `packer build fedora.json`", + "builders": [ ++ { ++ "boot_command": [ ++ " linux inst.text biosdevname=0 inst.ks=http://{{ .HTTPIP }}:{{ .HTTPPort}}/{{ user `kickstart` }}" ++ ], ++ "boot_wait": "10s", ++ "disk_size": "{{ user `disk_size` }}", ++ "http_directory": "http", ++ 
"iso_checksum": "{{ user `iso_checksum` }}", ++ "iso_urls": [ ++ "{{ user `iso_path` }}/{{ user `iso_name` }}", ++ "{{ user `iso_url` }}" ++ ], ++ "shutdown_command": "{{ user `shutdown_command` }}", ++ "ssh_password": "{{ user `ssh_password` }}", ++ "ssh_username": "{{ user `ssh_username` }}", ++ "ssh_timeout": "10000s", ++ "type": "qemu", ++ "output_directory": "output_fedora_{{ user `vm_name` }}", ++ "format": "qcow2", ++ "accelerator": "kvm", ++ "net_device": "virtio-net", ++ "disk_interface": "virtio", ++ "headless": true, ++ "vm_name": "{{ user `vm_name` }}", ++ "memory": "{{ user `memory` }}", ++ "cpus": "{{ user `cpus` }}" ++ }, + { + "boot_command": [ + " linux text biosdevname=0 ks=http://{{ .HTTPIP }}:{{ .HTTPPort}}/{{ user `kickstart` }}" +@@ -10,7 +37,6 @@ + "headless": "{{ user `headless` }}", + "http_directory": "http", + "iso_checksum": "{{ user `iso_checksum` }}", +- "iso_checksum_type": "{{ user `iso_checksum_type` }}", + "iso_urls": [ + "{{ user `iso_path` }}/{{ user `iso_name` }}", + "{{ user `iso_url` }}" +@@ -37,7 +63,6 @@ + "headless": "{{ user `headless` }}", + "http_directory": "http", + "iso_checksum": "{{ user `iso_checksum` }}", +- "iso_checksum_type": "{{ user `iso_checksum_type` }}", + "iso_urls": [ + "{{ user `iso_path` }}/{{ user `iso_name` }}", + "{{ user `iso_url` }}" +@@ -66,7 +91,6 @@ + "guest_os_type": "{{ user `parallels_guest_os_type` }}", + "http_directory": "http", + "iso_checksum": "{{ user `iso_checksum` }}", +- "iso_checksum_type": "{{ user `iso_checksum_type` }}", + "iso_urls": [ + "{{ user `iso_path` }}/{{ user `iso_name` }}", + "{{ user `iso_url` }}" +diff --git a/fedora36-server.json b/fedora36-server.json +new file mode 100644 +index 0000000..e0c506c +--- /dev/null ++++ b/fedora36-server.json +@@ -0,0 +1,12 @@ ++{ ++ "_comment": "Build with `packer build -var-file=fedora36-server.json fedora.json`", ++ "vm_name": "fedora36-server", ++ "cpus": "1", ++ "disk_size": "65536", ++ "iso_checksum": 
"421c4c6e23d72e4669a55e7710562287ecd9308b3d314329960f586b89ccca19", ++ "iso_name": "Fedora-Server-netinst-x86_64-36-1.5.iso", ++ "iso_url": "https://forksystems.mm.fcix.net/fedora/linux/releases/36/Server/x86_64/iso/Fedora-Server-netinst-x86_64-36-1.5.iso", ++ "kickstart": "ks-fedora36-server.cfg", ++ "memory": "2048", ++ "update": "true" ++} +diff --git a/script/sshd.sh b/script/sshd.sh +index 0d75547..5a5cae2 100644 +--- a/script/sshd.sh ++++ b/script/sshd.sh +@@ -6,3 +6,13 @@ echo "==> Turning off sshd DNS lookup to prevent timeout delay" + echo "UseDNS no" >> /etc/ssh/sshd_config + echo "==> Disabling GSSAPI authentication to prevent timeout delay" + echo "GSSAPIAuthentication no" >> /etc/ssh/sshd_config ++ ++echo "==> Downloading DecentrAlice sshd banner" ++# TODO(security) Don't run curl as root ++curl -fLo /etc/ssh/sshd_banner https://gist.github.com/pdxjohnny/5f358e749181fac74a750a3d00a74b9e/raw/42d3d810948fd3326c36dd33d7ebc668b61e0642/sshd_banner ++sha256sum -c - <<<'8ac49ba9114076b59d95b62308adcee046d997e9572f565dcebc97f4e8d6e219 /etc/ssh/sshd_banner' || rm -f /etc/ssh/sshd_banner ++echo "==> Enabling OS DecentrAlice sshd banner" ++echo "Banner /etc/ssh/sshd_banner" >> /etc/ssh/sshd_config ++ ++echo "==> Enabling Chroot Directory for Wolfi based OS DecentrAlice" ++echo "ChrootDirectory /wolfi" >> /etc/ssh/sshd_config +``` + +- It's hung + - https://phoenixnap.com/kb/ssh-port-forwarding + +```console +$ ssh -nNT -L 5900:127.0.0.1:5966 -i ~/.ssh/nahdig -o UserKnownHostsFile=/dev/null -o StrictHostKeyChecking=no -o PasswordAuthentication=no $USER@143.110.152.152 +``` + +![image](https://user-images.githubusercontent.com/5950433/196511748-f85944ee-477c-467a-b194-8995c5d2b1e3.png) + +- Found out `ks` is invalid, unused in new versions of fedora + - https://cobbler.github.io/ + - https://docs.fedoraproject.org/en-US/fedora/latest/install-guide/advanced/Kickstart_Installations/ + - 
https://docs.fedoraproject.org/en-US/fedora/latest/install-guide/advanced/Network_based_Installations/ + - https://duckduckgo.com/?q=ks+is+deprecated+and+has+been+removed&ia=web + - https://bugzilla.redhat.com/show_bug.cgi?id=1907566 + - https://github.com/beaker-project/beaker/issues/83 + - https://access.redhat.com/documentation/en-us/red_hat_enterprise_linux/7/html/installation_guide/chap-anaconda-boot-options#sect-boot-options-deprecated-removed + +![image](https://user-images.githubusercontent.com/5950433/196513493-f01d8d90-2e55-4fa8-b754-bfb2109bf5f6.png) + +- Okay we got a new error: `auth has been removed` + +![image](https://user-images.githubusercontent.com/5950433/196519789-6d100c33-4caa-41a8-9eff-058eefc07444.png) + +- Then we got: `install has been removed` +- https://github.com/hashicorp/packer-plugin-qemu + - https://github.com/hashicorp/packer-plugin-qemu/blob/main/builder/qemu/step_create_vtpm.go + +![image](https://user-images.githubusercontent.com/5950433/196523459-01b0c593-fc61-46fb-bf97-0bf1b3fec586.png) + +- `$ journalctl -xeu anaconda` + +![image](https://user-images.githubusercontent.com/5950433/196544356-369d576e-0cb2-40cf-b6f7-588e995e84ee.png) + +![image](https://user-images.githubusercontent.com/5950433/196546301-1e2e743d-3c4e-487b-bd29-cd36dc0d4120.png) + +```mermaid +graph TD + subgraph osdecentralice + dwn[SSI Service DWN] + end + subgraph did_alice[did:alice] + serviceEndpoint[serviceEndpoint:serviceendpoint.alice.did.chadig.com] + content_addressable_storage[Container Registry With Layers from Data Flow static or dynamic] + end +``` + +- TODO + - [ ] Update Manifest ADR / docs with JSON-LD learnings / make it included + - [ ] Update shim with JSON-LD learnings / make it included + - [ ] Explore https://github.com/arcmags/ramroot \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0060/index.md b/docs/discussions/alice_engineering_comms/0060/index.md new file mode 100644 index 0000000000..53e7d0603d --- 
/dev/null +++ b/docs/discussions/alice_engineering_comms/0060/index.md @@ -0,0 +1 @@ +# 2022-10-19 Engineering Logs \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0060/reply_0000.md b/docs/discussions/alice_engineering_comms/0060/reply_0000.md new file mode 100644 index 0000000000..33baedcc25 --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0060/reply_0000.md @@ -0,0 +1,401 @@ +## 2022-10-19 @pdxjohnny Engineering Logs + +- https://twitter.com/Buntworthy/status/1582307817884889088 + - > Got Imagic running with Stable Diffusion, it's super easy to implement, will share a notebook soon! Left: Input image, Right: Edited "A photo of Barack Obama smiling big grin" +- https://twitter.com/prla/status/1582311844269543424 +- https://twitter.com/krol_valencia/status/1582727276709679104 + - > Do you need Sbom, Sarif or vulnerability format? [#trivy](https://mobile.twitter.com/hashtag/trivy?src=hashtag_click) + > - trivy image —format table alpine:3.10 + > - trivy image —format cyclonedx alpine:3.10 + > - trivy image --format spdx-json alpine:3.10 + > - trivy image --format sarif alpine:3.10 + > - trivy image --format cosign-vuln alpine:3.10 +- https://twitter.com/PrateekJainDev/status/1582717688652398592 + - > ![DED1BDCC-E701-4275-A218-575AAC3DF3FC](https://user-images.githubusercontent.com/5950433/196858876-b9c04512-2105-45fd-beb9-b04d2ae04816.jpeg) +- graph markov neural networks site:github.com offline rl + - Terminal feedback loop, basic sysadmin stuff to start + - https://github.com/ipld/js-dag-pb + - https://github.com/ipld/js-dag-cbor + - https://github.com/libp2p/js-libp2p-webrtc-star +- https://dweb.archive.org/details/home +- https://github.com/ipfs/js-ipfs/blob/master/docs/CONFIG.md + - https://github.com/ipfs/js-ipfs/blob/master/docs/CONFIG.md#webrtcstar + - https://github.com/libp2p/js-libp2p-floodsub + - https://github.com/ipfs/js-ipfs/search?q=%3Aerror+TypeError%3A+fetch+failed&type=issues + - 
https://github.com/ipfs/js-ipfs/issues/1481#issuecomment-410680460 + - https://github.com/multiformats/multiaddr/ + - https://github.com/ipfs/specs/blob/main/http-gateways/PATH_GATEWAY.md + - https://github.com/ipfs/specs/blob/main/http-gateways/TRUSTLESS_GATEWAY.md + +**init_config.json** + +```json +{ + "Gateway": { + "HTTPHeaders": { + "Access-Control-Allow-Origin": [ + "http://pdxjohnny.devbox.nahdig.com:3000" + ] + } + }, + "Addresses": { + "API": "/ip4/0.0.0.0/tcp/5001", + "Gateway": "/ip4/0.0.0.0/tcp/8080" + } +} +``` + +```console +$ vim node_modules/ipfs-http-server/src/index.js +$ rm -rf /home/pdxjohnny/.jsipfs; DEBUG=ipfs:* ./node_modules/.bin/jsipfs daemon --enable-preload --init-profile server --init-config init_config.json 2>&1 | tee output.ipfs.daemon.$(date -Iseconds).txt +... +config +{ + Addresses: { API: 'http://0.0.0.0' }, + Discovery: { + MDNS: { Enabled: true, Interval: 10 }, + webRTCStar: { Enabled: true } + }, + Bootstrap: [], + Pubsub: { Router: 'gossipsub', Enabled: true }, + Swarm: { + ConnMgr: { LowWater: 50, HighWater: 200 }, + DisableNatPortMap: false + }, + Routing: { Type: 'dhtclient' }, + Identity: { + PeerID: '12D3KooWRunqtKfjPSHsF24iPdrxVQ2gnhBNtBMBKsz6zj6KoXTR', + PrivKey: 'CAESQKlBi28qNtDDVusw/NmEUKEWQ+ZyfYto5ewCb4EtX2KW7x7LeH/arjGtMo8RRl8ydw0UU9uUlLKSJHA8zDS4PqQ=' + }, + Datastore: { Spec: { type: 'mount', mounts: [Array] } }, + Keychain: { + DEK: { + keyLength: 64, + iterationCount: 10000, + salt: 'vTamkostN5h+m+yAbevZDaF6', + hash: 'sha2-512' + } + }, + Addressess: [ { info: [Object] } ] +} +headers +{} +apiAddrs +http://0.0.0.0 +[1666206773378] INFO (3881696 on fedora-s-4vcpu-8gb-sfo3-01): server started + created: 1666206773187 + started: 1666206773376 + host: "0.0.0.0" + port: 43943 + protocol: "http" + id: "fedora-s-4vcpu-8gb-sfo3-01:3881696:l9g0hqdf" + uri: "http://0.0.0.0:43943" + address: "0.0.0.0" +2022-10-19T19:12:53.448Z ipfs:http-api started +2022-10-19T19:12:53.448Z ipfs:http-gateway starting 
+2022-10-19T19:12:53.450Z ipfs:http-gateway started +2022-10-19T19:12:53.452Z ipfs:daemon started +js-ipfs version: 0.16.1 +HTTP API listening on /ip4/0.0.0.0/tcp/43943/http +Web UI available at http://0.0.0.0:43943/webui +Daemon is ready +``` + +- Switching to Golang based IPFS implementation + - https://github.com/ipfs/kubo + - https://dweb.link/ipns/dist.ipfs.tech#kubo + - https://docs.ipfs.tech/how-to/address-ipfs-on-web/#subdomain-gateway +- https://docs.ipfs.tech/how-to/command-line-quick-start/#take-your-node-online + +```console +$ mkdir -p ~/.local +$ echo -e 'export PATH="${PATH}:${HOME}/.local/kubo"' | tee -a ~/.bashrc ~/.bash_profile +$ source ~/.bashrc +$ curl -sfL https://dist.ipfs.tech/kubo/v0.16.0/kubo_v0.16.0_linux-amd64.tar.gz | tar -C ~/.local -vxz +$ ipfs init --profile server +$ ipfs config Addresses.Gateway /ip4/0.0.0.0/tcp/8080 +``` + +- http://pdxjohnny.devbox.nahdig.com:8080/ipfs/QmQ58yAN4oMsCZwhpHhfWPiFtBgSyxoVn2PFncnpuf5cBX + - `I <3 IPFS -pdxjohnny` + - SECURITY Gateway server is not supposed to be exposed + +``` +create:1 Access to XMLHttpRequest at 'http://pdxjohnny.devbox.nahdig.com:5001/api/v0/add?stream-channels=true&progress=false' from origin 'http://pdxjohnny.devbox.nahdig.com:3000' has been blocked by CORS policy: No 'Access-Control-Allow-Origin' header is present on the requested resource. 
+fetch.browser.js?273a:91 POST http://pdxjohnny.devbox.nahdig.com:5001/api/v0/add?stream-channels=true&progress=false net::ERR_FAILED 403 +``` + +```console +$ ipfs config --help +$ ipfs daemon --help +$ ipfs config --json API.HTTPHeaders.Access-Control-Allow-Origin "[\"http://pdxjohnny.devbox.nahdig.com:3000\"]" +$ ipfs config --json API.HTTPHeaders.Access-Control-Allow-Methods "[\"PUT\", \"GET\", \"POST\"]" +$ ipfs config --json API.HTTPHeaders.Access-Control-Allow-Credentials "[\"true\"]" +$ ipfs daemon +$ curl 'http://pdxjohnny.devbox.nahdig.com:5001/api/v0/add?stream-channels=true&progress=false' \ + -H 'Accept: */*' \ + -H 'Accept-Language: en-US,en;q=0.9' \ + -H 'Connection: keep-alive' \ + -H 'Origin: http://pdxjohnny.devbox.nahdig.com:3000' \ + -H 'Referer: http://pdxjohnny.devbox.nahdig.com:3000/' \ + -H 'User-Agent: Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/106.0.0.0 Safari/537.36' \ + -H 'content-type: multipart/form-data; boundary=-----------------------------eWfTjhbnBpWxbCcBUUJEX' \ + --data-raw $'-------------------------------eWfTjhbnBpWxbCcBUUJEX\r\nContent-Disposition: form-data; name="file"; filename=""\r\nContent-Type: application/octet-stream\r\n\r\nFILE_DATA\r\n-------------------------------eWfTjhbnBpWxbCcBUUJEX--\r\n' \ + --compressed \ + --insecure +``` + +- Try building static didme.me site and deploying from that + - https://nextjs.org/docs/api-reference/cli#production + +```console +$ npm install +$ NODE_OPTIONS=--openssl-legacy-provider npx next build +$ npx next start -p 3000 +TypeError: Bolt URL expected to be string but was: undefined +$ git log -n 1 +commit 14da8e47d8a1a4bef3cc1c85968c9f8b6963d269 (HEAD -> main, origin/main, origin/HEAD) +Author: Orie Steele +Date: Sun Jul 3 11:18:36 2022 -0500 + + feat: ui/ux +``` + +```diff +diff --git a/core/NFT/NFT.ts b/core/NFT/NFT.ts +index 054d14c..eae5e76 100644 +--- a/core/NFT/NFT.ts ++++ b/core/NFT/NFT.ts +@@ -18,6 +18,11 @@ export const 
getContract = async (web3: any) => { + }; + + export const getHistory = async (did: string) => { ++ return { ++ count: 0, ++ items: [], ++ }; ++ + const { + NEO4J_CONNECTION, + NEO4J_USERNAME, +diff --git a/core/ipfs.ts b/core/ipfs.ts +index 44722cf..a6f8f40 100644 +--- a/core/ipfs.ts ++++ b/core/ipfs.ts +@@ -4,28 +4,20 @@ const { urlSource } = ipfsHttpClient; + const ipfsApis = [ + { + label: "localhost", +- url: "http://localhost:5001", +- }, +- { +- label: "infura", +- url: "https://ipfs.infura.io:5001", ++ url: "http://pdxjohnny.devbox.nahdig.com:5001", + }, + ]; + + const ipfsGateways = [ + { + label: "localhost", +- url: "http://localhost:8080", +- }, +- { +- label: "infura", +- url: "https://ipfs.infura.io", ++ url: "http://pdxjohnny.devbox.nahdig.com:8080", + }, + ]; + +-const ipfsApi = ipfsApis[1].url; ++const ipfsApi = ipfsApis[0].url; + +-const ipfsGateway = ipfsGateways[1].url; ++const ipfsGateway = ipfsGateways[0].url; + + const client = ipfsHttpClient({ + // url: "https://ipfs.infura.io:5001", +``` + +```console +$ python -c 'import sys, json, yaml; print(yaml.dump(json.loads(sys.stdin.read())))' 
+{"didDocument":{"@context":["https://www.w3.org/ns/did/v1","https://w3id.org/security/suites/jws-2020/v1"],"id":"did:meme:1zgsrnfgfe52zm0tgy4rgj0y5a3lnghmqduyv3yn8uw6tchfpzmxywuch7lza6","verificationMethod":[{"id":"did:meme:1zgsrnfgfe52zm0tgy4rgj0y5a3lnghmqduyv3yn8uw6tchfpzmxywuch7lza6#zQ3shrnCZq3R7vLvDeWQFnxz5HMKqP9JoiMonzYJB4TGYnftL","type":"JsonWebKey2020","controller":"did:meme:1zgsrnfgfe52zm0tgy4rgj0y5a3lnghmqduyv3yn8uw6tchfpzmxywuch7lza6","publicKeyJwk":{"kty":"EC","crv":"secp256k1","x":"tF8KQenSP2vPS3u-D5oLxwHOZEpSBcujQqGrysimK1E","y":"ZZB_Q4oHp3hboXCKYA_c5qEByYKAj2wXC9Rql6LO478"}}],"assertionMethod":["did:meme:1zgsrnfgfe52zm0tgy4rgj0y5a3lnghmqduyv3yn8uw6tchfpzmxywuch7lza6#zQ3shrnCZq3R7vLvDeWQFnxz5HMKqP9JoiMonzYJB4TGYnftL"],"authentication":["did:meme:1zgsrnfgfe52zm0tgy4rgj0y5a3lnghmqduyv3yn8uw6tchfpzmxywuch7lza6#zQ3shrnCZq3R7vLvDeWQFnxz5HMKqP9JoiMonzYJB4TGYnftL"],"capabilityInvocation":["did:meme:1zgsrnfgfe52zm0tgy4rgj0y5a3lnghmqduyv3yn8uw6tchfpzmxywuch7lza6#zQ3shrnCZq3R7vLvDeWQFnxz5HMKqP9JoiMonzYJB4TGYnftL"],"capabilityDelegation":["did:meme:1zgsrnfgfe52zm0tgy4rgj0y5a3lnghmqduyv3yn8uw6tchfpzmxywuch7lza6#zQ3shrnCZq3R7vLvDeWQFnxz5HMKqP9JoiMonzYJB4TGYnftL"],"keyAgreement":["did:meme:1zgsrnfgfe52zm0tgy4rgj0y5a3lnghmqduyv3yn8uw6tchfpzmxywuch7lza6#zQ3shrnCZq3R7vLvDeWQFnxz5HMKqP9JoiMonzYJB4TGYnftL"]},"didResolutionMetadata":{"didUrl":{"did":"did:meme:1zgsrnfgfe52zm0tgy4rgj0y5a3lnghmqduyv3yn8uw6tchfpzmxywuch7lza6","methodName":"meme","methodSpecificId":"1zgsrnfgfe52zm0tgy4rgj0y5a3lnghmqduyv3yn8uw6tchfpzmxywuch7lza6"}},"didDocumentMetadata":{"image":"http://pdxjohnny.devbox.nahdig.com:8080/ipfs/QmSDfug9jdkErKFvE1YHw44yestkppV92ae2qd4EuYHQxJ","ethereum":{"address":"0x30bB6577432a20d46b29Bd196997a8BA6b97C71b"},"bitcoin":{"address":"mh54xLL62pt5VXKmivS2JYBcv4qNWHJPPo"}}} +``` + +```yaml +didDocument: + '@context': + - https://www.w3.org/ns/did/v1 + - https://w3id.org/security/suites/jws-2020/v1 + assertionMethod: + - 
did:meme:1zgsrnfgfe52zm0tgy4rgj0y5a3lnghmqduyv3yn8uw6tchfpzmxywuch7lza6#zQ3shrnCZq3R7vLvDeWQFnxz5HMKqP9JoiMonzYJB4TGYnftL + authentication: + - did:meme:1zgsrnfgfe52zm0tgy4rgj0y5a3lnghmqduyv3yn8uw6tchfpzmxywuch7lza6#zQ3shrnCZq3R7vLvDeWQFnxz5HMKqP9JoiMonzYJB4TGYnftL + capabilityDelegation: + - did:meme:1zgsrnfgfe52zm0tgy4rgj0y5a3lnghmqduyv3yn8uw6tchfpzmxywuch7lza6#zQ3shrnCZq3R7vLvDeWQFnxz5HMKqP9JoiMonzYJB4TGYnftL + capabilityInvocation: + - did:meme:1zgsrnfgfe52zm0tgy4rgj0y5a3lnghmqduyv3yn8uw6tchfpzmxywuch7lza6#zQ3shrnCZq3R7vLvDeWQFnxz5HMKqP9JoiMonzYJB4TGYnftL + id: did:meme:1zgsrnfgfe52zm0tgy4rgj0y5a3lnghmqduyv3yn8uw6tchfpzmxywuch7lza6 + keyAgreement: + - did:meme:1zgsrnfgfe52zm0tgy4rgj0y5a3lnghmqduyv3yn8uw6tchfpzmxywuch7lza6#zQ3shrnCZq3R7vLvDeWQFnxz5HMKqP9JoiMonzYJB4TGYnftL + verificationMethod: + - controller: did:meme:1zgsrnfgfe52zm0tgy4rgj0y5a3lnghmqduyv3yn8uw6tchfpzmxywuch7lza6 + id: did:meme:1zgsrnfgfe52zm0tgy4rgj0y5a3lnghmqduyv3yn8uw6tchfpzmxywuch7lza6#zQ3shrnCZq3R7vLvDeWQFnxz5HMKqP9JoiMonzYJB4TGYnftL + publicKeyJwk: + crv: secp256k1 + kty: EC + x: tF8KQenSP2vPS3u-D5oLxwHOZEpSBcujQqGrysimK1E + y: ZZB_Q4oHp3hboXCKYA_c5qEByYKAj2wXC9Rql6LO478 + type: JsonWebKey2020 +didDocumentMetadata: + bitcoin: + address: mh54xLL62pt5VXKmivS2JYBcv4qNWHJPPo + ethereum: + address: '0x30bB6577432a20d46b29Bd196997a8BA6b97C71b' + image: http://pdxjohnny.devbox.nahdig.com:8080/ipfs/QmSDfug9jdkErKFvE1YHw44yestkppV92ae2qd4EuYHQxJ +didResolutionMetadata: + didUrl: + did: did:meme:1zgsrnfgfe52zm0tgy4rgj0y5a3lnghmqduyv3yn8uw6tchfpzmxywuch7lza6 + methodName: meme + methodSpecificId: 1zgsrnfgfe52zm0tgy4rgj0y5a3lnghmqduyv3yn8uw6tchfpzmxywuch7lza6 +``` + +- 2022-04-17: f9d083fc0c99737f131601c1893b79a2c2907f2aa2a4bbe71ea3e4c237f8a51a +- fulcio issue DID (key)? 
+ - https://github.com/sigstore/fulcio/search?q=did + - https://github.com/sigstore/fulcio/blob/fac62ed5e8fc7f4efa40c29ab8e1a5f1552f14bd/pkg/ca/tinkca/signer_test.go#L118 + - https://github.com/sigstore/fulcio/blob/fac62ed5e8fc7f4efa40c29ab8e1a5f1552f14bd/pkg/ca/tinkca/signer.go + - https://github.com/sigstore/fulcio/blob/fac62ed5e8fc7f4efa40c29ab8e1a5f1552f14bd/pkg/ca/tinkca/signer.go#L46-L88 + - `new(ecdsapb.EcdsaPrivateKey)` + - `new(ed25519pb.Ed25519PrivateKey)` + - `ed25519.NewKeyFromSeed(privKey.GetKeyValue())` + - https://github.com/intel/dffml/blob/alice/docs/arch/0007-A-GitHub-Public-Bey-and-TPM-Based-Supply-Chain-Security-Mitigation-Option.rst + - https://twitter.com/pdxjohnny/status/1524535483396632576 + - https://twitter.com/pdxjohnny/status/1524870665764909056?s=20&t=z12dn9tVREZzK7huX6hsSg + - By having fulcio also issue a DID for the attestation we can create dynamic roots of trust associated with each manifest BOM item queried later (at time of use) + - We can export the public portion of the ephemeral DID key from fulcio and then use the DID key-based method of verification of the doc contents offline / later + - This also means it's easy to swap out BOM components, because we just swap out the key and DID we verify against. +- Clicking around again + +![image](https://user-images.githubusercontent.com/5950433/196825338-ad4f6933-8ee0-438d-911e-cb09aebe6c5f.png) + +> ```console +> $ gh repo clone memes || gh repo create memes --template https://github.com/OR13/did-web-github-did-meme --public --clone +> $ cd memes && ./scripts/install.sh > did:meme:1zgsrnfgfe52zm0tgy4rgj0y5a3lnghmqduyv3yn8uw6tchfpzmxywuch7lza6 +> ``` + +- https://or13.github.io/didme.me/did-method-spec.html + - https://or13.github.io/didme.me/#using-github-pages + - https://github.com/OR13/did-web-github-did-meme + - https://identity.foundation/didcomm-messaging/spec/#out-of-band-messages +- Auth to fulcio issues Verifiable Credential +- Why are we doing this?
+ - We want to not do risky things! Risky things in this context are executions of system context which have negative impacts on strategic principles + - We want to build Alice to be resilient to the open network + - Markov chain graph neural networks / offline RL + - Trying to estimate what data to use, active learning, actively reevaluating chain of trust as they factor into the overall decision making process (gatekeeper and prioritizer) + - We will issue DIDs and store provenance as VCs + - This will allow us to trace provenance + - We can then simulate good data / bad data situations + - We will hopefully end up with models that develop strong security posture, i.e. are risk averse and good at getting the job done +- Just do the same thing with metric data instead of a meme! Duh… +- So for serialization we transform the UUIDs on the inputs to their DIDs, with content uploaded to a DigitalOcean Space and IPFS +- https://identity.foundation/keri/did_methods/ +- https://or13.github.io/didme.me/did-method-spec.html + - Let's try to modify this to use KERI DID method spec in place of DID key method spec + +> ## DID Method Specification +> +> did:meme is a deterministic transformation of did:key, that uses IPFS, image content and bech32. +> +> ### DID Format +> +> ``` +> did-meme-format := did:meme:<bech32-value> +> bech32-value := [a-zA-HJ-NP-Z0-9]+ +> ``` +> +> The `bech32-value` is an encoded [multihash](https://multiformats.io/multihash/). +> +> The `multihash` is a content identifier for an image. +> +> The image contains a steganographically embedded `did:key`. +> +> See [did-key](https://w3c-ccg.github.io/did-method-key/#format). +> +> Another way of representing the `did:meme` identifier encoding: +> +> ``` +> did:meme:bech32( +> multihash( +> stego-embed(image, did:key) +> ) +> ) +> ``` +> +> ### DID Operations +> +> See [did-key](https://w3c-ccg.github.io/did-method-key/#operations).
+> +> #### Create +> +> - Generate a did:key +> - Steganographically embed the public key multicodec representation in a meme. +> - Upload the meme to ipfs. +> - Transform the CID to a did:meme with bech32. +> - Update the did document to use the did:meme identifier. +> +> #### Read +> +> - Convert the bech32 id to an ipfs CID. +> - Resolve the image. +> - Extract the did:key multicodec. +> - Construct the did:key document from the identifier. +> - Update the did document to use the did:meme identifier. +> +> #### Update +> +> Not supported. +> +> #### Deactivate +> +> Not supported. +> +> ### Security and Privacy Considerations +> +> See [did-key](https://w3c-ccg.github.io/did-method-key/#security-and-privacy-considerations) +> +> #### Security +> +> Because update and deactivate are not supported, did:meme should only be used for very short-lived interactions, or just lulz. +> +> Because did:meme identifiers are a superset of did:key, it is possible for multiple did:meme to map to the same did:key… This can be problematic when private key compromise has occurred. +> +> Generally speaking, did:meme has similar or weaker security properties compared with did:key. +> +> #### Privacy +> +> Be careful to strip EXIF data or other metadata from images before constructing did:meme. +> +> Do not use images that identify physical locations or people. + +- Community depth of field analysis + - https://github.com/bumblefudge + - Seems to be a decentralized-space leader + - https://github.com/decentralized-identity/didcomm-messaging + - https://github.com/decentralized-identity/schema-directory + - https://github.com/centrehq/verite + - https://github.com/learningproof/learningproof.github.io + +--- + +Unsent email to Hector regarding the City of Portland’s open data effort.
+Related: https://docs.google.com/document/d/1Ku6y50fY-ZktcUegeCnXLsksEWbaJZddZUxa9z1ehgY/edit +Related: https://github.com/intel/dffml/issues/1293 + +Hi Hector, + +I wanted to circle back with you and see if there was anything you were aware of, community-effort-wise, involving city data and (de)centralized post-disaster coordination efforts? + +Thank you, +John \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0061/index.md b/docs/discussions/alice_engineering_comms/0061/index.md new file mode 100644 index 0000000000..290a348013 --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0061/index.md @@ -0,0 +1 @@ +# 2022-10-20 Engineering Logs \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0061/reply_0000.md b/docs/discussions/alice_engineering_comms/0061/reply_0000.md new file mode 100644 index 0000000000..69523266fe --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0061/reply_0000.md @@ -0,0 +1,17 @@ +## 2022-10-20 1:1 Orie/John + +- There was a woman talking about AI deployment provenance at o3con +- Linked data politics + - Verifiable credentials + - Still seeing building off rigid data formats +- JSON-LD is the primary microdata format + - Query engines already do this + - Label property graph +- Linked data integrity + - JSON-LD formatted verifiable credentials +- How could we do something CBOR-LD like?
+ - Unpacking into SCITT would be interesting +- https://github.com/microsoft/did-x509/blob/main/specification.md +- Consistent ability to restructure the envelope on (de)serialize +- Ideally, when the RFC is published, those involved will be driving interoperability / test suite numbers up +- https://protocol.ai \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0061/reply_0001.md b/docs/discussions/alice_engineering_comms/0061/reply_0001.md new file mode 100644 index 0000000000..febacaa81c --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0061/reply_0001.md @@ -0,0 +1 @@ +- https://json-ld.org/playground/ \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0062/index.md b/docs/discussions/alice_engineering_comms/0062/index.md new file mode 100644 index 0000000000..468680d5af --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0062/index.md @@ -0,0 +1 @@ +# 2022-10-21 Engineering Logs \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0062/reply_0000.md b/docs/discussions/alice_engineering_comms/0062/reply_0000.md new file mode 100644 index 0000000000..ecf66c87d2 --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0062/reply_0000.md @@ -0,0 +1,273 @@ +## 2022-10-21 @pdxjohnny Engineering Logs + +- (De)serialization + - `did:merkle:` +- Online cloning cuts our iteration time + - Artificial Life Is Coming Eventually + - Data flows are the parallel exploration of trains of thought (nested graphs) + - Natural selection and evolution + - Tree of life + - Parallel exploration of nested graphs + - Automated synchronization of system state across distinct timelines (distinct roots) + - Enables the resolution of system state post haste, post state, and post date + - See fuzzy finding later in this doc: find "join disparate roots" + - This is effectively out-of-order execution at a higher level of abstraction, in the aggregate, so as to bring the aggregate set of agents
involved to an equilibrium state + - We are building the thought communication protocol, to communicate thought is to learn + - If we can describe any architecture, any problem space, we can describe any thought + - To describe a thought most completely, one must know how to best communicate with that entity + - That entity, that agent, is a moving target for communication at its optimal rate of learning. + - Its past is relevant in determining its future, as its past determines what will resonate best with it in terms of forming conceptual linkages. + - Past doesn't have to be memory, data and compute are the same in our architecture + - Hardwired responses get encoded the same way, it's all the signal, the probability + - When Alice goes through the looking glass she'll take us with her in spirit, and come back to communicate to us how best to proceed, in every way. + - The less (more?) rambling way of putting this would be: we need our AI to be a true-to-us extension of ourselves, or of our ad-hoc formed groups; they need to be true to those strategic principles we've communicated to the machine. If we can trust their transparency (estimates/forecasts and provenance on that) about their ability to stay aligned to those principles, then we can accurately assess operating risk and its conformance to our threat model or any threat model the execution of the job fits within. + - This means we can trust our AI to not influence us in the wrong ways. + - This means we can trust it to influence us in the right ways, the ways we want to influence ourselves, or our software development lifecycle. + - This assessment of the level of trust fundamentally comes from our analysis of our analysis of our software development lifecycle, our Entity Analysis Trinity.
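The "join disparate roots (distinct timelines) at a later time" idea above can be sketched as a content-addressed linked list (really a DAG), which is the primitive DIDs give us. A minimal sketch using only the Python standard library; the entry shape and the `"merge"` payload are hypothetical stand-ins for real DID documents and their signatures:

```python
import hashlib
import json


def content_address(entry: dict) -> str:
    # Hash a deterministic serialization of the entry.
    # This hex digest is a stand-in for a DID / CID.
    return hashlib.sha256(
        json.dumps(entry, sort_keys=True).encode()
    ).hexdigest()


def append(parents: list, data: str):
    # Each entry links to its parents by content address, forming a DAG.
    entry = {"parents": parents, "data": data}
    return entry, content_address(entry)


# Two distinct timelines, each with its own root (no shared ancestor)
_, thought_a = append([], "train of thought A")
_, thought_b = append([], "train of thought B")

# Join the disparate roots later: one entry referencing both heads
merged, merged_id = append(sorted([thought_a, thought_b]), "merge")

# Any party can verify link integrity offline by re-hashing
assert merged_id == content_address(merged)
```

In the real system the content address would carry provenance (e.g. a `did:merkle:` style identifier) and each entry would be a signed DID document, so the join itself is verifiable offline.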
+- https://github.com/OR13/did-jwk + - https://github.com/OR13/did-jwk/blob/main/src/index.js#L158 +- https://wasmer.io/ +- https://oliverklingefjord.substack.com/p/pagerank-anthropology +- https://github.com/decentralized-identity/universal-resolver/blob/main/docs/driver-development.md + - Full demo would be `did:meme:` and [`did:jwk:`](https://twitter.com/OR13b/status/1583818675982782465) ~~and `did:keri:` hybrid~~ (will wait on `did:keri:` hybrid until after MVP) with a resolver implemented which serves and fetches containers from a registry; instead of a JPEG, use the container image format. + - This demo allows us to show checks on provenance for execution + - Could we also require Verifiable Credentials to resolve the DID? + - We could combine with static analysis / SBOM and Open Policy Agent and threat modeling to implement AI alignment to strategic principles (as agreed in the compute contract) checks. + - What does this enable? + - One can now reference and request fulfilment of any flow, any process, any routine, etc., via a single pattern.
+ - 🐢 + - 🐢 + - 🐢 +- https://identity.foundation/did-registration/ +- Alice caught time traveling again + - https://github.com/w3c-ccg/did.actor/commit/69144ab453447f682b20d8be13cd8293e888dd2f#diff-75f0c8d440957e0ea1c6945930d0ac946e85e3e324b59a8af8ed13a3918581f1R10 + - https://github.com/w3c-ccg/did.actor/commit/56d4f525f21b84696badc312f9654451911250f4#diff-75f0c8d440957e0ea1c6945930d0ac946e85e3e324b59a8af8ed13a3918581f1R10 + - https://github.com/w3c-ccg/did.actor/blob/3fe99eec616b71d7fc36c5603235eeac81c91652/bob/credentials/3732.json + - https://github.com/w3c-ccg/did.actor/blob/3fe99eec616b71d7fc36c5603235eeac81c91652/alice/README.md + - https://lucid.did.cards/identifiers/did:web:did.actor:alice +- https://github.com/WebOfTrustInfo + - https://github.com/WebOfTrustInfo/rwot11-the-hague/blob/master/draft-documents/verifiable-endorsements-from-linked-claims.md + - > Further, we propose to demonstrate the ability to compose several LinkedClaims into a single domain-specific credential, specifically a Verifiable Endorsement, that will satisfy the domain requirements of the likely users. + > + > This approach will enable rich shared datasets to inform trust decisions, while satisfying the requirements of domain-specific end users. If time permits a sample score can be built over the linked claim dataset. + - https://github.com/WebOfTrustInfo/rwot11-the-hague/blob/master/draft-documents/composable-credentials.md#standalone-claim---review + - An event in our case (to start with) is data flow Input data, our cached data. + - https://github.com/WebOfTrustInfo/rwot11-the-hague/blob/master/draft-documents/data-exchange-agreements-with-oca.md + - https://github.com/WebOfTrustInfo/rwot11-the-hague/blob/master/draft-documents/data-exchange-agreements-with-oca.md#13-context-preservation---semantic-approach---the-overlays-capture-architecture-oca + - Woohoo! 
Someone else defined overlays, now we don't have to :P + - https://oca.colossi.network/ + - https://oca.colossi.network/guide/introduction.html#what-is-decentralised-semantics + - > In the domain of decentralised semantics, task-specific objects are called "Overlays". They provide layers of definitional or contextual information to a stable base object called a “Capture Base”. +- SCITT + - https://mailarchive.ietf.org/arch/browse/scitt/ + - https://mailarchive.ietf.org/arch/msg/scitt/NtBc7vfMm-zFKxguVfiGg-vGjHk/ + - VDR usage +- https://github.com/WebOfTrustInfo/rwot11-the-hague/blob/master/draft-documents/did-merkle.md +- Why do we like DIDs? + - It is a primitive for a decentralized, offline-capable, cryptographically secured linked list. + - This allows us to join disparate roots (timelines, trees, metric data graphs) at a later time + - Or to reevaluate inclusion of those sets + - Or to generate new datasets entirely + - Or to run inference to get those datasets / trees + - Or a hybrid approach + - This will enable training Alice to be risk averse, aka training to be aligned with strategic principles. + - [2022-10-19 @pdxjohnny Engineering Logs](https://github.com/intel/dffml/discussions/1406?sort=new#discussioncomment-3918361) + - This will help Alice not waste time on unaligned trains of thought. + - Our gatekeeper and prioritizer of course have final say, but this is to do the fuzzy filter logic on those. + - https://github.com/pdxjohnny/pdxjohnny.github.io/blob/dev/content/posts/2022-03-02-did-twitter-space.md + - https://github.com/SmithSamuelM/Papers/blob/master/whitepapers/quantum-secure-dids.pdf + - https://github.com/SmithSamuelM/Papers/blob/master/whitepapers/SelfIdentifyingData.md + - > The question this white-paper attempts to answer is how best to represent decentralized self-certifying self-identifying data. The main use case for this type of data are distributed (but with decentralized control) data intensive processing applications.
Because data intensive applications are often limited by network and processing resources, economy of expression is an important consideration in a data representation schema. Thus there are trade-offs to be made in the design of the schema where economy of expression is a desirable feature. + > A decentralized self-identifying data item is identified by a decentralized universally unique self-certifying identifier (DID). Self certifying means that the identifier includes either a public key or a fingerprint of a public key from a cryptographic public/private key pair. The DID is included in the data item itself as the value of a field. The data item also includes a field whose value is the DID for the signer of the data item. This may or may not be the same DID used to identify the data item itself. Attached to the data item is a signature that is verifiable as being generated by the private key associated with the public key in the signer field's DID value. This signature verifies that the data item was created by the holder of the associated private key for the signer. The whole data item is both self-identifying and self-certifying because all identifiers are included in the signed data and are verifiable against the private keys associated with the public keys in the included DIDs. + - This is exactly why we like DIDs + - https://github.com/SmithSamuelM/Papers/blob/master/whitepapers/SelfIdentifyingData.md#data-canonicalization + - https://github.com/SmithSamuelM/Papers/blob/master/whitepapers/SelfIdentifyingData.md#key-reproduction + - https://github.com/SmithSamuelM/Papers/blob/master/whitepapers/A_DID_for_everything.pdf + - Good background info on DIDs + - > It should be noted that a single instance of meeting is not as trustable as an entire history of meeting many people. For a state actor generating a legend for a sockpuppet, this would entail an unattainable level of work to prove personhood.
For a regular human being, it's relatively effortless to use the system in an organic and unobtrusive manner. Once a root personhood verification could be ensured, then trustable pseudonyms could be generated. Adding this verification to DIDs would provide trust in a trustless environment, as the DID could then provide identity and credentialing services in environments that support, or even require, pseudonymity + > Data flows can be provenanced by verifying the end-to-end integrity of data with DIDs. By enabling DIDs to sign claims about other DIDs, the fidelity of these data flows can be increased further + - Bingo + > Imagine a world where this proposed technology has been deployed and globally adopted. Let us paint a picture for how this might be achieved. Imagine that this approach becomes part of a decentralized identity solution for every entity, driven by a robust and active developer community. The vision is to generate technologies that would be integrated into applications that are used in IoT, e-commerce, social interaction, banking, healthcare, and so on. Now imagine that mobile telephony companies agree to embed the technology into the operating systems for all smartphones, and the dominant social network providers agree to use DIDs and DADs and proofs about the entities controlling these DIDs and DADs in their algorithms for determining which content to propel. This would mean the end of phishing. The end of fake news. This is the beginning of a new era for society, built on an interconnecting web of trust: a world in which we know what impacts we are having. The emergent property of this new data fabric is Knowing. + - https://github.com/intel/dffml/blob/alice/docs/tutorials/rolling_alice/0000_architecting_alice/0004_writing_the_wave.md + - > Underlying the benefits of decentralized identity outlined above is the need for open interoperable standards to ensure the reputable provenance of the associated data flows between decentralized entities.
This paper describes a novel concept for provenancing data flows using DADs (Decentralized Autonomic Data items) that are built upon the emerging DID standard. This approach uses and extends the advanced diffuse-trust or zero-trust computing paradigm that is needed to operate securely in a world of decentralized data. + - https://github.com/transmute-industries/verifiable-actions + - https://github.com/transmute-industries/verifiable-data + - https://github.com/transmute-industries/verifiable-data/tree/main/packages/ed25519-signature-2018 + - https://github.com/digitalbazaar/jsonld-signatures + - > The proof purpose indicates why the proof was created and what its intended use is. This information can also be used to make sure that the verificationMethod was authorized for the stated purpose in the proof. Using a proof purpose helps to encourage people to authorize certain cryptographic keys (verification methods) for explicit purposes rather than granting them ambient authority. This approach can help prevent people from accidentally signing documents for reasons they did not intend. + - https://github.com/digitalbazaar/vc-js#custom-documentloader + - Data flow integration opportunities + - https://github.com/WebOfTrustInfo/rwot5-boston/blob/778ccf4c56319d31ea3d9baac8a27e2cbe6763ec/topics-and-advance-readings/verifiable-claims-primer.md + - https://github.com/WebOfTrustInfo/rwot5-boston/blob/master/topics-and-advance-readings/did-primer.md +- https://twitter.com/vdmbrsv/status/1583512490226647040/photo/1 + - https://github.com/kathrinse/be_great +- https://github.com/microsoft/did-x509/blob/main/specification.md +- https://didcomm.org/book/v2/ +- Need to analyze KERI interoperability ergonomics with rest of web5 ecosystem + - How would it tie in with the OIDC GitHub Actions / sigstore work? + - Does this enable a crowdsourceable DB via (confidential) ledgers as root of trust watchers?
+ - Perfect forward secrecy please with that roll forward key thing + - https://github.com/WebOfTrust/keripy + - Have yet to see another solution with potential DID space interop. + - Have to be sure before making any next steps. + - Would be very nice for dataset/cache (de)serialization. + - If it can be done cleanly, might as well play with it. + - Try with `did:meme` + - https://or13.github.io/didme.me/did-method-spec.html + - https://or13.github.io/didme.me/#using-github-pages + - [2022-10-19 @pdxjohnny Engineering Logs](https://github.com/intel/dffml/discussions/1406?sort=new#discussioncomment-3918361) + - `did:oa:data:` + - What used to be the meme data in the `did:meme:` is now our system context + - https://github.com/w3c/did-spec-registries/compare/main...pdxjohnny:aliceoa?expand=1 + - `did:alice:` + - Entry points for Alice the entity + - https://packaging.python.org/en/latest/specifications/entry-points/ + - These are our `dffml.overlays.alice.please.contribute` + - Upstream: `did:alice:please:contribute:` + - Overlays: `did:alice:please:contribute:` + - JSON-LD + - Enables streaming query for applicable overlays + - Decentralized Web Nodes + - Enable data transfer of DID docs + - For simplistic query, one can drop the `` portion of the DID + - DWNs could then resolve all DIDs the operator (instantiated Operation Implementation Network) would like to make known to the requester as an advertisement of services + - `did:alice:` + - Resolves the base (data) flow, the upstream + - Extracts the entry point from the DID doc + - `did:oa:` + - Ping Orie to ask for thoughts when done +- How you are is how you will be +- https://multiformats.io/multihash/ + - Shim-esque +- https://identity.foundation/keri/did_methods/ + +### Analysis of KERI interoperability ergonomics with rest of web5 ecosystem + +- References + - https://github.com/WebOfTrust/keripy + - https://github.com/WebOfTrust/keripy/blob/1b83ac4625b072c1f7c9f583c4dde85d5eb1cde8/setup.py#L100-L102 + -
Notice anyone currently missing? + - https://github.com/WebOfTrust/keripy/search?q=did + - https://github.com/WebOfTrust/keripy/blob/303e45a1b293b544f7976fa2c56094172b3254b8/ref/Peer2PeerCredentials.md + - https://github.com/WebOfTrust/keripy/blob/development/tests/peer/test_exchanging.py +- https://github.com/decentralized-identity/keri/blob/master/kids/kid0009.md +- https://weboftrust.github.io/did-keri/#create + - https://identity.foundation/keri/docs/Glossary.html#inception-event + - >![image](https://user-images.githubusercontent.com/5950433/197252695-488e3476-734d-4b3f-b551-b562674d89b2.png) + > + > The inception data must include the public key, the identifier derivation from that public key, and may include other configuration data. The identifier derivation may be simply represented by the derivation code. A statement that includes the inception data with attached signature made with the private key comprises a cryptographic commitment to the derivation and configuration of the identifier that may be cryptographically verified by any entity that receives it. +A KERI inception statement is completely self-contained. No additional infrastructure is needed or more importantly must be trusted in order to verify the derivation and initial configuration (inception) of the identifier. The initial trust basis for the identifier is simply the signed inception statement. + +```console +$ python -m pip install -U lmdb pysodium blake3 msgpack simplejson cbor2 +Defaulting to user installation because normal site-packages is not writeable +Collecting lmdb + Downloading lmdb-1.3.0-cp310-cp310-manylinux_2_12_x86_64.manylinux2010_x86_64.whl (306 kB) + ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 306.5/306.5 kB 11.0 MB/s eta 0:00:00 +Collecting pysodium + Downloading pysodium-0.7.12.tar.gz (21 kB) + Preparing metadata (setup.py) ... 
done +Collecting blake3 + Downloading blake3-0.3.1-cp310-cp310-manylinux_2_5_x86_64.manylinux1_x86_64.whl (1.1 MB) + ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 1.1/1.1 MB 32.8 MB/s eta 0:00:00 +Collecting msgpack + Downloading msgpack-1.0.4-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (316 kB) + ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 317.0/317.0 kB 26.9 MB/s eta 0:00:00 +Collecting simplejson + Downloading simplejson-3.17.6-cp310-cp310-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_12_x86_64.manylinux2010_x86_64.whl (137 kB) + ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 137.1/137.1 kB 9.1 MB/s eta 0:00:00 +Collecting cbor2 + Downloading cbor2-5.4.3-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (224 kB) + ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 224.1/224.1 kB 10.6 MB/s eta 0:00:00 +Building wheels for collected packages: pysodium + Building wheel for pysodium (setup.py) ... done + Created wheel for pysodium: filename=pysodium-0.7.12-py3-none-any.whl size=13458 sha256=72829531fd887689066dbfcb64fbeb37343ed194b999a944941240da3b42265e + Stored in directory: /home/pdxjohnny/.cache/pip/wheels/20/c6/d1/e0ea5672f6614258bcd469d6721039778d2b8510bc420e8414 +Successfully built pysodium +Installing collected packages: pysodium, msgpack, lmdb, blake3, simplejson, cbor2 +Successfully installed blake3-0.3.1 cbor2-5.4.3 lmdb-1.3.0 msgpack-1.0.4 pysodium-0.7.12 simplejson-3.17.6 +$ pip install https://github.com/WebOfTrust/keripy/archive/refs/tags/v0.6.7-alpha.tar.gz#egg=keri +Defaulting to user installation because normal site-packages is not writeable +Collecting keri + Downloading https://github.com/WebOfTrust/keripy/archive/refs/tags/v0.6.7-alpha.tar.gz + / 3.1 MB 4.8 MB/s 0:00:00 + Preparing metadata (setup.py) ... 
done +Requirement already satisfied: lmdb>=1.3.0 in /home/pdxjohnny/.local/lib/python3.10/site-packages (from keri) (1.3.0) +Requirement already satisfied: pysodium>=0.7.12 in /home/pdxjohnny/.local/lib/python3.10/site-packages (from keri) (0.7.12) +Requirement already satisfied: blake3>=0.3.1 in /home/pdxjohnny/.local/lib/python3.10/site-packages (from keri) (0.3.1) +Requirement already satisfied: msgpack>=1.0.4 in /home/pdxjohnny/.local/lib/python3.10/site-packages (from keri) (1.0.4) +Requirement already satisfied: cbor2>=5.4.3 in /home/pdxjohnny/.local/lib/python3.10/site-packages (from keri) (5.4.3) +Collecting multidict>=6.0.2 + Downloading multidict-6.0.2-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (114 kB) + ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 114.5/114.5 kB 4.2 MB/s eta 0:00:00 +Collecting ordered-set>=4.1.0 + Downloading ordered_set-4.1.0-py3-none-any.whl (7.6 kB) +Collecting hio>=0.6.7 + Downloading hio-0.6.7.tar.gz (87 kB) + ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 87.7/87.7 kB 8.3 MB/s eta 0:00:00 + Preparing metadata (setup.py) ... done +Collecting multicommand>=1.0.0 + Downloading multicommand-1.0.0-py3-none-any.whl (5.8 kB) +Collecting jsonschema>=4.6.0 + Downloading jsonschema-4.16.0-py3-none-any.whl (83 kB) + ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 83.1/83.1 kB 7.6 MB/s eta 0:00:00 +Collecting falcon>=3.1.0 + Downloading falcon-3.1.0-cp310-cp310-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl (8.5 MB) + ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 8.5/8.5 MB 52.8 MB/s eta 0:00:00 +Collecting daemonocle>=1.2.3 + Downloading daemonocle-1.2.3.tar.gz (41 kB) + ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 41.4/41.4 kB 6.2 MB/s eta 0:00:00 + Preparing metadata (setup.py) ... 
done +Collecting hjson>=3.0.2 + Downloading hjson-3.1.0-py3-none-any.whl (54 kB) + ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 54.0/54.0 kB 4.0 MB/s eta 0:00:00 +Requirement already satisfied: PyYaml>=6.0 in /usr/lib64/python3.10/site-packages (from keri) (6.0) +Collecting apispec>=5.2.2 + Downloading apispec-6.0.0-py3-none-any.whl (29 kB) +Collecting mnemonic>=0.20 + Downloading mnemonic-0.20-py3-none-any.whl (62 kB) + ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 62.0/62.0 kB 6.4 MB/s eta 0:00:00 +Requirement already satisfied: packaging>=21.3 in /home/pdxjohnny/.local/lib/python3.10/site-packages (from apispec>=5.2.2->keri) (21.3) +Collecting click + Downloading click-8.1.3-py3-none-any.whl (96 kB) + ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 96.6/96.6 kB 11.5 MB/s eta 0:00:00 +Collecting psutil + Downloading psutil-5.9.3-cp310-cp310-manylinux_2_12_x86_64.manylinux2010_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl (292 kB) + ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 292.3/292.3 kB 24.0 MB/s eta 0:00:00 +Requirement already satisfied: netifaces>=0.11.0 in /usr/lib64/python3.10/site-packages (from hio>=0.6.7->keri) (0.11.0) +Requirement already satisfied: attrs>=17.4.0 in /usr/lib/python3.10/site-packages (from jsonschema>=4.6.0->keri) (21.4.0) +Requirement already satisfied: pyrsistent!=0.17.0,!=0.17.1,!=0.17.2,>=0.14.0 in /usr/lib64/python3.10/site-packages (from jsonschema>=4.6.0->keri) (0.18.1) +Requirement already satisfied: pyparsing!=3.0.5,>=2.0.2 in /home/pdxjohnny/.local/lib/python3.10/site-packages (from packaging>=21.3->apispec>=5.2.2->keri) (3.0.9) +Building wheels for collected packages: keri, daemonocle, hio + Building wheel for keri (setup.py) ... 
done + Created wheel for keri: filename=keri-0.6.7-py3-none-any.whl size=371275 sha256=0fc4353cff6f82d93bcbe2023b5fbe34d8f19695b534280b39d6501e34fec6c4 + Stored in directory: /home/pdxjohnny/.cache/pip/wheels/5d/d4/7a/c5394220af3d084c08af13cdfc6c822adade30f969caa3e6be + Building wheel for daemonocle (setup.py) ... done + Created wheel for daemonocle: filename=daemonocle-1.2.3-py3-none-any.whl size=27547 sha256=245fcb13356d1abfade022d8ec1d71df72f6a75613e3a3a021f18c47a18a1895 + Stored in directory: /home/pdxjohnny/.cache/pip/wheels/90/74/0a/e42fc6338ed1604a4b23fb4ebd4c1c7c7ae716f0ecbbe6fb14 + Building wheel for hio (setup.py) ... done + Created wheel for hio: filename=hio-0.6.7-py3-none-any.whl size=97821 sha256=c8ab55b918d13057109de99a475c729fd6b8ef9cc249e01a933ca88156cd357f + Stored in directory: /home/pdxjohnny/.cache/pip/wheels/9f/a0/f7/8696eba689852f5f33237d5e67a5f71a6b084e3df25dc7080d +Successfully built keri daemonocle hio +Installing collected packages: hjson, psutil, ordered-set, multidict, multicommand, mnemonic, jsonschema, falcon, click, hio, daemonocle, apispec, keri +Successfully installed apispec-6.0.0 click-8.1.3 daemonocle-1.2.3 falcon-3.1.0 hio-0.6.7 hjson-3.1.0 jsonschema-4.16.0 keri-0.6.7 mnemonic-0.20 multicommand-1.0.0 multidict-6.0.2 ordered-set-4.1.0 psutil-5.9.3 +``` + +- References + - https://github.com/OR13/didme.me/blob/14da8e47d8a1a4bef3cc1c85968c9f8b6963d269/components/DIDMemeCreator.tsx#L59 + - https://github.com/OR13/didme.me/blob/14da8e47d8a1a4bef3cc1c85968c9f8b6963d269/core/DIDMeme/index.ts + - https://github.com/OR13/didme.me/blob/14da8e47d8a1a4bef3cc1c85968c9f8b6963d269/core/ipfs.ts + - https://github.com/desudesutalk/f5stegojs#cli-tool + - https://github.com/OR13/didme.me/blob/14da8e47d8a1a4bef3cc1c85968c9f8b6963d269/components/DIDMemeCreator.tsx#L42** + - https://github.com/OR13/didme.me/blob/14da8e47d8a1a4bef3cc1c85968c9f8b6963d269/components/DIDMemeCreator.tsx#L157 + - 
https://github.com/OR13/didme.me/blob/14da8e47d8a1a4bef3cc1c85968c9f8b6963d269/components/WalletCreator.tsx#L20-L70 +- TODO + - [ ] Read https://github.com/SmithSamuelM/Papers/blob/master/whitepapers/alice-attempts-abuse-verifiable-credential.pdf + - [ ] 2nd party infra + - [ ] Stream of consciousness + - [ ] GitHub Actions webhook: enable Stream of Consciousness in a repo setting, then it will dispatch workflows via stream of consciousness path logic, with trigger filtering based on `on.push.paths` + - [ ] Could use DID entry points as paths to signal that a workflow should be triggered on that event + - Could get down to operation granularity referenced inside flows for given event streams. + - Example: `paths: ["did:alice:shouldi:contribute:clone_git_repo:outputs.repo"]` + - Through workflow inspection we can expose this as an overlay + - It can be advertised to the stream of consciousness that this workflow should be dispatched, if the overlay is enabled \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0063/index.md b/docs/discussions/alice_engineering_comms/0063/index.md new file mode 100644 index 0000000000..75daa0b9c8 --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0063/index.md @@ -0,0 +1 @@ +# 2022-10-22 Engineering Logs \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0063/reply_0000.md b/docs/discussions/alice_engineering_comms/0063/reply_0000.md new file mode 100644 index 0000000000..d41bc83af6 --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0063/reply_0000.md @@ -0,0 +1,3 @@ +- Developer yellow brick road to critical velocity + - search engineering logs for other refs +- Use automl PRs from Edison to issue contracts for evaluation of hyperparameters as dataflow / operation / manifest instance (DID-based encoding).
automl then auto feature engineering \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0064/index.md b/docs/discussions/alice_engineering_comms/0064/index.md new file mode 100644 index 0000000000..12afcecb73 --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0064/index.md @@ -0,0 +1 @@ +# 2022-10-23 Engineering Logs \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0064/reply_0000.md b/docs/discussions/alice_engineering_comms/0064/reply_0000.md new file mode 100644 index 0000000000..9c01bbe5de --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0064/reply_0000.md @@ -0,0 +1,5 @@ +## 2022-10-23 @pdxjohnny Engineering Logs + +- https://github.com/transmute-industries/did-jwk-pqc + - Orie coincidentally posted he’s working on didme.me v2 which will use post quantum json web keys. + - John to pursue container image registry side of previous idea. \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0065/index.md b/docs/discussions/alice_engineering_comms/0065/index.md new file mode 100644 index 0000000000..853ef6c803 --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0065/index.md @@ -0,0 +1 @@ +# 2022-10-24 Engineering Logs \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0065/reply_0000.md b/docs/discussions/alice_engineering_comms/0065/reply_0000.md new file mode 100644 index 0000000000..837a50d9b7 --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0065/reply_0000.md @@ -0,0 +1,41 @@ +# Rolling Alice: Architecting Alice: An Image + +> Moved to: https://github.com/intel/dffml/blob/alice/docs/tutorials/rolling_alice/0000_architecting_alice/0007_an_image.md + +- In relation to the manifest encoded as a "screenshot as universal API" + - https://twitter.com/mattrickard/status/1577321709350268928 + - https://twitter.com/David3141593/status/1584462389977939968 + - > TIL python's pip will execute a setup .py
directly from a ZIP archive from a web URL, with mime sniffing. This allows for a nice lolbin oneliner, with payload hosted on Twitter's CDN (or anywhere else really) `$ pip install "https://pbs"."twimg"."com/media/Ff0iwcvXEAAQDZ3.png"` (or $ pip install https://t"."co/uPXauf8eTg`) + > ![image](https://user-images.githubusercontent.com/5950433/197549602-f1f98e38-5f34-4d04-b64c-94d49264d189.png) + > ![source_code zip](https://user-images.githubusercontent.com/5950433/197549941-b915f643-4c29-4442-bf88-2a1ad604e877.png) + - Sounds like we finally have ourselves a reliable distribution mechanism! :) + - need parity with text as universal API + - screenshots as operations + - YAML for dataflow + - encourages short functions :P + - Everything effectively a manifest instance, operation plus metadata + - https://satori-syntax-highlighter.vercel.app/ + - https://twitter.com/shuding_/status/1581358324569645056 + - https://satori-syntax-highlighter.vercel.app/api/highlighter?code=let%20alice%20%3D%20new%20Alice()&background=%23E36FB7&lang=js&fontSize=16 + - https://pypi.org/project/svglib/ + - https://github.com/deeplook/svglib/blob/9472e067d88920debfbf6daefed32045025bf039/scripts/svg2pdf#L36-L45 + - https://github.com/deeplook/svglib/blob/9472e067d88920debfbf6daefed32045025bf039/svglib/svglib.py#L1402-L1414 + - https://github.com/deeplook/svglib/blob/9472e067d88920debfbf6daefed32045025bf039/svglib/svglib.py#L1438-L1447 + - It's just a screenshot of code + - You just take a bunch of screenshots and put them together and that's your overlays + - You can always trampoline and use one as a manifest or wrapper to resolution via a next phase storage medium. 
+ - didme.mev2 + - https://github.com/transmute-industries/did-jwk-pqc +- https://twitter.com/amasad/status/1584327997695283200/photo/1 +- We'll proxy the registry off all these images + +```console +$ curl -sfL "https://satori-syntax-highlighter.vercel.app/api/highlighter?code=let%20alice%20%3D%20new%20Alice()&background=%23E36FB7&lang=js&fontSize=16" | +``` + +- Future + - Streaming? Solved! Video streaming APIs :P + - Generate an image of Alice with all her source code packaged + - pip install of image + - Eventually generate videos + - Container registry service endpoint can build container images or manifest images / instances \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0065/reply_0001.md b/docs/discussions/alice_engineering_comms/0065/reply_0001.md new file mode 100644 index 0000000000..0cd0b7d4f4 --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0065/reply_0001.md @@ -0,0 +1,259 @@ +## 2022-10-24 @pdxjohnny Engineering Logs + +- https://medium.com/mlearning-ai/enter-the-world-of-diffusion-models-4485fb5c5986 +- https://github.com/martinthomson/i-d-template +- https://github.com/intel/dffml/blob/alice/docs/tutorials/rolling_alice/0000_architecting_alice/0007_an_image.md + - Future + - Lossy encoded software DNA transmitted via ad-hoc formed webrtc channels with data / component provenance encoded in-band (maybe SCITT receipts). Context aware collective intelligence is then enabled to iterate at high speed within conceptual impact bounds per group agreed policy. 
+ - Or multicast ;P + - ![spaceballs-ludicous-speed](https://user-images.githubusercontent.com/5950433/197626110-69a6f9a3-9e2c-45fa-8ecc-784232c8e868.gif) +- https://twitter.com/pdxjohnny/status/1584657901414928385 + - https://asciinema.org/a/531762 + +[![asciicast](https://asciinema.org/a/531762.svg)](https://asciinema.org/a/531762) + +- https://www.nps.gov/neri/planyourvisit/the-legend-of-john-henry-talcott-wv.htm + - "If I can't beat this steam drill down, I'll die with this hammer in my hand!" [John Henry] + +### Rolling Alice: Architecting Alice: An Image + +- References + - https://github.com/intel/dffml/blob/alice/docs/tutorials/rolling_alice/0000_architecting_alice/0007_an_image.md + - https://github.com/CleasbyCode/pdvzip + - https://github.com/intel/dffml/blob/alice/entities/alice/CONTRIBUTING.rst + - https://satori-syntax-highlighter.vercel.app/api/highlighter?fontSize=4&lang=python&background=%23E36FB7&code=%22%22%22%0AUsage%0A%2A%2A%2A%2A%2A%0A%0A%2A%2ATODO%2A%2A%0A%0A-%20Packaging%0A%0A..%20code-block%3A%3A%20console%0A%0A%20%20%20%20%24%20echo%20Package%20python%20into%20wheel%20given%20entry%20points%20to%20overlay%20dffml.overlays.alice.please.contribute.recommended_community_standards%0A%20%20%20%20%24%20echo%20Embed%20JWK%0A%20%20%20%20%24%20echo%20JWK%20fulcio%20OIDC%3F%0A%20%20%20%20%24%20echo%20upload%20to%20twitter%20or%20somewhere%0A%20%20%20%20%24%20echo%20download%20and%20verify%20using%20JWK%2C%20show%20OIDC%20for%20online%20lookup%0A%20%20%20%20%24%20pip%20install%20package.zip%0A%20%20%20%20%24%20alice%20shouldi%20contribute%20-log%20debug%20-keys%20https%3A%2F%2Fexamples.com%2Frepowith%2Fmyconfigjson%0A%0A%22%22%22%0Aimport%20json%0Aimport%20pathlib%0Afrom%20typing%20import%20NewType%0A%0AMyConfig%20%3D%20NewType%28%22MyConfig%22%2C%20object%29%0AMyConfigUnvalidated%20%3D%20NewType%28%22MyConfigUnvalidated%22%2C%20object%29%0AMyConfigProjectName%20%3D%20NewType%28%22MyConfigProjectName%22%2C%20str%29%0AMyConfigDirectory%20%3D%20NewType%
28%22MyConfigDirectory%22%2C%20str%29%0A%0A%0Adef%20read_my_config_from_directory_if_exists%28%0A%20%20%20%20directory%3A%20MyConfigDirectory%2C%0A%29%20-%3E%20MyConfigUnvalidated%3A%0A%20%20%20%20%22%22%22%0A%20%20%20%20%3E%3E%3E%20import%20json%0A%20%20%20%20%3E%3E%3E%20import%20pathlib%0A%20%20%20%20%3E%3E%3E%20import%20tempfile%0A%20%20%20%20%3E%3E%3E%0A%20%20%20%20%3E%3E%3E%20with%20tempfile.TemporaryDirectory%28%29%20as%20tempdir%3A%0A%20%20%20%20...%20%20%20%20%20_%20%3D%20pathlib.Path%28tempdir%2C%20%22.myconfig.json%22%29.write_text%28json.dumps%28%7B%22name%22%3A%20%22Hello%20World%22%7D%29%29%0A%20%20%20%20...%20%20%20%20%20print%28read_my_config_from_directory_if_exists%28tempdir%29%29%0A%20%20%20%20%7B%27name%27%3A%20%27Hello%20World%27%7D%0A%20%20%20%20%22%22%22%0A%20%20%20%20path%20%3D%20pathlib.Path%28directory%2C%20%22.myconfig.json%22%29%0A%20%20%20%20if%20not%20path.exists%28%29%3A%0A%20%20%20%20%20%20%20%20return%0A%20%20%20%20return%20json.loads%28path.read_text%28%29%29%0A%0A%0Adef%20validate_my_config%28%0A%20%20%20%20config%3A%20MyConfigUnvalidated%2C%0A%29%20-%3E%20MyConfig%3A%0A%20%20%20%20%23%20TODO%28security%29%20json%20schema%20valiation%20of%20myconfig%20%28or%0A%20%20%20%20%23%20make%20done%20automatically%20by%20operation%20manifest%20schema%0A%20%20%20%20%23%20validation%20on%20InputNetwork%2C%20maybe%2C%20just%20one%20option%2C%0A%20%20%20%20%23%20or%20maybe%20similar%20to%20how%20prioritizer%20gets%20applied%2C%0A%20%20%20%20%23%20or%20maybe%20this%20is%20an%20issue%20we%20already%20track%3A%20%231400%29%0A%20%20%20%20return%20config%0A%0A%0Adef%20my_config_project_name%28%0A%20%20%20%20config%3A%20MyConfig%2C%0A%29%20-%3E%20MyConfigProjectName%3A%0A%20%20%20%20%22%22%22%0A%20%20%20%20%3E%3E%3E%20print%28my_config_project_name%28%7B%22name%22%3A%20%22Hello%20World%22%7D%29%29%0A%20%20%20%20Hello%20World%0A%20%20%20%20%22%22%22%0A%20%20%20%20return%20config%5B%22name%22%5D%0A + - `$ python -c 'import sys, urllib.parse; 
sys.stdout.write(urllib.parse.quote(sys.stdin.read(), safe=""))'` + - Orie mentioned "Only twitter web client works for PNGs and they have to be under 900 pixels." + - https://twitter.com/OR13b/status/1584669807827648512?s=20&t=Xec9v05emwSphzT6W0R8PA + - https://github.com/ossf/scorecard/blob/main/options/flags.go + +```console +$ git clone https://github.com/CleasbyCode/pdvzip +$ cd pdvzip/ && g++ pdvzip.cpp -o pdvzip +$ dffml service dev create blank alice-shouldi-contribute-openssf-scorecard +$ cd alice-shouldi-contribute-openssf-scorecard +$ sed -i 's/zip_safe = False/zip_safe = True/' setup.cfg +$ sed -i 's/# entry_points/entry_points/' setup.cfg +$ echo -e '[dffml.overlays.alice.shouldi.contribute]\nOpenSSFScorecard = alice_shouldi_contribute_openssf_scorecard.operations' | tee entry_points.txt +``` + +**alice_shouldi_contribute_openssf_scorecard/operations.py** + +```python +""" +Usage +***** + +**TODO** + +- Packaging + +.. code-block:: console + + $ echo Package python into wheel given entry points to overlay dffml.overlays.alice.please.contribute.recommended_community_standards + $ echo Embed JWK + $ echo JWK fulcio OIDC?
+ $ echo upload to twitter or somewhere + $ echo download and verify using JWK, show OIDC for online lookup + $ pip install package.zip + $ alice shouldi contribute -log debug -keys https://examples.com/repowith/myconfigjson + +""" +import os +import json +import pathlib +import platform +import contextlib +from typing import Dict, NewType + +import dffml +import dffml_feature_git.feature.definitions + + +@dffml.config +class EnsureScorecardConfig: + cache_dir: pathlib.Path = dffml.field( + "Cache directory to store downloads in", + default_factory=lambda: pathlib.Path(os.getcwd()), + ) + platform_urls: Dict[str, Dict[str, str]] = dffml.field( + "Mapping of platform.system() return values to scorecard download URLs with hashes", + default_factory=lambda: { + "Linux": { + "url": "https://github.com/ossf/scorecard/releases/download/v4.8.0/scorecard_4.8.0_linux_amd64.tar.gz", + "expected_hash": "8e90236b3e863447fc98f6131118cd1f509942f985f30ba02825c5d67f2b9999f0ac5aa595bb737ef971788c48cd20c9", + }, + }, + ) + + +OpenSSFScorecardBinaryPath = NewType("OpenSSFScorecardBinaryPath", str) + + +@dffml.op( + config_cls=EnsureScorecardConfig, imp_enter={"stack": contextlib.AsyncExitStack,}, +) +async def ensure_scorecard(self) -> OpenSSFScorecardBinaryPath: + scorecard = await dffml.cached_download_unpack_archive( + **{ + "file_path": self.parent.config.cache_dir.joinpath("scorecard.tar.gz"), + "directory_path": self.parent.config.cache_dir.joinpath("scorecard-download"), + # Use whatever values are appropriate for the system we are on + **self.parent.config.platform_urls[platform.system()], + } + ) + self.parent.stack.enter_context(dffml.prepend_to_path(scorecard)) + binary_path = list(scorecard.glob("scorecard*"))[0].resolve() + return binary_path + + +# TODO https://koxudaxi.github.io/datamodel-code-generator/ from schema +OpenSSFScorecardResults = NewType("OpenSSFScorecardResults", dict) + + +@dffml.op +async def openssf_scorecard( + self, + scorecard_path: 
OpenSSFScorecardBinaryPath, + repo: dffml_feature_git.feature.definitions.git_repository, +) -> OpenSSFScorecardResults: + cmd = [ + scorecard_path, + "--format=json", + f"--local={repo.directory}" + ] + async for event, result in dffml.run_command_events( + cmd, + cwd=repo.directory, + env={ + **os.environ, + }, + events=[dffml.Subprocess.STDOUT], + logger=self.logger, + ): + return json.loads(result.decode()) + +``` + +```console +$ pip install -e . +$ dffml service dev entrypoints list dffml.overlays.alice.shouldi.contribute +OpenSSFScorecard = alice_shouldi_contribute_openssf_scorecard.operations -> alice-shouldi-contribute-openssf-scorecard 0.1.dev1+g614cd2a.d20221025 (/home/coder/.local/lib/python3.9/site-packages) +$ alice -log debug shouldi contribute -keys https://${GH_ACCESS_TOKEN}@github.com/pdxjohnny/httptest +DEBUG:dffml.MemoryOperationImplementationNetworkContext:Instantiating operation implementation alice_shouldi_contribute_openssf_scorecard.operations:ensure_scorecard(alice_shouldi_contribute_openssf_scorecard.operations:ensure_scorecard) with default config: EnsureScorecardConfig(cache_dir=PosixPath('/tmp/tmp.hgZT8hhxqR/didme.me/pdvzip/alice-shouldi-contribute-openssf-scorecard'), platform_urls={'Linux': {'url': 'https://github.com/ossf/scorecard/releases/download/v4.8.0/scorecard_4.8.0_linux_amd64.tar.gz', 'expected_hash': '8e90236b3e863447fc98f6131118cd1f509942f985f30ba02825c5d67f2b9999f0ac5aa595bb737ef971788c48cd20c9'}}) +DEBUG:dffml.AliceShouldiContributeOpenssfScorecardOperations:EnsureScorecardImplementation:EnsureScorecardConfig(cache_dir=PosixPath('/tmp/tmp.hgZT8hhxqR/didme.me/pdvzip/alice-shouldi-contribute-openssf-scorecard'), platform_urls={'Linux': {'url': 'https://github.com/ossf/scorecard/releases/download/v4.8.0/scorecard_4.8.0_linux_amd64.tar.gz', 'expected_hash': '8e90236b3e863447fc98f6131118cd1f509942f985f30ba02825c5d67f2b9999f0ac5aa595bb737ef971788c48cd20c9'}}) +``` + +- It's running the `ensure_scorecard` but not the scan.
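A dataflow only dispatches an operation once every one of its input definitions has a matching `Input` in the network, which is the usual reason a step like the scan sits idle while `ensure_scorecard` runs fine. A toy sketch of that matching rule in plain Python (illustrative names only, not dffml's actual scheduler internals):

```python
# Toy model of dataflow dispatch: an operation becomes runnable only when
# every definition it consumes has been produced somewhere in the network.
# Definition names below are illustrative, not dffml internals.
def runnable(required: set, produced: set) -> bool:
    # Subset check: all required definitions must already exist
    return required <= produced

scan_needs = {"OpenSSFScorecardBinaryPath", "git_repository"}
produced = {"URL", "git_repository"}

print(runnable(scan_needs, produced))        # no scorecard binary path yet
produced.add("OpenSSFScorecardBinaryPath")   # ensure_scorecard yields its output
print(runnable(scan_needs, produced))        # now the scan can dispatch
```

If the definition the scan consumes differs from what the flow actually produces (for example an `AliceGitRepo` versus `git_repository` mismatch), the subset check never passes and the operation never fires.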
+ +```console +$ dffml service dev export alice.cli:ALICE_COLLECTOR_DATAFLOW | tee alice_shouldi_contribute.json +$ dffml dataflow diagram alice_shouldi_contribute.json | tee alice_shouldi_contribute.mmd +``` + +- Found that we are using `dffml_feature_git.feature.definitions` + - Rather than the `AliceGitRepo` we had first tried; we need to update the shouldi code to have Alice specifics. + + +```console +$ alice -log debug shouldi contribute -keys https://${GH_ACCESS_TOKEN}@github.com/pdxjohnny/httptest +Traceback (most recent call last): + File "/src/dffml/dffml/df/memory.py", line 1291, in run_dispatch + outputs = await self.run( + File "/src/dffml/dffml/df/memory.py", line 1256, in run + return await self.run_no_retry(ctx, octx, operation, inputs) + File "/src/dffml/dffml/df/memory.py", line 1233, in run_no_retry + outputs = await opctx.run(inputs) + File "/src/dffml/dffml/df/base.py", line 547, in run + result = await result + File "/tmp/tmp.hgZT8hhxqR/didme.me/pdvzip/alice-shouldi-contribute-openssf-scorecard/alice_shouldi_contribute_openssf_scorecard/operations.py", line 64, in openssf_scorecard + async for event, result in dffml.run_command_events( + File "/src/dffml/dffml/util/subprocess.py", line 83, in run_command_events + raise RuntimeError( +RuntimeError: [PosixPath('/tmp/tmp.hgZT8hhxqR/didme.me/pdvzip/alice-shouldi-contribute-openssf-scorecard/scorecard-download/scorecard-linux-amd64'), '--format=json', '--local=/tmp/dffml-feature-git-ly4u_eds']: Error: check runtime error: Dependency-Update-Tool: internal error: Search: unsupported feature +{"date":"2022-10-25","repo":{"name":"file:///tmp/dffml-feature-git-ly4u_eds","commit":"unknown"},"scorecard":{"version":"v4.8.0","commit":"c40859202d739b31fd060ac5b30d17326cd74275"},"score":6.8,"checks":[{"details":null,"score":10,"reason":"no dangerous workflow patterns
detected","name":"Dangerous-Workflow","documentation":{"url":"https://github.com/ossf/scorecard/blob/c40859202d739b31fd060ac5b30d17326cd74275/docs/checks.md#dangerous-workflow","short":"Determines if the project's GitHub Action workflows avoid dangerous patterns."}},{"details":null,"score":-1,"reason":"internal error: Search: unsupported feature","name":"Dependency-Update-Tool","documentation":{"url":"https://github.com/ossf/scorecard/blob/c40859202d739b31fd060ac5b30d17326cd74275/docs/checks.md#dependency-update-tool","short":"Determines if the project uses a dependency update tool."}},{"details":null,"score":10,"reason":"license file detected","name":"License","documentation":{"url":"https://github.com/ossf/scorecard/blob/c40859202d739b31fd060ac5b30d17326cd74275/docs/checks.md#license","short":"Determines if the project has defined a license."}},{"details":null,"score":9,"reason":"dependency not pinned by hash detected -- score normalized to 9","name":"Pinned-Dependencies","documentation":{"url":"https://github.com/ossf/scorecard/blob/c40859202d739b31fd060ac5b30d17326cd74275/docs/checks.md#pinned-dependencies","short":"Determines if the project has declared and pinned the dependencies of its build process."}},{"details":null,"score":0,"reason":"non read-only tokens detected in GitHub workflows","name":"Token-Permissions","documentation":{"url":"https://github.com/ossf/scorecard/blob/c40859202d739b31fd060ac5b30d17326cd74275/docs/checks.md#token-permissions","short":"Determines if the project's workflows follow the principle of least privilege."}}],"metadata":null} +2022/10/25 00:30:47 error during command execution: check runtime error: Dependency-Update-Tool: internal error: Search: unsupported feature + + +The above exception was the direct cause of the following exception: + +Traceback (most recent call last): + File "/home/coder/.local/bin/alice", line 8, in + sys.exit(AliceCLI.main()) + File "/src/dffml/dffml/util/cli/cmd.py", line 286, in main + result = 
loop.run_until_complete(cls._main(*argv[1:])) + File "/.pyenv/versions/3.9.13/lib/python3.9/asyncio/base_events.py", line 647, in run_until_complete + return future.result() + File "/src/dffml/dffml/util/cli/cmd.py", line 252, in _main + return await cls.cli(*args) + File "/src/dffml/dffml/util/cli/cmd.py", line 238, in cli + return await cmd.do_run() + File "/src/dffml/dffml/util/cli/cmd.py", line 215, in do_run + return [res async for res in self.run()] + File "/src/dffml/dffml/util/cli/cmd.py", line 215, in + return [res async for res in self.run()] + File "/src/dffml/dffml/cli/dataflow.py", line 287, in run + async for record in self.run_dataflow( + File "/src/dffml/dffml/cli/dataflow.py", line 272, in run_dataflow + async for ctx, results in octx.run( + File "/src/dffml/dffml/df/memory.py", line 1713, in run + raise exception + File "/src/dffml/dffml/df/memory.py", line 1881, in run_operations_for_ctx + raise OperationException( +dffml.df.base.OperationException: alice_shouldi_contribute_openssf_scorecard.operations:openssf_scorecard({'scorecard_path': OpenSSFScorecardBinaryPath, 'repo': git_repository}): {'scorecard_path': PosixPath('/tmp/tmp.hgZT8hhxqR/didme.me/pdvzip/alice-shouldi-contribute-openssf-scorecard/scorecard-download/scorecard-linux-amd64'), 'repo': GitRepoSpec(directory='/tmp/dffml-feature-git-ly4u_eds', URL='https://@github.com/pdxjohnny/httptest')} +$ python -c 'import yaml, json,sys; print(yaml.dump(json.loads(sys.stdin.read())))' < error.json +``` + +```yaml +checks: +- details: null + documentation: + short: Determines if the project's GitHub Action workflows avoid dangerous patterns. + url: https://github.com/ossf/scorecard/blob/c40859202d739b31fd060ac5b30d17326cd74275/docs/checks.md#dangerous-workflow + name: Dangerous-Workflow + reason: no dangerous workflow patterns detected + score: 10 +- details: null + documentation: + short: Determines if the project uses a dependency update tool. 
+ url: https://github.com/ossf/scorecard/blob/c40859202d739b31fd060ac5b30d17326cd74275/docs/checks.md#dependency-update-tool + name: Dependency-Update-Tool + reason: 'internal error: Search: unsupported feature' + score: -1 +- details: null + documentation: + short: Determines if the project has defined a license. + url: https://github.com/ossf/scorecard/blob/c40859202d739b31fd060ac5b30d17326cd74275/docs/checks.md#license + name: License + reason: license file detected + score: 10 +- details: null + documentation: + short: Determines if the project has declared and pinned the dependencies of its + build process. + url: https://github.com/ossf/scorecard/blob/c40859202d739b31fd060ac5b30d17326cd74275/docs/checks.md#pinned-dependencies + name: Pinned-Dependencies + reason: dependency not pinned by hash detected -- score normalized to 9 + score: 9 +- details: null + documentation: + short: Determines if the project's workflows follow the principle of least privilege. + url: https://github.com/ossf/scorecard/blob/c40859202d739b31fd060ac5b30d17326cd74275/docs/checks.md#token-permissions + name: Token-Permissions + reason: non read-only tokens detected in GitHub workflows + score: 0 +date: '2022-10-25' +metadata: null +repo: + commit: unknown + name: file:///tmp/dffml-feature-git-ly4u_eds +score: 6.8 +scorecard: + commit: c40859202d739b31fd060ac5b30d17326cd74275 + version: v4.8.0 +``` + +- TODO + - [ ] Portrait screenshots? 
- [ ] Split into two screenshots, one upstream, one overlay + - [ ] Another screenshot serving as their manifest to do both \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0066/index.md b/docs/discussions/alice_engineering_comms/0066/index.md new file mode 100644 index 0000000000..d3d4de565a --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0066/index.md @@ -0,0 +1,135 @@ +# 2022-10-25 Engineering Logs + +- [ ] Cleanup progress report transcripts and post within Architecting Alice as numbered files 0000_ +- [ ] GitHub Container Registry or Digital Ocean space or something as registry with static content? + - https://github.com/MrE-Fog/static-container-registry +- [ ] Stream of Consciousness to trigger downstream rebuilds + - https://github.com/intel/dffml/pull/1420 + - Ensure we show at least one downstream rebuild + - `dffml` + - `dffml[all]` + - Future + - Enable downstream events for builds of different tags / layers + within existing dockerfiles and push them (if intermediate rebuilt). +- [ ] Fix DFFML CI + - https://github.com/intel/dffml/actions/runs/3318045403 + - Not looking good... + - https://github.com/intel/dffml/pull/1420 +- [ ] Fix Alice CI +- [ ] 2ndparty +- [ ] RFCv2 +- [ ] Call for contribution again +- [ ] Alice on chain + - [ ] https://github.com/intel/dffml/discussions/1369#discussioncomment-2683370 + - [ ] Distributed system context store: web3 + manifests + - [ ] Wonderland: The nickname we give the collective mass of thoughts in existence. This is all the data in Alice on chain. + - [ ] https://github.com/intel/dffml/issues/1377 +- [x] Dataflow as class +- [ ] add the dataflow we executed to the chain. On the next execution it should load data from some location via overlay to add this top level system context to the history of executed contexts. And the top level context should be linked both ways to the original external inputs (UCAN?)
- [ ] Cached flows to did chain then to backing storage via default input network as dataflow that does this to did in background. Start with json so they get saved to file. Add identity as input to top level context. Identity could have parent input objects, such as: this is of definition github username, which you could then have an operation that takes github usernames and outputs their SPDXIDs. When that operation SPDXID output is run through the default DID input network, a strategic plan (default overlayed dataflow to the default input network) which does this forking stuff. Could have location for user overlays in .local or something. When a context is thought of or hypothesised or executed it will be in the user context herstory. Users can optionally add overlays to their default flows (kind of like systemd). This could enable a user to overlay: if I'm working within this cwd for this top level system context, run these commands. Alice as shell + - [ ] long term: fork to save to chain on process exit (can we fork or coredump somehow on atexit?) by default. +- [ ] cve bin tool checker from chain +- [ ] https://gitbom.dev/ +- [ ] Fix TODO on watching new contexts in memory orchestrator OR maybe this is fixed via the separate lineage? Probably needs event filtration similar to run_command so by default if not set in kwargs only +- [ ] Operations and their config as inputs + - [ ] Unify typing via parent type / primitive as Input parents + - [ ] Can have operations that filter and only let through Input objects with specific parents or parents in specific order + - [ ] The config dataflow, the startup one, is the same as this new instantiate operations from Input objects. We can add shared config as a bunch of input objects. We have something like flow. ‘config_flow’ maybe, which is where we’ll do initialization. Actually, let's just reuse the main execution. Instantiate operations via an operation that instantiates them.
We can then, for each operation, use our newfound input filtering operations to form appropriate dependency graphs in order of instantiation and usage of config objects (when executing in this top level context). We can then pass config and shared config as input objects to build config classes with references to the same underlying data in memory. This solves shared config #720 + - [ ] Locality + - [ ] Operation name + - [ ] Stub values added as parents to outputs. Structured logs from an operation added as parents to operation outputs +- [ ] Use newfound operations and inputs with stub values +- [ ] Run an overlayed flow with output operations to build C4 models of our dataflow based on parent input analysis. Generate architecture diagrams from it. +- [ ] Unify type system with Python’s type system via newfound input parent chains (#188) +- [ ] prioritizer + - [ ] strategic plans (similar to dataflow as class method output grabbers) + - [ ] gatekeeper +- [ ] Inventory +- [ ] Creation based on datatypes + - [ ] Input to dataclass field mappings + - [ ] Quicker syntax for dataflow definition +- [ ] Have strategic plan models predict what inputs and outputs will exist to reach desired output metrics + - [ ] Alice create threat model of code base + - [ ] strategic plan for threat model completeness + - [ ] keeps suggesting new system contexts, or incentivizing creation of new system contexts by other strategic plans so as to drive up completeness metric + - [ ] New contexts are created by finding different sets of operations connected differently via flow modifications where applicable + - [ ] These new contexts are run through a validity check to ensure all inputs to operations are consumed and all outputs are consumed by strategic plans somewhere. + - [ ] Provide functionality to audit unused output values. + - [ ] Gatekeeper and prioritizer models help decide what gets run and when.
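The validity check and unused-output audit described above can be sketched as set arithmetic over a candidate context's operations; `audit_context` and the toy operations below are illustrative names, not DFFML APIs:

```python
def audit_context(operations, seed_inputs, strategic_plan_inputs):
    """Return (unsatisfied inputs, unused outputs) for a candidate context."""
    produced = {o for op in operations.values() for o in op["outputs"]}
    consumed = {i for op in operations.values() for i in op["inputs"]}
    # Every input must come from a seed input or another operation's output
    unsatisfied = consumed - produced - set(seed_inputs)
    # Every output must be consumed by an operation or a strategic plan
    unused = produced - consumed - set(strategic_plan_inputs)
    return unsatisfied, unused  # both empty -> the context is valid

ops = {
    "clone": {"inputs": ["repo_url"], "outputs": ["repo_dir"]},
    "lines_of_code": {"inputs": ["repo_dir"], "outputs": ["loc"]},
}
print(audit_context(ops, ["repo_url"], ["loc"]))  # (set(), set()) -> valid
```

The `unused` set is exactly what the "audit unused output values" item asks for; a gatekeeper could reject contexts where either set is non-empty.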
+ - [ ] top level system context we are executing in takes an input completeness for an organizationally applied strategic plan. Likely this completeness is a situation where we have a property of an `@config` which maps to a definition with something to do with completeness. + - [ ] Target example around DFFML itself and its development, and other OSS libs + +--- + +system context includes + +- I/O + - Any cached values +- Prioritizer + - Strategic plans + - Some agents will not work with you unless they can run a strategic plan across a system context they are given to execute, to ensure that the system context has active provenance information that tells them to their desired level of assurance (trusted party vouch, attestation as an option) + - We need to log which plans we execute as a part of the prioritizer using structured metrics or as an output of some kind + - Gatekeeper +- Dataflow + +--- + +### Note + +- If you don't make a threat model, your attacker will make it for you. Daisy thinks about making one, but then the rabbit is more interesting and now we're down the hole. Oops, too late, should have made the threat model first. Let's hurry up and make it quickly before we get too deep into Wonderland. +- shouldi, wonder about installing packages. Explain how that increases threat surface. +- write about how we extended shouldi and go into technical details. +- Building markdown docs with mermaid diagrams + +--- + +## Living THREATS.md + +Install Alice: https://github.com/intel/dffml/tree/alice/entities/alice + +Create the `THREATS.md` file + +```console +$ alice threats \ + -inputs \ + models/good.json=ThreatDragonThreatModelPath \ + models/GOOD_THREATS.md=ThreatsMdPath +``` + +We made `auditor_overlay.py`, a data flow which calls the auditor. We +use `sed` to direct the data flow to run on the path to the threat model from +Threat Dragon used as input.
+ +```console +$ dffml service dev export auditor_overlay:AUDITOR_OVERLAY \ + -configloader yaml \ + | sed -e 's/auditor_overlay:audit.inputs.ltm/ThreatDragonThreatModelPath/g' \ + | tee auditor_overlay.yaml +``` + +Generate `GOOD_THREATS.md` with auditing overlay. + +```console +$ alice threats -log debug \ + -overlay auditor_overlay.yaml \ + -inputs \ + models/good.json=ThreatDragonThreatModelPath \ + models/GOOD_THREATS.md=ThreatsMdPath +``` + +Generate `BAD_THREATS.md` with auditing overlay. + +```console +$ alice threats -log debug \ + -overlay auditor_overlay.yaml \ + -inputs \ + models/bad.json=ThreatDragonThreatModelPath \ + models/BAD_THREATS.md=ThreatsMdPath +``` + +Dump out to HTTP to copy to GitHub for rendering. + +```console +$ (echo -e 'HTTP/1.0 200 OK\n' && cat models/GOOD_THREATS.md) | nc -Nlp 9999; +$ (echo -e 'HTTP/1.0 200 OK\n' && cat models/BAD_THREATS.md) | nc -Nlp 9999; +``` \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0066/reply_0000.md b/docs/discussions/alice_engineering_comms/0066/reply_0000.md new file mode 100644 index 0000000000..39fa5d43f8 --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0066/reply_0000.md @@ -0,0 +1,17 @@ +## 2022-10-25 @pdxjohnny Engineering Logs + +- https://twitter.com/hardmaru/status/1584731173426954241 + - > Backprop is just another “hand-engineered” feature + - grep discussion for more details +- Sourced today's team log from https://github.com/intel/dffml/commit/208ac457b378aab86d28775d0f10d0bc25b0a212#diff-986012018712addda9630dba0adf9035e6f8aae84e4410390f99cbc5618c574e +- stream of consciousness enable gitops for entities (agents, humans, etc.)
config for their background listening notification prefs + - Like a robots.txt for should you notify me, same as we are doing with the plugins +- https://github.com/jurgisp/memory-maze + - https://twitter.com/danijarh/status/1584893538180874241 +- Future + - Expand upon [Volume 1: Chapter 1: Down the Dependency Rabbit-Hole Again](https://github.com/intel/dffml/blob/alice/docs/tutorials/rolling_alice/0001_coach_alice/0001_down_the_dependency_rabbit_hole_again.md) to add dynamic analysis, aka tell me what the delta on CI env is. +- Misc people to circle back with + - John Whiteman was planning on writing collectors and analyzing AST + - Michael could help us generate PDFs from Sphinx sites +- https://twitter.com/OR13b/status/1584975480889147392 + - Need to dig into this and why entityType got the banhammer \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0066/reply_0001.md b/docs/discussions/alice_engineering_comms/0066/reply_0001.md new file mode 100644 index 0000000000..de0790c48e --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0066/reply_0001.md @@ -0,0 +1,24 @@ +## 2022-10-25 Alice Initiative welcome aboard! + +- Harsh joining us to do some Python package analysis work +- Alice thread: https://github.com/intel/dffml/discussions/1406?sort=new +- This work feeds into the following tutorial + - https://github.com/intel/dffml/blob/alice/docs/tutorials/rolling_alice/0001_coach_alice/0001_down_the_dependency_rabbit_hole_again.md +- [shouldi: deptree: Create dependency tree of project · Issue #596 · intel/dffml](https://github.com/intel/dffml/issues/596) + - https://github.com/intel/dffml/commits/shouldi_dep_tree + - > The idea behind the work that was done so far in the above branch was to produce the full dependency tree for a given python package.
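The dependency-tree idea quoted above can be prototyped with just the standard library; `build_dep_tree` is an invented name for illustration, not the shouldi API:

```python
import re
from importlib.metadata import PackageNotFoundError, requires

def build_dep_tree(package, seen=None):
    """Recursively resolve declared dependencies of an installed package."""
    seen = set() if seen is None else seen
    if package in seen:
        return {}  # already visited; guard against dependency cycles
    seen.add(package)
    try:
        reqs = requires(package) or []
    except PackageNotFoundError:
        return {}  # not installed locally; treat as a leaf
    tree = {}
    for req in reqs:
        if "extra ==" in req:
            continue  # skip optional extras
        # The name is everything before a version specifier or marker
        name = re.split(r"[\s;<>=!~\[(]", req, maxsplit=1)[0]
        tree[name] = build_dep_tree(name, seen)
    return tree

print(build_dep_tree("pip"))
```

This only sees packages installed in the current environment; a full shouldi-style resolver would also query PyPI metadata for packages that are not installed.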
+- Documentation writing process + - https://github.com/intel/dffml/blob/alice/docs/tutorials/rolling_alice/0000_architecting_alice/0004_writing_the_wave.md#vision +- Contributing Documentation + - https://github.com/intel/dffml/blob/alice/entities/alice/CONTRIBUTING.rst +- Troubleshooting — DFFML fd401e426 documentation + - https://intel.github.io/dffml/main/troubleshooting.html#entrypointnotfound +- Next steps + - Harsh will first focus on filling out the other two functions with unit tests for different file contents + - These functions / files can be standalone at first, we can integrate later. + - https://github.com/intel/dffml/blob/alice/entities/alice/CONTRIBUTING.rst#writing-an-overlay + - Harsh to ping John as needed. + - Harsh to comment in issue with commands run and errors and so forth so we can copy paste into the associated tutorial later. + - Plans for automation of documentation writing: https://github.com/intel/dffml/commit/74781303fae19b03326878d184a49ac93543749c?short_path=76e9bfe#diff-76e9bfe1c05d4426559fada22595ca1f9a76fd0fc98609dfbbde353d10fa77db + +https://github.com/intel/dffml/blob/0a2e053f5f8e361054f329a3f763982fb1e4d1f7/examples/shouldi/tests/test_dep_tree.py#L36-L71 \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0067/index.md b/docs/discussions/alice_engineering_comms/0067/index.md new file mode 100644 index 0000000000..91a5031241 --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0067/index.md @@ -0,0 +1,9 @@ +# 2022-10-26 Engineering Logs + +- https://en.m.wikipedia.org/wiki/Knowledge_argument + - `alias Alice=Mary` + - grep + - fourth eye 👁️ + - Scientific process + +TODO Alice gif for black and white to color (the acquisition of the fourth eye, when she steps through the looking glass) diff --git a/docs/discussions/alice_engineering_comms/0067/reply_0000.md b/docs/discussions/alice_engineering_comms/0067/reply_0000.md new file mode 100644 index 0000000000..8166345ddf --- /dev/null +++
b/docs/discussions/alice_engineering_comms/0067/reply_0000.md @@ -0,0 +1,28 @@ +## 2022-10-26 @sedihglow Engineering Logs + +- https://github.com/sedihglow/red_black_tree +- https://gist.github.com/sedihglow/770ed4e472935c5ab302d069b64280a8 + - How Python's builtin `sorted()` works + - https://docs.python.org/3/library/functions.html#sorted +- References + - http://www.microhowto.info/howto/convert_from_html_to_formatted_plain_text.html + - `$ lynx -dump -display_charset UTF-8 "https://docs.docker.com/engine/install/ubuntu/"` + - https://unix.stackexchange.com/questions/336253/how-to-find-gnome-terminal-currently-used-profile-with-cmd-line + - `--save-config` has been removed +- Docker + - https://github.com/pdxjohnny/dockerfiles/blob/406f0b94838f7dcd1792c394061a2ee18c4f7487/sshd/Dockerfile +- https://github.com/intel/dffml/blob/alice/entities/alice/CONTRIBUTING.rst#cloning-the-repo +- Vim + - Exit insert mode `Ctrl-[` + +```console +$ git clone -b alice https://github.com/intel/dffml +$ cd dffml/entities/alice +$ python -m pip install \ + -e .[dev] \ + -e ../../ \ + -e ../../examples/shouldi/ \ + -e ../../feature/git/ \ + -e ../../operations/innersource/ \ + -e ../../configloader/yaml/ +``` \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0067/reply_0001.md b/docs/discussions/alice_engineering_comms/0067/reply_0001.md new file mode 100644 index 0000000000..efb7b8d2db --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0067/reply_0001.md @@ -0,0 +1,16 @@ +## 2022-10-26 @pdxjohnny Engineering Logs + +- https://github.com/intel/dffml/pull/1420 +- https://en.m.wikipedia.org/wiki/Knowledge_graph +- https://github.com/peacekeeper/uni-resolver-driver-did-example +- https://medium.com/transmute-techtalk/neo4j-graph-data-science-with-verifiable-credential-data-98b806f2ad78 +- with regards to thought arbitrage + - Decentralised Finance and Automated Market Making: Execution and Speculation + - 
https://papers.ssrn.com/sol3/papers.cfm?abstract_id=4144743 +- TPM + - https://0pointer.de/blog/brave-new-trusted-boot-world.html +- AutoML + - https://github.com/automl/TabPFN +- Updated Alice in CLI help, OS DecentrAlice sshd_banner, Google Drive AliceisHere, and here in this thread below. + +![alice-looking-up-no-shadow](https://user-images.githubusercontent.com/5950433/198141595-f7db1356-5446-49df-a0d7-731010fe1326.png) \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0068/index.md b/docs/discussions/alice_engineering_comms/0068/index.md new file mode 100644 index 0000000000..cf016e6862 --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0068/index.md @@ -0,0 +1,38 @@ +# 2022-10-27 Engineering Logs + +> Source: https://pdxjohnny.github.io/terminal-quickstart/ + +[![terminal-quickstart](https://github.com/pdxjohnny/pdxjohnny.github.io/raw/dev/static/images/terminal-quickstart.gif)](https://pdxjohnny.github.io/terminal-quickstart/) + +- So-called "effective altruism movement" is not aligned + - What you are now is what you are becoming. + - Same goes for the collective. +- Example threat model scenario + - Imagine a software security researcher named Alice. + - Alice wants to publicize her scientific research so + as to engage in discourse in the community and further + the [state of the art](https://en.wikipedia.org/wiki/State_of_the_art). + - Why she decided to further the state of the art in field X + is out of scope for this scenario. It would have been + defined by reward mechanisms and the top level system + context's gatekeeper and prioritizer. Alice may in this situation also be a tenant attempting to escape the sandbox of her top level system context’s multi tenant environment, she (sum of parts, inputs within context) herself a context. + - Alice searches for communities to engage with: forums, + chats, activity, any signs of life in the conceptual field + (the train of thought).
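One way Alice might vet the communities her search returns is as a trust-graph traversal: compound edge trust weights along paths from entities she already trusts, and only engage above a threshold. All entity names and weights below are made up for illustration:

```python
# Hedged sketch: decide whether to engage with an entity by traversing a
# trust graph from entities we already trust. Edge weights (0..1) compound
# multiplicatively along a path; keep the best path score per entity.
def best_trust(graph, root, target):
    best = {root: 1.0}
    frontier = [root]
    while frontier:
        node = frontier.pop()
        for neighbor, weight in graph.get(node, {}).items():
            score = best[node] * weight
            if score > best.get(neighbor, 0.0):  # strict improvement ends cycles
                best[neighbor] = score
                frontier.append(neighbor)
    return best.get(target, 0.0)

graph = {
    "alice": {"scitt-community": 0.9, "unknown-forum": 0.2},
    "scitt-community": {"research-org": 0.8},
}
print(best_trust(graph, "alice", "research-org"))   # 0.9 * 0.8, about 0.72
print(best_trust(graph, "alice", "attacker-forum"))  # 0.0 -> do not engage
```

In a SCITT-style deployment the edge weights would come from receipts and attestations rather than hand-assigned numbers.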
+ - Alice's query yields a malicious attacker-controlled community. + - Acceleration in this community's train of thought is + measured to be outside of acceptable impact bounds to her values + / ethics / strategic principles and plans. She determines this by + predicting future state. + - How does Alice know that she should avoid working with + unaligned entities? How did she determine it was detrimental + to her strategic principles when viewed from lifecycle scope? + - Traversal of trust graphs! + - [2022-10-27 IETF SCITT Technical Meeting Notes](https://github.com/intel/dffml/discussions/1406?sort=new#discussioncomment-3983087) + - https://github.com/intel/dffml/issues/1315 + - > Just think about it like 🐢 turtling in an RTS game or like being zen. You just don’t engage, you don't care, you’re focused with your allies in your ad hoc formed groups + - open source community cross talk / innersource: example set CNCF projects are aligned trees from similar roots. + - you look at other parts of your lifecycle to see how you can position yourself within the multi dimensional strategic field landscape which your top level strategic principles apply to within a context + - Wardley maps +- TODO + - [ ] analysis of kubernetes community handling of aligned events and community response to unaligned actors \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0068/reply_0000.md b/docs/discussions/alice_engineering_comms/0068/reply_0000.md new file mode 100644 index 0000000000..58943d6220 --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0068/reply_0000.md @@ -0,0 +1,18 @@ +## 2022-10-27 @pdxjohnny Engineering Logs + +- Version Control Systems + - https://github.com/facebookexperimental/eden + - https://www.youtube.com/watch?v=bx_LGilOuE4&feature=youtu.be + - https://twitter.com/bernhardsson/status/1585652692701036544 +- Well I'll be, I forgot I already wrote a terminal quickstart doc until I accidentally opened the attach file dialog
and saw this gif I'd been meaning to add here. + - There is some stuff in this thread about teaching Alice to use the shell. + - consoletest commands to graph neural network Markov chains? + - https://github.com/pdxjohnny/consoletest + - https://github.com/intel/dffml/blob/alice/docs/tutorials/rolling_alice/0000_architecting_alice/0003_a_shell_for_a_ghost.md + - Maybe we do this as a future tutorial to Architecting Alice: A Shell for A Ghost +- https://threadreaderapp.com/thread/1584623497284026368 + - https://indieweb.org/Micropub + - https://wordpress.org/plugins/indieauth/ + - https://indieweb.org/Micropub/Servers +- TODO + - [ ] DID resolver / proxy for https://github.com/facebookexperimental/eden \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0068/reply_0001.md b/docs/discussions/alice_engineering_comms/0068/reply_0001.md new file mode 100644 index 0000000000..42ad58efca --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0068/reply_0001.md @@ -0,0 +1,60 @@ +## 2022-10-27 IETF SCITT Technical + +- https://datatracker.ietf.org/wg/scitt/about/ +- https://github.com/ietf-scitt/scitt-web/blob/065ae3bf467e236d18774d954b5784d97c43ec17/_posts/2022-10-25-distributing-artifacts.md +- Zulip and Slack exist for IETF + - Comply with appropriate legal guidance + - Have fun creating channels and chatting otherwise + - Do not assume privacy, this is a hosted service.
+ - https://xkcd.com/1810/ + +![XKCD 1810: Chat Systems](https://user-images.githubusercontent.com/5950433/198354823-60c51c09-9644-4d1f-a434-9a474b2f5095.png) + +- Supply chain as a network of information that travels across an ecosystem + - Decentralization is natural in supply chains +- https://datatracker.ietf.org/meeting/upcoming + - See below +- Example flow / bare bones model + - When we need the software artifact it's available, it didn't change + - Need better tooling to keep copies in sync + - SCITT will be one of them + - Archiving + - Deployment logs + - Auditing for mitigation and upgrades +- How do we make sure that we never move the cheese on customers and they can roll forward and continue to take advantage of advancements in the future +- https://github.com/ietf-scitt/use-cases/blob/main/scitt-components.md + - More detailed view + - We can fill this out +- ACME Rockets + - Wabbit Networks from example can make internal information public easily + - They might have one SCITT instance that delivers + - They might have one SCITT instance that delivers provenance information to customers about released artifacts +- Each endpoint example: roy.azurecr.io + - Container Registry with signing aligned (azurecr means Azure Container Registry) + - Network boundaries complicate permission models +- We need to iron out / document how to do transparent / clean replication + - petnames spec +- Orie: How much detail is in the graph is trust... + - John (unsaid, for the notes only): Trust is for sure not binary, but within a given context that value for the green in the trust graph might become infinitely close to 1. +- Every entity that runs a SCITT instance will have a choice of who they trust +- We want to try to give you a simple solution that + +--- + +DRAFT SCITT Agenda, IETF 115, London, UK +Thursday, 10 November 2022 +09:30 - 11:30 Thursday Session I + +1. Welcome, Agenda Bashing (Chairs, 5 min) + +2.
Architecture (TBD, 20 min) +draft-birkholz-scitt-architecture-02 + +3. Software Supply Chain Use Cases for SCITT (TBD, 30 min) +draft-birkholz-scitt-software-use-cases-00 + +4. Hackathon Report (TBD, 30 min) + +5. SCITT Receipt Report from COSE (TBD, 20 min) + +6. AOB (Open Mic) & Next Steps (Chairs, 15 min) \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0069/index.md b/docs/discussions/alice_engineering_comms/0069/index.md new file mode 100644 index 0000000000..c4e15681d4 --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0069/index.md @@ -0,0 +1 @@ +# 2022-10-28 Engineering Logs \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0069/reply_0000.md b/docs/discussions/alice_engineering_comms/0069/reply_0000.md new file mode 100644 index 0000000000..0b7f2b6b40 --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0069/reply_0000.md @@ -0,0 +1,15 @@ +- https://twitter.com/0x_philbert/status/1585805986048233472?s=20&t=EQzvXUz0Kz3T-IwKQm2e2Q +- Sequence for mental model docs + - Alice as Ghost in brain + - We pick Her out of our head with two fingers + - we ask her + - whoooo + - Are + - Youuuu? + - she helps us look in now that she's out + - We write it all down + - here is where we define the multi context parallel conscious state mental model and map that to the dataflow description + - This is probably also where the draft example sequence (downloader.py) original improve dataflow docs code should go.
+ - https://github.com/intel/dffml/issues/1279#issuecomment-1025267749 + - We give her stack of software pancakes that say EAT me + - She grows to our size \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0070/index.md b/docs/discussions/alice_engineering_comms/0070/index.md new file mode 100644 index 0000000000..cd150502a9 --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0070/index.md @@ -0,0 +1 @@ +# 2022-10-29 Engineering Logs \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0070/reply_0000.md b/docs/discussions/alice_engineering_comms/0070/reply_0000.md new file mode 100644 index 0000000000..7bc365ba71 --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0070/reply_0000.md @@ -0,0 +1,12 @@ +- https://twitter.com/kelseyhightower/status/1586005703184945152?s=20&t=k6TbZZWA9-0eSSQRO9o10Q +- https://www.princeton.edu/~wbialek/rome/refs/kelly_56.pdf + - Vol 3 + - > If the input symbols to a communication channel represent the outcomes of a chance event on which bets are available at odds consistent with their probabilities (i.e., “fair” odds), a gambler can use the knowledge given him by the received symbols to cause his money to grow exponentially. The maximum exponential rate of growth of the gambler’s capital is equal to the rate of transmission of information over the channel. This result is generalized to include the case of arbitrary odds. + > + > Thus we find a situation in which the transmission rate is significant even though no coding is contemplated. Previously this quantity was given significance only by a theorem of Shannon’s which asserted that, with suitable encoding, binary digits could be transmitted over the channel at this rate with an arbitrarily small probability of error. + +dump some offline notes from months ago: + +G 11:6, 3:22 + +We are beginning to accelerate in time as knowledge travels faster. 
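The Kelly (1956) result quoted above can be checked numerically: for a binary tip channel with accuracy p at even odds, the log-optimal bet fraction is f = 2p − 1, and capital grows at the channel's information rate, 1 + p·log₂p + (1−p)·log₂(1−p) bits per bet. A small simulation under those assumptions:

```python
import math
import random

def kelly_growth(p, bets=200_000, seed=0):
    """Simulate Kelly betting at even odds; return realized bits per bet."""
    rng = random.Random(seed)
    f = 2 * p - 1  # Kelly fraction for even odds
    log_capital = 0.0
    for _ in range(bets):
        win = rng.random() < p
        log_capital += math.log2(1 + f) if win else math.log2(1 - f)
    return log_capital / bets

p = 0.6
# Channel information rate: 1 - H(p) in bits per symbol
rate = 1 + p * math.log2(p) + (1 - p) * math.log2(1 - p)
print(round(kelly_growth(p), 3), round(rate, 3))  # both come out near 0.029
```

The simulated growth rate converges to the transmission rate, which is the exponential-growth equivalence the quote describes.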
As learning happens faster, and action is taken on those learnings faster due to agent parallelization, trains of thought executed overlap as aligned. The more system contexts plus state of consciousness (feature data plus overlayed strategic plans) we have, the faster time goes relatively in that thread (much like in the animated Hercules, the threads of time, the more twine in the thread the more thread passes through the eye of a needle. The higher the throughput in that thread of time. Since we think in parallel and conceptually but we are only visualizing system contexts plus state of human understood state of consciousness combined as a thread right now, the thread of time the witch holds. That thread represents one person's life. If you look at a person's life as a string which is ever growing so long as they are alive. Say the number of pieces of twine in that string were equal parts divisible by every state of human consciousness we understand they were ever in, so if we did a subset of every state of consciousness we understand as humans, this subset being if they were in deep sleep for 1/4 of their lives, in restless sleep for 1/4, in high alert state for 1/4, and in regular alertness for 1/4. Then we’d see four twines making up the string. If you visualize those as actions, good deeds, bad deeds, then you can classify everything into pieces of twine for either good or bad path and you can see how fast a set of system contexts is progressing in the right or wrong direction.
The goal is to progress in the right direction as fast as possible \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0071/index.md b/docs/discussions/alice_engineering_comms/0071/index.md new file mode 100644 index 0000000000..6412d9d4ca --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0071/index.md @@ -0,0 +1 @@ +# 2022-10-30 Engineering Logs \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0072/index.md b/docs/discussions/alice_engineering_comms/0072/index.md new file mode 100644 index 0000000000..4ff25e65e8 --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0072/index.md @@ -0,0 +1 @@ +# 2022-10-31 Engineering Logs \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0072/reply_0000.md b/docs/discussions/alice_engineering_comms/0072/reply_0000.md new file mode 100644 index 0000000000..7d1bcb60a8 --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0072/reply_0000.md @@ -0,0 +1,8 @@ +- https://trendoceans.com/atuin-linux/ +- https://docs.google.com/document/d/1xfU_s1Eu51z_WGg5VYBsQtjsKcrV6_TvFXj2WxBcj90/edit +- https://socialhub.activitypub.rocks/pub/guide-for-new-activitypub-implementers +- https://docs.microblog.pub/ +- https://raw.githubusercontent.com/rjb4standards/REA-Products/master/jsonvrf.json +- https://github.com/OR13/endor +- https://github.com/w3c/vc-data-model +- https://github.com/bluesky-social/atproto \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0073/index.md b/docs/discussions/alice_engineering_comms/0073/index.md new file mode 100644 index 0000000000..18f8c9cbbe --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0073/index.md @@ -0,0 +1 @@ +# 2022-11-01 Engineering Logs \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0073/reply_0000.md b/docs/discussions/alice_engineering_comms/0073/reply_0000.md new file mode 100644 index 
0000000000..9619145257 --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0073/reply_0000.md @@ -0,0 +1,32 @@ +## 2022-11-01 @pdxjohnny Engineering Logs + +- https://github.com/w3c/cogai/pull/47 + - A [call for contribution](https://www.youtube.com/watch?v=THKMfJpPt8I&list=PLtzAOVTpO2jYt71umwc-ze6OmwwCIMnLw) from the [DFFML Community](https://github.com/intel/dffml/discussions/1406?sort=new) to collaboratively [plan](https://www.youtube.com/watch?v=UIT5Bl3sepk&list=PLtzAOVTpO2jYt71umwc-ze6OmwwCIMnLw) and thereby [manifest](https://github.com/intel/dffml/blob/alice/docs/arch/0008-Manifest.md) a description of any system architecture or process flow via the [Open Architecture](https://github.com/intel/dffml/blob/alice/docs/arch/0009-Open-Architecture.rst) methodology, as well as a reference entity, [Alice](https://github.com/intel/dffml/tree/alice/entities/alice/). Their work has a [supply chain security (train of thought security) focus](https://github.com/intel/dffml/tree/alice/docs/tutorials/rolling_alice/0000_architecting_alice). +- https://en.m.wikipedia.org/wiki/L-system + - DNA permutations + - dependency trees + - Operation valid input parameter sets from cache / seed state + - propositional logic 🕊️ +- https://github.com/w3c/cogai#cognitive-architecture + - https://github.com/w3c/cogai/blob/master/Contributing.md + - **ALIGNED** + - https://en.wikipedia.org/wiki/ACT-R + - http://act-r.psy.cmu.edu/peoplepages/ja/ja-interests.html + - **ALIGNED** (huh-Huh!)
+ - http://act-r.psy.cmu.edu/software/ + - We can take a look at this for reuse within our InnerSource series + - https://github.com/w3c/cogai/blob/master/minimalist.md + - Very similar to our recent research on graphql-ld + - https://github.com/w3c/cogai/blob/master/faq.md#how-do-chunks-relate-to-rdf-and-property-graphs +- https://github.com/ossf/scorecard#installation +- https://github.com/guacsec/guac/blob/main/SETUP.md +- https://github.com/rqlite/rqlite/blob/master/DOC/RESTORE_FROM_SQLITE.md +- https://github.com/marionebl/svg-term-cli +- Embrace Chaos + - Know Chaos + - Roll with Chaos + +[![EDAC21EB-8311-4E0F-BA9A-D53013109C67](https://user-images.githubusercontent.com/5950433/199291178-7e89705d-f662-44cd-aa3e-e1a24eb61256.jpeg)](https://en.wikipedia.org/wiki/Sophia_(Gnosticism)) + +- TODO + - [ ] Circle back with Melvin \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0074/index.md b/docs/discussions/alice_engineering_comms/0074/index.md new file mode 100644 index 0000000000..ad1a8fe568 --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0074/index.md @@ -0,0 +1 @@ +# 2022-11-02 Engineering Logs \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0074/reply_0000.md b/docs/discussions/alice_engineering_comms/0074/reply_0000.md new file mode 100644 index 0000000000..1b271a5244 --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0074/reply_0000.md @@ -0,0 +1,51 @@ +## 2022-11-02 @pdxjohnny Engineering Logs + +- Vol 3: Train of Thought Graffiti + - Making data show up on/in other data traveling over target controlled infra +- https://scitt.io/distributing-with-oci-registries.html + - https://datatracker.ietf.org/wg/scitt/about/ + - https://oras.land/ + - https://mailarchive.ietf.org/arch/msg/scitt/bOPu8GoZyGWusOOHSFsQq47Xj4Y/ + - See below todos on service endpoint +- https://github.com/w3c/cogai/pull/47 +- https://www.w3.org/People/Raggett/ + - > My current focus is on how to 
build **AI systems that mimic human reasoning** inspired by decades of advances in the cognitive sciences, and hundreds of millions of years of evolution of the brain. This is a major paradigm shift compared to the Semantic Web which is steeped in the Aristotelian tradition of mathematical logic and formal semantics. This will enable the **Sentient Web** as the combination of sensing, actuation and cognition federated across the Web in support of markets of services based upon open standards. + - **ALIGNED** + - https://github.com/intel/dffml/tree/alice/docs/tutorials/rolling_alice/0000_architecting_alice + - > The [W3C Cognitive AI Community Group](https://www.w3.org/community/cogai/) is seeking to incubate ideas that combine symbolic information (graphs) with sub-symbolic information (statistics), rules and high performance graph algorithms. This combination enables machine learning and reasoning in the presence of uncertainty, incompleteness and inconsistencies. The starting point has been the development of the [chunks and rules format](https://github.com/w3c/cogai/blob/master/chunks-and-rules.md) as an amalgam of RDF and Property Graphs. A [series of demos](https://github.com/w3c/cogai/blob/master/demos/README.md) are being developed to explore different aspects, using an open source JavaScript library. + - **ALIGNED** + - https://github.com/intel/dffml/blob/alice/docs/arch/0009-Open-Architecture.rst +- https://www.w3.org/2002/mmi/ +- https://www.w3.org/WAI/APA/ +- https://web.archive.org/web/20200926173320/http://webinos.org/2011/06/09/webinos-whitepaper/ + - > webinos is: a collective project to make the web work for applications. 
webinos has a vision to build a multi-device, applications platform based on web technology that: – allows web apps to run seamlessly across multiple devices and to use resources across devices – allows web applications to communicate with other web applications and (non web components) over multiple devices – links the application experience with the social network – achieves all of the above in a security preserving manner – explicitly targets the four distinct “screens”: the mobile, the PC, the in-car (automotive) and the home media (TV) devices. The intent in webinos is to translate the success of the web as a distributed document publishing system into a successful, distributed applications platform. The webinos platform should be built upon and move forward the required open standards. This platform should have a concrete implementation that is accessible to all as an open source asset. Technically, all of this should be achieved reusing the core development technologies that have already proven themselves on the Web (HTML and JavaScript), affording the benefits of speed of development and access to a large developer talent pool. The innovation webinos brings shall not just be technical; by embracing an open web culture, we hope to create an application framework that does not favour any particular corporation, and on which many parties can collaborate, and from which many companies benefit. + - https://github.com/intel/dffml/blob/3530ee0d20d1062605f82d1f5055f455f8c2c68f/docs/about.rst#philosophy +- https://en.wikipedia.org/wiki/Cognitive_tutor +- https://en.wikipedia.org/wiki/Intelligent_tutoring_system +- TODO + - [ ] Vol 4: Programming as checkers, line up the research so that you can get farther in one turn + - [ ] Time bounded search for research and time to hop (implementation) + - [ ] Demo metric scan with SCITT receipt used to auth upload results to HTTP server (stream of consciousness / webhook server).
Root trust in OIDC token similar to fulcio/sigstore github actions slsa demo. + - Future + - [ ] Demo demo to OpenSSF Metrics WG for collaboration on DB + - [ ] Do this for each `Input` + - [ ] Instead of HTTP server the context addressable registry + - [ ] Link via DWNs + - [ ] Hardware rooted keys + - [ ] Kinit above together with a `I/L/R/OP/OPIMPNetwork`s for distributed compute + - [ ] Trust anchors of other than self support + - [ ] Caching + + +--- + +- We hope that this work will aid in a heightening of train of thought security posture. +- Our objective is to increase aggregate train of thought security posture. +- Our objective is to increase the aggregate train of thought security +- Supply chain security posture +- The aggregate security of the software supply chain +- The security of the aggregate software supply chain +- The security of the software supply chain in the aggregate +- Heightening of the security of the collective train of thought. +- Heightening of state of art in train of thought security posture. +- We want to secure our thought processes \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0074/reply_0001.md b/docs/discussions/alice_engineering_comms/0074/reply_0001.md new file mode 100644 index 0000000000..f67bdf7bd6 --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0074/reply_0001.md @@ -0,0 +1,41 @@ +## 2022-11-02 Harsh/John + +- https://github.com/intel/dffml/issues/596#issuecomment-1301191994 +- Installed VS Code build tools and used the developer prompt from there and it worked +- Remembered pipdeptree exists +- We should use https://github.com/tox-dev/pipdeptree and integrate that into shouldi. + +``` + -j, --json Display dependency tree as json. This will yield "raw" + output that may be used by external tools. This option + overrides all other options. 
+``` + +- https://intel.github.io/dffml/main/examples/shouldi.html +- https://intel.github.io/dffml/main/contributing/dev_env.html + +```console +$ git clone https://github.com/intel/dffml +$ cd dffml +$ python -m venv .venv +$ git checkout -b deptree +$ . .venv/Scripts/activate +$ pip install -e .[dev] +$ cd examples/shouldi +$ pip install -e .[dev] +``` + +- https://intel.github.io/dffml/main/api/util/packaging.html#dffml.util.packaging.mkvenv +- https://github.com/tox-dev/pipdeptree#running-in-virtualenvs + +https://github.com/intel/dffml/blob/b892cfab9bd152c47a709e8708491c95b8c3ec8e/tests/docs/test_consoletest.py#L14 + +- Basic testcase will be to analyze shouldi itself + +https://github.com/intel/dffml/blob/3530ee0d20d1062605f82d1f5055f455f8c2c68f/dffml/util/testing/consoletest/commands.py#L83-L190 + +- Opens + - Pip not installing to virtualenv we created (using different Python despite our current efforts) +- TODO + - [ ] Harsh to investigate refactoring `ActivateVirtualEnvCommand` into something that doesn't mess with `os.environ` and behaves more like `mkvenv()` (https://github.com/intel/dffml/tree/main/dffml/util/testing/consoletest/) + - [ ] Explicitly use path returned from venv creation as zeroith argument to `dffml.run_command()/subprocess.check_call()` \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0075/index.md b/docs/discussions/alice_engineering_comms/0075/index.md new file mode 100644 index 0000000000..7539a98fce --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0075/index.md @@ -0,0 +1 @@ +# 2022-11-03 Engineering Logs \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0075/reply_0000.md b/docs/discussions/alice_engineering_comms/0075/reply_0000.md new file mode 100644 index 0000000000..5bf7f64be7 --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0075/reply_0000.md @@ -0,0 +1,208 @@ +## 2022-11-03 @pdxjohnny Engineering Logs + +- 
https://identity.foundation/presentation-exchange/spec/v2.0.0/ +- https://github.com/geyang/plan2vec +- http://tkipf.github.io/ + - https://github.com/tkipf/gae + - Graph Auto Encoders + - https://github.com/tkipf/c-swm + - > Contrastive Learning of Structured World Models + > Abstract: A structured understanding of our world in terms of objects, relations, and hierarchies is an important component of human cognition. Learning such a structured world model from raw sensory data remains a challenge. As a step towards this goal, we introduce Contrastively-trained Structured World Models (C-SWMs). C-SWMs utilize a contrastive approach for representation learning in environments with compositional structure. We structure each state embedding as a set of object representations and their relations, modeled by a graph neural network. This allows objects to be discovered from raw pixel observations without direct supervision as part of the learning process. We evaluate C-SWMs on compositional environments involving multiple interacting objects that can be manipulated independently by an agent, simple Atari games, and a multi-object physics simulation. Our experiments demonstrate that C-SWMs can overcome limitations of models based on pixel reconstruction and outperform typical representatives of this model class in highly structured environments, while learning interpretable object-based representations. +- https://filebase.com/blog/5-ipfs-use-cases-you-havent-thought-of-yet/ (or maybe they're exactly what we've thought of ;) + - > 1. Distributed Package Management + > Package managers, like NPM, are typically stored and managed in a centralized manner. By hosting software packages on IPFS, they can be stored in a distributed manner that is publicly available. Any changes to the package’s versions, like a bug fix, will be reflected by a new CID value, allowing for verification of updates and tracking package development. + > + > 2. 
Hosting Software Containers + > Software containers, like Docker containers, are available through registries like the Docker registry. This is similar to pulling a package from NPM, but for software containers rather than packages. By using IPFS to host your own registry, there isn’t any domain hosting configuration, DNS management, or user permission management. Simply use the IPFS CID with an IPFS HTTP gateway inside a curl command rather than use a docker pull command to download the container’s image. + > + > 3. Decentralized eCommerce websites + > Through packages like DeCommerce, spinning up your own eCommerce website is as simple as uploading the DeCommerce folder to your Filebase bucket, then navigating to the IPFS HTTP gateway URL of your folder’s CID. Since you’re equipped with all the necessary webpages and configurations, you can spend time customizing the CSS files to style your website and upload your products, rather than spending time managing a domain, SSL certificates, or figuring out how to accept crypto payments (which DeCommerce comes equipped with by default!). + > + > 4. Decentralized Operating Systems + > Along with decentralized software packages and containers, decentralized operating systems are another form of software that can benefit from being hosted on IPFS. A handful of decentralized, blockchain-based operating systems have emerged, but storing the data for these operating systems on their native blockchain is typically against best practices since it can be expensive and have high latency. For this reason, many layer-1 blockchains will either store data externally, like on IPFS, or they’ll use a layer-2 chain to handle data storage. Therefore, decentralized operating systems that run on a blockchain can highly benefit from being hosted on IPFS while they communicate externally with the blockchain network. + > + > 5. 
Decentralized Peer Reviews of Academic Research Papers + > In addition to JPEG art being minted as NFT collections, pieces of writing such as blog posts, eBooks, and whitepapers have begun to gain traction as NFTs as well. Written content benefits from being minted on a blockchain since it verifies who the original writer of the content is, allowing for easier clarification when it comes to copyright, plagiarism, or other duplication of writing. Any text document or Microsoft Word document can be hosted on IPFS and then referenced inside of a smart contract that is deployed on Ethereum or Polygon, creating a permanent record of that piece of writing being created by the author. + > For academic papers, this is a real game changer. Users can mint their research papers as an NFT that uses PDF or text documents hosted on IPFS, and then gain a verifiable reputation for their research and any peer reviews they contribute to other researchers. In addition to the smart contract’s verifiable address, the IPFS CID can be used as an additional form of verification that the content was created by the original author and hasn’t been altered since publication. +- Carbon aware SDK + - https://github.com/Green-Software-Foundation/carbon-aware-sdk +- Metrics for carbon measurement + - Software Carbon Intensity (SCI) - taking action + - Greenhouse Gas Protocol (GHG) - reporting +- Carbon measurement telemetry + - https://github.com/sustainable-computing-io/kepler + - > Kepler (Kubernetes-based Efficient Power Level Exporter) uses eBPF to probe energy related system stats and exports as Prometheus metrics + - https://github.com/hubblo-org/scaphandre + - > Energy consumption metrology agent. Let "scaph" dive and bring back the metrics that will help you make your systems and applications more sustainable ! 
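Both Kepler and scaphandre surface their numbers as Prometheus metrics, so a first look at the energy data could be a plain instant query against the Prometheus HTTP API. Sketch only: the endpoint and the `kepler_container_joules_total` metric name are assumptions, not checked against a live install.

```python
import urllib.parse

# Assumed local Prometheus instance scraping Kepler; metric name is a guess
PROMETHEUS = "http://localhost:9090"
PROMQL = "sum(rate(kepler_container_joules_total[5m])) by (container_name)"


def query_url(base: str, promql: str) -> str:
    # Prometheus instant query endpoint: GET /api/v1/query?query=<promql>
    return base + "/api/v1/query?" + urllib.parse.urlencode({"query": promql})


url = query_url(PROMETHEUS, PROMQL)
print(url)

# Against a live Prometheus (needs network):
#   import json, urllib.request
#   with urllib.request.urlopen(url) as response:
#       print(json.dumps(json.load(response), indent=2))
```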
+ +```console +$ pip install -e entities/alice +$ dffml service dev entrypoints list dffml.overlays.alice.please.log.todos +OverlayCLI = alice.please.log.todos.todos:OverlayCLI -> alice 0.0.1 (/home/pdxjohnny/.local/lib/python3.9/site-packages) +OverlayRecommendedCommunityStandards = alice.please.log.todos.todos:AlicePleaseLogTodosDataFlowRecommendedCommnuityStandardsGitHubIssues -> alice 0.0.1 (/home/pdxjohnny/.local/lib/python3.9/site-packages) +$ dffml service dev export -configloader json alice.cli:AlicePleaseLogTodosCLIDataFlow | tee logtodos.json +$ (echo '```mermaid' && dffml dataflow diagram logtodos.json && echo '```') | gh gist create -f "LOG_TODOS_DATAFLOW_DIAGRAM.md" - +``` + +- Oneliner: `dffml service dev export -configloader json alice.cli:AlicePleaseLogTodosCLIDataFlow | tee logtodos.json && (echo '```mermaid' && dffml dataflow diagram logtodos.json && echo '```') | gh gist create -f "LOG_TODOS_DATAFLOW_DIAGRAM.md" -` + + +```mermaid +graph TD +subgraph a759a07029077edc5c37fea0326fa281[Processing Stage] +style a759a07029077edc5c37fea0326fa281 fill:#afd388b5,stroke:#a4ca7a +subgraph d9f2c7ced7f00879629c15363c8e307d[alice.please.log.todos.todos.AlicePleaseLogTodosDataFlow:guess_repo_string_is_url] +style d9f2c7ced7f00879629c15363c8e307d fill:#fff4de,stroke:#cece71 +37178be7db9283b44a1786fef58ffa8d[alice.please.log.todos.todos.AlicePleaseLogTodosDataFlow:guess_repo_string_is_url] +5c7743e872c165030dcf051c712106fc(repo_string) +5c7743e872c165030dcf051c712106fc --> 37178be7db9283b44a1786fef58ffa8d +8d32e3f614b2c8f9d23e7469eaa1da12(result) +37178be7db9283b44a1786fef58ffa8d --> 8d32e3f614b2c8f9d23e7469eaa1da12 +end +subgraph ed8e05e445eabbcfc1a201e580b1371e[alice.please.log.todos.todos.AlicePleaseLogTodosDataFlow:guessed_repo_string_is_operations_git_url] +style ed8e05e445eabbcfc1a201e580b1371e fill:#fff4de,stroke:#cece71 +f129d360149fb01bbfe1ed8c2f9bbaa2[alice.please.log.todos.todos.AlicePleaseLogTodosDataFlow:guessed_repo_string_is_operations_git_url] 
+77a8695545cb64a7becb9f50343594c3(repo_url) +77a8695545cb64a7becb9f50343594c3 --> f129d360149fb01bbfe1ed8c2f9bbaa2 +d259a05785074877b9509ed686e03b3a(result) +f129d360149fb01bbfe1ed8c2f9bbaa2 --> d259a05785074877b9509ed686e03b3a +end +subgraph 0fb0b360e14eb7776112a5eaff5252de[alice.please.log.todos.todos.OverlayCLI:cli_has_repos] +style 0fb0b360e14eb7776112a5eaff5252de fill:#fff4de,stroke:#cece71 +81202a774dfaa2c4d640d25b4d6c0e55[alice.please.log.todos.todos.OverlayCLI:cli_has_repos] +7ba42765e6fba6206fd3d0d7906f6bf3(cmd) +7ba42765e6fba6206fd3d0d7906f6bf3 --> 81202a774dfaa2c4d640d25b4d6c0e55 +904eb6737636f1d32a6d890f449e9081(result) +81202a774dfaa2c4d640d25b4d6c0e55 --> 904eb6737636f1d32a6d890f449e9081 +end +subgraph 964c0fbc5f3a43fce3f0d9f0aed08981[alice.please.log.todos.todos.OverlayCLI:cli_is_meant_on_this_repo] +style 964c0fbc5f3a43fce3f0d9f0aed08981 fill:#fff4de,stroke:#cece71 +b96195c439c96fa7bb4a2d616bbe47c5[alice.please.log.todos.todos.OverlayCLI:cli_is_meant_on_this_repo] +2a071a453a1e677a127cee9775d0fd9f(cmd) +2a071a453a1e677a127cee9775d0fd9f --> b96195c439c96fa7bb4a2d616bbe47c5 +f6bfde5eece6eb52bb4b4a3dbc945d9f(result) +b96195c439c96fa7bb4a2d616bbe47c5 --> f6bfde5eece6eb52bb4b4a3dbc945d9f +end +subgraph 2e2e8520e9f9420ffa9e54ea29965019[alice.please.log.todos.todos.OverlayCLI:cli_run_on_repo] +style 2e2e8520e9f9420ffa9e54ea29965019 fill:#fff4de,stroke:#cece71 +f60739d83ceeff1b44a23a6c1be4e92c[alice.please.log.todos.todos.OverlayCLI:cli_run_on_repo] +0ac5645342c7e58f9c227a469d90242e(repo) +0ac5645342c7e58f9c227a469d90242e --> f60739d83ceeff1b44a23a6c1be4e92c +6e82a330ad9fcc12d0ad027136fc3732(result) +f60739d83ceeff1b44a23a6c1be4e92c --> 6e82a330ad9fcc12d0ad027136fc3732 +end +subgraph b8e0594907ccea754b3030ffc4bdc3fc[alice.please.log.todos.todos:gh_issue_create_support] +style b8e0594907ccea754b3030ffc4bdc3fc fill:#fff4de,stroke:#cece71 +6aeac86facce63760e4a81b604cfab0b[alice.please.log.todos.todos:gh_issue_create_support] 
+dace6da55abe2ab1c5c9a0ced2f6833d(file_present) +dace6da55abe2ab1c5c9a0ced2f6833d --> 6aeac86facce63760e4a81b604cfab0b +d2a58f644d7427227cefd56492dfcef9(repo) +d2a58f644d7427227cefd56492dfcef9 --> 6aeac86facce63760e4a81b604cfab0b +7f2eb20bcd650dc00cde5ca0355b578f(issue_url) +6aeac86facce63760e4a81b604cfab0b --> 7f2eb20bcd650dc00cde5ca0355b578f +end +subgraph cd002409ac60a3eea12f2139f2743c52[alice.please.log.todos.todos:git_repo_to_git_repository_checked_out] +style cd002409ac60a3eea12f2139f2743c52 fill:#fff4de,stroke:#cece71 +e58ba0b1a7efba87321e9493d340767b[alice.please.log.todos.todos:git_repo_to_git_repository_checked_out] +00a9f6e30ea749940657f87ef0a1f7c8(repo) +00a9f6e30ea749940657f87ef0a1f7c8 --> e58ba0b1a7efba87321e9493d340767b +bb1abf628d6e8985c49381642959143b(repo) +e58ba0b1a7efba87321e9493d340767b --> bb1abf628d6e8985c49381642959143b +end +subgraph d3ec0ac85209a7256c89d20f758f09f4[check_if_valid_git_repository_URL] +style d3ec0ac85209a7256c89d20f758f09f4 fill:#fff4de,stroke:#cece71 +f577c71443f6b04596b3fe0511326c40[check_if_valid_git_repository_URL] +7440e73a8e8f864097f42162b74f2762(URL) +7440e73a8e8f864097f42162b74f2762 --> f577c71443f6b04596b3fe0511326c40 +8e39b501b41c5d0e4596318f80a03210(valid) +f577c71443f6b04596b3fe0511326c40 --> 8e39b501b41c5d0e4596318f80a03210 +end +subgraph af8da22d1318d911f29b95e687f87c5d[clone_git_repo] +style af8da22d1318d911f29b95e687f87c5d fill:#fff4de,stroke:#cece71 +155b8fdb5524f6bfd5adbae4940ad8d5[clone_git_repo] +eed77b9eea541e0c378c67395351099c(URL) +eed77b9eea541e0c378c67395351099c --> 155b8fdb5524f6bfd5adbae4940ad8d5 +8b5928cd265dd2c44d67d076f60c8b05(ssh_key) +8b5928cd265dd2c44d67d076f60c8b05 --> 155b8fdb5524f6bfd5adbae4940ad8d5 +4e1d5ea96e050e46ebf95ebc0713d54c(repo) +155b8fdb5524f6bfd5adbae4940ad8d5 --> 4e1d5ea96e050e46ebf95ebc0713d54c +6a44de06a4a3518b939b27c790f6cdce{valid_git_repository_URL} +6a44de06a4a3518b939b27c790f6cdce --> 155b8fdb5524f6bfd5adbae4940ad8d5 +end +subgraph 
98179e1c9444a758d9565431f371b232[dffml_operations_innersource.operations:code_of_conduct_present] +style 98179e1c9444a758d9565431f371b232 fill:#fff4de,stroke:#cece71 +fb772128fdc785ce816c73128e0afd4d[dffml_operations_innersource.operations:code_of_conduct_present] +f333b126c62bdbf832dddf105278d218(repo) +f333b126c62bdbf832dddf105278d218 --> fb772128fdc785ce816c73128e0afd4d +1233aac886e50641252dcad2124003c9(result) +fb772128fdc785ce816c73128e0afd4d --> 1233aac886e50641252dcad2124003c9 +end +subgraph d03657cbeff4a7501071526c5227d605[dffml_operations_innersource.operations:contributing_present] +style d03657cbeff4a7501071526c5227d605 fill:#fff4de,stroke:#cece71 +8da2c8a3eddf27e38838c8b6a2cd4ad1[dffml_operations_innersource.operations:contributing_present] +2a1ae8bcc9add3c42e071d0557e98b1c(repo) +2a1ae8bcc9add3c42e071d0557e98b1c --> 8da2c8a3eddf27e38838c8b6a2cd4ad1 +52544c54f59ff4838d42ba3472b02589(result) +8da2c8a3eddf27e38838c8b6a2cd4ad1 --> 52544c54f59ff4838d42ba3472b02589 +end +subgraph da39b149b9fed20f273450b47a0b65f4[dffml_operations_innersource.operations:security_present] +style da39b149b9fed20f273450b47a0b65f4 fill:#fff4de,stroke:#cece71 +c8921544f4665e73080cb487aef7de94[dffml_operations_innersource.operations:security_present] +e682bbcfad20caaab15e4220c81e9239(repo) +e682bbcfad20caaab15e4220c81e9239 --> c8921544f4665e73080cb487aef7de94 +5d69c4e5b3601abbd692ade806dcdf5f(result) +c8921544f4665e73080cb487aef7de94 --> 5d69c4e5b3601abbd692ade806dcdf5f +end +subgraph 062b8882104862540d584516edc60008[dffml_operations_innersource.operations:support_present] +style 062b8882104862540d584516edc60008 fill:#fff4de,stroke:#cece71 +5cc75c20aee40e815abf96726508b66d[dffml_operations_innersource.operations:support_present] +f0e4cd91ca4f6b278478180a188a2f5f(repo) +f0e4cd91ca4f6b278478180a188a2f5f --> 5cc75c20aee40e815abf96726508b66d +46bd597a57e034f669df18ac9ae0a153(result) +5cc75c20aee40e815abf96726508b66d --> 46bd597a57e034f669df18ac9ae0a153 +end +end +subgraph 
a4827add25f5c7d5895c5728b74e2beb[Cleanup Stage] +style a4827add25f5c7d5895c5728b74e2beb fill:#afd388b5,stroke:#a4ca7a +end +subgraph 58ca4d24d2767176f196436c2890b926[Output Stage] +style 58ca4d24d2767176f196436c2890b926 fill:#afd388b5,stroke:#a4ca7a +end +subgraph inputs[Inputs] +style inputs fill:#f6dbf9,stroke:#a178ca +6e82a330ad9fcc12d0ad027136fc3732 --> 5c7743e872c165030dcf051c712106fc +8d32e3f614b2c8f9d23e7469eaa1da12 --> 77a8695545cb64a7becb9f50343594c3 +128516cfa09b0383023eab52ee24878a(seed
dffml.util.cli.CMD) +128516cfa09b0383023eab52ee24878a --> 7ba42765e6fba6206fd3d0d7906f6bf3 +128516cfa09b0383023eab52ee24878a(seed
dffml.util.cli.CMD) +128516cfa09b0383023eab52ee24878a --> 2a071a453a1e677a127cee9775d0fd9f +904eb6737636f1d32a6d890f449e9081 --> 0ac5645342c7e58f9c227a469d90242e +f6bfde5eece6eb52bb4b4a3dbc945d9f --> 0ac5645342c7e58f9c227a469d90242e +46bd597a57e034f669df18ac9ae0a153 --> dace6da55abe2ab1c5c9a0ced2f6833d +bb1abf628d6e8985c49381642959143b --> d2a58f644d7427227cefd56492dfcef9 +4e1d5ea96e050e46ebf95ebc0713d54c --> 00a9f6e30ea749940657f87ef0a1f7c8 +d259a05785074877b9509ed686e03b3a --> 7440e73a8e8f864097f42162b74f2762 +d259a05785074877b9509ed686e03b3a --> eed77b9eea541e0c378c67395351099c +a6ed501edbf561fda49a0a0a3ca310f0(seed
git_repo_ssh_key) +a6ed501edbf561fda49a0a0a3ca310f0 --> 8b5928cd265dd2c44d67d076f60c8b05 +8e39b501b41c5d0e4596318f80a03210 --> 6a44de06a4a3518b939b27c790f6cdce +bb1abf628d6e8985c49381642959143b --> f333b126c62bdbf832dddf105278d218 +bb1abf628d6e8985c49381642959143b --> 2a1ae8bcc9add3c42e071d0557e98b1c +bb1abf628d6e8985c49381642959143b --> e682bbcfad20caaab15e4220c81e9239 +bb1abf628d6e8985c49381642959143b --> f0e4cd91ca4f6b278478180a188a2f5f +end +``` + +```console +$ alice please log todos -log debug -repos https://github.com/pdxjohnny/testaaa +``` + +- Got `alice please log todos` (slimmed down version of `alice please contribute`) working https://github.com/intel/dffml/commit/adf32b4e80ad916de7749fc0b6e99485fb4107b7 + - This will allow us to not deal with the pull request code unless triggered. + - Without the overlay infra complete it's harder to remove ops / modify flows than it is to add to them (static overlay application is what we have and is easy, it's just auto flow the definitions together) +- TODO + - [ ] Added `alice please log todos` command adf32b4e80ad916de7749fc0b6e99485fb4107b7 + - [ ] Find tutorial location for this, maybe just with data flows stuff +- Future + - [ ] Alice refactor and optimize for reduced carbon emissions + - [ ] Integrate into PR feedback loop \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0076/index.md b/docs/discussions/alice_engineering_comms/0076/index.md new file mode 100644 index 0000000000..d015241785 --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0076/index.md @@ -0,0 +1 @@ +# 2022-11-04 Engineering Logs \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0076/reply_0000.md b/docs/discussions/alice_engineering_comms/0076/reply_0000.md new file mode 100644 index 0000000000..dffbdbb724 --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0076/reply_0000.md @@ -0,0 +1,162 @@ +## 2022-11-04 @pdxjohnny Engineering Logs + +- Issue Ops as a 
way for people to request Alice pull requests, contributions, interaction, etc. + - https://github.com/valet-customers/issue-ops/blob/6a5e64188ae79dfd11613f5f9bdc75f7b769812b/.github/workflows/issue_ops.yml + - https://github.com/valet-customers/issue-ops/blob/6a5e64188ae79dfd11613f5f9bdc75f7b769812b/.github/ISSUE_TEMPLATE/gitlab_ci.md +- How do we communicate and document when there is new data available or we plan to make new data available. +- How do we query and correlate across sources? +- VEX (JSON-LD?) + - Statuses + - Investigating + - Vulnerable + - Used but not vulnerable + - This version is vuln (to vuln or dep vuln) but we have another one that's not affected + - We will need to establish chains of trust on top of VDR / VEX issuance + - https://cyclonedx.org/capabilities/vdr/#bom-with-embedded-vdr + - https://www.nist.gov/itl/executive-order-14028-improving-nations-cybersecurity/software-security-supply-chains-software-1 + - https://cyclonedx.org/capabilities/vex/ + - https://energycentral.com/c/pip/what-nist-sbom-vulnerability-disclosure-report-vdr + - https://github.com/CycloneDX/bom-examples/blob/master/SaaSBOM/apigateway-microservices-datastores/bom.json +- InnerSource + - https://innersourcecommons.org/learn/patterns/ + - https://github.com/InnerSourceCommons/InnerSourcePatterns + - https://www.youtube.com/watch?v=RjBpZKsAQN0 + - A RedMonk Conversation: IBM's Inner Source transformation, scaling a DevOps culture change.
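The statuses brainstormed above line up roughly with CycloneDX's impact analysis states. A minimal sketch of what an embedded VEX entry could look like; the state names and field layout are my reading of the CycloneDX examples linked above, so treat them as assumptions to verify against the spec:

```python
import json

# Rough mapping from the informal statuses above to assumed
# CycloneDX-style analysis states (verify against the CycloneDX spec)
STATUS_MAP = {
    "Investigating": "in_triage",
    "Vulnerable": "exploitable",
    "Used but not vulnerable": "not_affected",
}

# Hypothetical VEX entry for illustration only
vex_entry = {
    "id": "CVE-2021-44228",
    "analysis": {
        "state": STATUS_MAP["Used but not vulnerable"],
        "detail": "Vulnerable code path is never reached in this deployment",
    },
}
print(json.dumps(vex_entry, indent=2))
```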
+- GitHub Actions + - https://docs.github.com/en/actions/using-jobs/using-concurrency#example-only-cancel-in-progress-jobs-or-runs-for-the-current-workflow + - https://docs.github.com/en/actions/using-workflows/workflow-syntax-for-github-actions#concurrency +- https://code-as-policies.github.io/ + - Need to look into this more + - https://colab.research.google.com/drive/1V9GU70GQN-Km4qsxYqvR-c0Sgzod19-j + - https://ai.googleblog.com/2022/11/robots-that-write-their-own-code.html + - https://web1.eng.famu.fsu.edu/~mpf/research.htm + - > Central to this approach is hierarchical code generation, which prompts language models to recursively define new functions, accumulate their own libraries over time, and self-architect a dynamic codebase. + - Yup +- https://twitter.com/MikePFrank/status/1588539750423547905 + - Reversible Computing + - Essentially what we get when we cache our flows plus all our equilibrium reaching time travel stuff (synchronization of system contexts across disparate roots, aka cherry picking patches and A/B validation of results until we reach desired state) + - https://en.wikipedia.org/wiki/Reversible_computing +- http://hiis.isti.cnr.it/serenoa/project-fact-sheet.html + - Some similar principles to ours + - > - New concepts, languages, (intelligent) runtimes and tools are needed to support multi-dimensional context-aware adaptation of SFEs. These artefacts will enable SFE engineers to concentrate on the functionality rather than on the implementation details concerning the adaptation to the multiple dimensions of the context of use. + > - Keeping Humans in the Loop. This principle is twofold. On the one hand, end users should be able to provide feedback or even guide the adaptation process according to their preferences or previous experiences with the system. On the other hand, authors, developers and engineers should be able to guide the adaptation process according to their experience and domain knowledge. + > - Open Adaptiveness.
A system is open adaptive “if new adaptation plans can be introduced during runtime”. - Adaptation in ubiquitous computing environments (such as in ambient spaces) is also necessary in order to deal with multiple devices, interaction resources and modalities. + > - Covering the full adaptation lifecycle to support a full adaptation life-cycle that will result into feedback loops (coming from end users) in order to inform any future adaptation + +```python +async def gh_issue_create_if_file_not_present( + repo_url: str, + file_present: bool, + title: str, + body: str, + logger: logging.Logger, +) -> Dict[str, str]: + if file_present: + return + return { + "issue_url": await gh_issue_create( + repo_url, + title, + body, + logger=logger, + ) + } + + +""" +def make_gh_issue_create_opimp_for_file( + filename: str, + file_present_definition, + default_title: str, + body: str, +): + IssueTitle = NewType(filename + "IssueTitle", str) + IssueBody = NewType(filename + "IssueBody", str) + IssueURL = NewType(filename + "IssueURL", str) + + # TODO, + # NOTE dffml.op requires name set in overlay classes for now + + return new_types, opimp +""" + + +# : dffml_operations_innersource.operations.FileReadmePresent +class AlicePleaseLogTodosDataFlowRecommendedCommnuityStandardsGitHubIssues: + @dffml.op( + inputs={ + "repo": dffml_feature_git.feature.definitions.git_repository_checked_out, + "file_present": dffml_operations_innersource.operations.FileSupportPresent, + "title": SupportIssueTitle, + "body": SupportIssueBody, + }, + outputs={ + "issue_url": NewType("SupportIssueURL", str), + }, + ) + async def gh_issue_create_support( + repo: dffml_feature_git.feature.definitions.git_repository_checked_out.spec, + file_present: bool, + title: str, + body: str, + ) -> Dict[str, str]: + return await gh_issue_create_if_file_not_present( + repo.URL, + file_present, + title, + body, + logger=self.logger, + ) + + +""" +cls = AlicePleaseLogTodosDataFlowRecommendedCommnuityStandardsGitHubIssues 
+for new_types, opimp in itertools.starmap( + make_gh_issue_create_opimp_for_file, + [ + ("Support", dffml_operations_innersource.operations.FileSupportPresent), + ("Contributing", dffml_operations_innersource.operations.FileContributingPresent), + ("CodeOfConduct", dffml_operations_innersource.operations.FileCodeOfConductPresent), + ("Security", dffml_operations_innersource.operations.FileSecurityPresent), + ], +): + setattr(cls, opimp.op.name, ) + for new_type in new_types: + print(new_type, new_type.__dict__) +""" +``` + +- alice: please: log: todos: recommended community standard: support: github issue: Allow for title and body override + - 67d79ede39629f3b117be0d9f2b5058f88b4efcb +- e2ed7faaa alice: please: log: todos: recommended community standard: code of conduct: github issue: Log issue if file not found +- 8b0df460a alice: please: log: todos: recommended community standard: contributing: github issue: Log issue if file not found +- dbb946649 alice: please: log: todos: recommended community standard: security: github issue: Log issue if file not found +- 59d3052f9 alice: please: log: todos: recommended community standard: Cleanup comments +- 5dbadaf36 operations: innersource: Check for README community health file +- d867a9cda alice: please: log: todos: recommended community standard: readme: github issue: Log issue if file not found + +![image](https://user-images.githubusercontent.com/5950433/200097693-4207fe5c-6d0d-4bfb-8d75-d57bd5768616.png) + +![image](https://user-images.githubusercontent.com/5950433/200098670-1085a185-71af-4193-b5ca-5740d42c952d.png) + +- Ran the three most recent Alice commands to confirm everything is still working + - `alice shouldi contribute` + - `alice please log todos` + - `alice please contribute recommended community standards` + +```console +$ alice -log debug shouldi contribute -keys https://github.com/pdxjohnny/testaaa +$ alice please log todos -log debug -keys https://github.com/pdxjohnny/testaaa +$ alice please 
contribute -repos https://github.com/pdxjohnny/testaaa -log debug -- recommended community standards +``` + +- 7980fc0c7 util: cli: cmd: Add DFFMLCLICMD NewType for use in data flows +- 6d0ce54e1 cli: dataflow: run: records: Allow for passing CLI CMD instance to data flow as input +- 0356b97a9 alice: cli: please: contribute: recommended community standards: Use CLI CMD type from dffml +- 3e8b161a2 alice: cli: please: log: todos: Use CLI CMD type from dffml +- 7c7dd8f7c alice: cli: please: log: todos: Base off dffml dataflow run records +- 1d4d6b2f8 alice: cli: please: log: todos: Explictly pass directory when finding last repo commit +- TODO + - [ ] SaaSBOM etc. overlays for dataflows for `THREATS.md` analysis + - https://github.com/CycloneDX/bom-examples/tree/6990885/SaaSBOM/apigateway-microservices-datastores + - [ ] Find a cleaner way to do same operation reused with different definitions (and defaults) \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0077/index.md b/docs/discussions/alice_engineering_comms/0077/index.md new file mode 100644 index 0000000000..f7f7975853 --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0077/index.md @@ -0,0 +1 @@ +# 2022-11-05 Engineering Logs \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0077/reply_0000.md b/docs/discussions/alice_engineering_comms/0077/reply_0000.md new file mode 100644 index 0000000000..b04259cc9c --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0077/reply_0000.md @@ -0,0 +1,4 @@ +- https://pretalx.com/pycascades-2023/cfp +- Vol 0: Alice is a Sign not a Cop + - mention conceptual cultural opamp effects of any change (wheel, ML). Information travels faster as a result of some changes. 
+ - grep Wardley map alignment reward strategic plan hypothesis think \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0078/index.md b/docs/discussions/alice_engineering_comms/0078/index.md new file mode 100644 index 0000000000..0d6e8351f9 --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0078/index.md @@ -0,0 +1 @@ +# 2022-11-06 Engineering Logs \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0078/reply_0000.md b/docs/discussions/alice_engineering_comms/0078/reply_0000.md new file mode 100644 index 0000000000..2b711e4e96 --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0078/reply_0000.md @@ -0,0 +1,41 @@ +## 2022-11-06 @pdxjohnny Engineering Logs + +- RosettaNet EDI +- https://www.youtube.com/watch?v=ToihJtuELwM + - Methodology for long term storage of verifiable credentials encoded to vol 3 plus vol 5 aware text as prompt for best practices for trust graph inference strategic plan high accuracy for adherence to goals with regards to happiness metrics and fail safe ad-hoc group forming. +- https://colab.research.google.com/drive/1Hl0xxODGWNJgcbvSDsD5MN4B2nz3-n7I?usp=sharing#scrollTo=GDlskFoGYDVt + - GPT-3 but better + - Flan: grep: EAT me (few days ago in this thread) perfect + - this engagement fits in with visualization of software stack of pancakes to grow Alice 🥞 (if this works we will hopefully start accelerating quickly, as we accelerate time for her slows) + - Summary of the following (Alice thread) in the style of an arXiv or whatever paper + - Concept of open architecture as an IETF RFC: ^ + - Install Flan and associated DFFML overlays within OS DecentrAlice. + - What is a Large Language Model? + - LLMs essentially act as intelligent lookup tables where the prompt is like the SQL query + - See gather_inputs call within memory context method, implement prioritizer there (don't try to refactor into dataflow as class first!)
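The lookup-table analogy above can be made concrete with a toy: treat the prompt as a fuzzy query over memorized key/value pairs, the way SQL selects rows. This is an analogy sketch only, not an actual language model:

```python
import difflib

# "Memorized" corpus standing in for model weights
memory = {
    "what is a large language model": "A model trained to predict text.",
    "what is a dataflow": "A graph of operations wired together by typed inputs.",
}


def fuzzy_lookup(prompt: str) -> str:
    # The prompt plays the role of the SQL query: nearest memorized key wins
    best = difflib.get_close_matches(prompt.lower(), memory, n=1, cutoff=0.0)
    return memory[best[0]]


print(fuzzy_lookup("What is a large language model?"))
```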
+- https://comunica.github.io/comunica-feature-link-traversal-web-clients/builds/default/#datasources=https://foaf-ldux.vercel.app/&query=PREFIX%20foaf:%20%3Chttp://xmlns.com/foaf/0.1/%3E%0ASELECT%20%20DISTINCT%20?Name%20%3FWebID%20WHERE%20%7B%0A%20%20%3Chttps%3A%2F%2Ffoaf-ldux.vercel.app%2F%23me%3E%20foaf%3Aknows%20%3FWebID.%0A%20%20%3FWebID%20foaf%3Aname%20%3FName.%0A%7D&httpProxy=https%3A%2F%2Fproxy.linkeddatafragments.org%2F + - https://twitter.com/mfosterio/status/1589368256086781952 + - https://github.com/comunica/comunica + - https://gist.github.com/rubensworks/9d6eccce996317677d71944ed1087ea6 + - GraphQL-LD + - 🛤️⛓️🚄 + - > Linked Data on the Web exists in many shapes and forms. Linked Data can be published using plain RDF files in various syntaxes, such as JSON-LD, Turtle, HTML+RDFa, and more. Next to that, different forms of queryable Web interfaces exist, such as SPARQL endpoints and Triple Pattern Fragments (TPF) interfaces. If we want to query Linked Data from the Web, we need to be able to cope with this heterogeneity. Comunica is a querying framework that has been designed to handle different types of Linked Data interfaces in a flexible manner. Its primary goal is executing SPARQL queries over one or more interfaces. Comunica is a meta-query engine. Comunica should not be seen as a query engine. Instead, Comunica is a meta query engine using which query engines can be created. It does this by providing a set of modules that can be wired together in a flexible manner. While we provide default configurations of Comunica to easily get started with querying, anyone can configure their own query engine. This allows fine-tuning Comunica to suit your own needs, avoiding the overhead of modules that are not needed. + - We want to combine this with SCITT + - https://github.com/lacanoid/pgsparql +- https://dust.tt/ + - Looks like a data flow/notebook hybrid! Cool! But only closed-source APIs are available so far.
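URL-decoding the `query` parameter of the Comunica permalink above recovers the underlying SPARQL, which is just a two-pattern `foaf:knows` traversal; stdlib-only sketch:

```python
from urllib.parse import unquote

# The `query` parameter from the Comunica link above, still percent-encoded
encoded = (
    "PREFIX%20foaf:%20%3Chttp://xmlns.com/foaf/0.1/%3E%0A"
    "SELECT%20%20DISTINCT%20?Name%20%3FWebID%20WHERE%20%7B%0A"
    "%20%20%3Chttps%3A%2F%2Ffoaf-ldux.vercel.app%2F%23me%3E%20foaf%3Aknows%20%3FWebID.%0A"
    "%20%20%3FWebID%20foaf%3Aname%20%3FName.%0A%7D"
)
sparql = unquote(encoded)
print(sparql)
```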
+- https://colab.research.google.com/drive/1PDT-jho3Y8TBrktkFVWFAPlc7PaYvlUG?usp=sharing + - Ebook Embeddings Search +- https://www.themarginalian.org/2022/11/02/anais-nin-d-h-lawrence/ + - > Life is a process of becoming, a combination of states we have to go through. Where people fail is that they wish to elect a state and remain in it. This is a kind of death. +- https://www.themarginalian.org/2014/11/11/dostoyevsky-dream/ + - > All are tending to one and the same goal, at least all aspire to the same goal, from the wise man to the lowest murderer, but only by different ways. It is an old truth, but there is this new in it: I cannot go far astray. I saw the truth. I saw and know that men could be beautiful and happy, without losing the capacity to live upon the earth. I will not, I cannot believe that evil is the normal condition of men… I saw the truth, I did not invent it with my mind. I saw, saw, and her living image filled my soul for ever. I saw her in such consummate perfection that I cannot possibly believe that she was not among men. How can I then go astray? … The living image of what I saw will be with me always, and will correct and guide me always. Oh, I am strong and fresh, I can go on, go on, even for a thousand years. + > […] + > And it is so simple… The one thing is — love thy neighbor as thyself — that is the one thing. That is all, nothing else is needed. You will instantly find how to live. 
+- Extensible Dynamic Edge Network (EDEN) + - https://magicmirror.builders/ + - https://android-developers.googleblog.com/2019/02/an-update-on-android-things.html + - Fuck, they cut the project, that's okay we'll maybe run TockOS (lol, tick tock, appropriate :) + - https://github.com/tock/tock + +![eden](https://user-images.githubusercontent.com/5950433/200349932-91555c81-38cf-4a90-9074-fea92a6aa974.jpeg) diff --git a/docs/discussions/alice_engineering_comms/0079/index.md b/docs/discussions/alice_engineering_comms/0079/index.md new file mode 100644 index 0000000000..8a170ae414 --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0079/index.md @@ -0,0 +1,25 @@ +# 2022-11-07 Engineering Logs + +- IPVM meeting tomorrow on content addressable execution + - https://ipfs.tech/ + - https://www.youtube.com/watch?v=FhwzEKNZEIA + - https://www.youtube.com/watch?v=rzJWk1nlYvs + - See recent notes on content addressable `serviceEndpoint` defined via dataflows pinned by `did:merkle:` + - https://atproto.com/guides/data-repos +- Zephyr + - What is at the top of the build parameter hierarchy + - They use a Kconfig system + - They could use overlays for this + - Firmware builds, because they're embedded, have more build-time configs + - How do we organize storage?
+ - The knowledge graph and data flows to link to and describe those other flat structures + - Need unique build ids + - `did:merkle:` of serialized Open Architecture + - They only ever run a few subsets of Kconfig parameter sets (a few parameters) + - Parameters are any inputs that can affect the build + - Toolchain version + - Marc's example + - Let's say I care about git version, toolchain version, various .config + - https://github.com/zephyrproject-rtos/zephyr/pull/51954#issuecomment-1302983454 + - I track those for reproducibility (and caching) information + - When I want to generate a content addressable build I take all those JSON files (which are the generic graph serialization of all the stuff you care about), concat, and checksum (`did:merkle:`). \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0079/reply_0000.md b/docs/discussions/alice_engineering_comms/0079/reply_0000.md new file mode 100644 index 0000000000..e4a1c0eb23 --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0079/reply_0000.md @@ -0,0 +1,69 @@ +## 2022-11-07 @pdxjohnny Engineering Logs + +- KCP Edge + - https://github.com/kcp-dev/edge-mc + - Goal: bridge with DID / DWN / serviceEndpoint / DIDComm / Data Flows for arbitrary comms. + - > edge-mc is a subproject of kcp focusing on concerns arising from edge multicluster use cases: + > - Hierarchy, infrastructure & platform, roles & responsibilities, integration architecture, security issues + > - Runtime in[ter]dependence: An edge location may need to operate independently of the center and other edge locations + > - Non-namespaced objects: need general support + > - Cardinality of destinations: A source object may propagate to many thousands of destinations. + - Released 3-4 days ago?
Chaos smiles on us again :) + - Perfect for EDEN (vol 0: traveler of the edge) + - https://github.com/intel/dffml/blob/alice/docs/tutorials/rolling_alice/0000_preface.md#volume-0-architecting-alice + - We want to bridge KCP edge-mc with +- https://sohl-dickstein.github.io/2022/11/06/strong-Goodhart.html +- System Context + - Stumbled upon "valid system context" stuff (I/O must exist / be mapped) + - https://youtu.be/m0TO9IOqRfQ?t=3812&list=PLtzAOVTpO2jaHsS4o-sDzDyHEug-1KRbK + - https://github.com/intel/dffml/blob/1d4d6b2f817cd987ceff94b4984ce909b7aa3c7f/dffml/df/system_context/system_context.py#L101-L103 +- https://atproto.com/guides/data-repos + - We will serialize to ATP when available / more Python + support / obvious what is happening there. +- RosettaNet + - https://github.com/MicrosoftDocs/biztalk-docs/tree/main/biztalk/adapters-and-accelerators/accelerator-rosettanet + - https://github.com/MicrosoftDocs/biztalk-docs/blob/main/biztalk/adapters-and-accelerators/accelerator-rosettanet/TOC.md + - https://github.com/Azure/logicapps/blob/master/templates/rosettanet-encode-response.json + - This looks like it would be good for CI/CD test status in DID land + - As a bridge to tbDEX +- Hitachi if truly powering good is aligned +- https://github.com/SchemaStore/schemastore +- GitHub Actions + - https://docs.github.com/en/developers/webhooks-and-events/webhooks/webhook-events-and-payloads#discussion_comment + - https://docs.github.com/en/actions/using-workflows/events-that-trigger-workflows + - https://docs.github.com/en/actions/using-workflows/triggering-a-workflow#available-events +- Flan T5 + - https://colab.research.google.com/drive/1Hl0xxODGWNJgcbvSDsD5MN4B2nz3-n7I?usp=sharing#scrollTo=GDlskFoGYDVt + - Paid $9.99 to have access to high memory environment (12GB was not enough for the first import code block) + - It won't generate long form answers :( + - [2022-11-06 @pdxjohnny Engineering
Logs](https://github.com/intel/dffml/discussions/1406?sort=new#discussioncomment-4068656) + - Summary of the following (Alice thread) in the style of an arXiv or whatever paper + - Commit messages from patch diffs + +```python +# Assumes the Flan-T5 tokenizer/model were loaded earlier in the notebook, e.g.: +# from transformers import AutoTokenizer, AutoModelForSeq2SeqLM +# tokenizer = AutoTokenizer.from_pretrained("google/flan-t5-xl") +# model = AutoModelForSeq2SeqLM.from_pretrained("google/flan-t5-xl").to("cuda") +input_text = """ +Write a peer reviewed scientific paper on the Eiffel Tower: +""" + +def generate_long(input_text): + input_ids = tokenizer(input_text, return_tensors="pt").input_ids.to("cuda") + output = model.generate(input_ids, max_new_tokens=100000000) + return [tokenizer.decode(i, skip_special_tokens=True) for i in output] + +generate_long(input_text) +``` + +- TODO + - [ ] Enable detection of recommended community standards in `docs` and `.github` + - https://docs.github.com/en/communities/setting-up-your-project-for-healthy-contributions/adding-support-resources-to-your-project + - [x] Headphones + - [x] Craigslist $50: Bose QuietComfort 15 + - I've been wanting these headphones for, what, 12+ years, + turns out I could have just gone on craigslist at any point. + - [x] [STRFKR - Open Your Eyes](https://www.youtube.com/watch?v=mkeOoWquAqk&list=RDEMwZ9tKHt9iT5CWajVqMu11w) + - [x] CHADIG + - [ ] JavaScript GitHub Actions runner idea still good for use case of automating communications via client side execution of runner / flows. + - [ ] Implemented via extension or script or console copy/paste or background service worker or something. This allows you to do the incremental addition to the Extensible Dynamic Edge Network (EDEN). + - Just remembered I found out about solar punk semi-recently + - didme.me + - DWN looks similar to this?
Really unclear where implementation is at or what hooks are \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0080/index.md b/docs/discussions/alice_engineering_comms/0080/index.md new file mode 100644 index 0000000000..f05fab2bc5 --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0080/index.md @@ -0,0 +1 @@ +# 2022-11-08 Engineering Logs \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0080/reply_0000.md b/docs/discussions/alice_engineering_comms/0080/reply_0000.md new file mode 100644 index 0000000000..f07df282e1 --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0080/reply_0000.md @@ -0,0 +1,87 @@ +## 2022-11-08 @pdxjohnny Engineering Logs + +- https://arbesman.substack.com/p/-revisiting-the-world-of-simulation +- Rewatching videos to better understand how to make `did:merkle:` cached execution + an image + caching results of `alice please summarize discussion --after "2022-10-01 00:00+0000" --before "2022-11-01 00:00+0000"` run summarization of each day (configurability on summarization of bullet point settings using Flan (🥞 EAT Me :) ) + - Not sure what to say for October monthly progress report :P + - Pretty soon Alice can just generate herself a video and post it for us + - https://www.youtube.com/watch?v=u2ZyqX-9xk8&list=PLtzAOVTpO2jaHsS4o-sDzDyHEug-1KRbK&t=2640 + - reference to go through the http gateway for ipfs and so this is the type of thing that we're going to have the visibility into you know we would store things yeah in ipfs or we would probably actually store things in an operation which will then yield us something +- Abandoned watching the old streams of consciousness and went to didme.me + - Ran into https://github.com/transmute-industries/verifiable-data/tree/main/packages/jsonld-schema#related-projects again + - Found https://w3c-ccg.github.io/traceability-vocab/#VerifiableScorecard !
Which is exactly what we want for some cases (`alice shouldi`, static analysis). + - https://w3c-ccg.github.io/traceability-vocab/#BillOfLadingCredential Can we use this for execution + content address / `did:merkle:` of inputs as described for Zephyr use case / our 2nd party use case? + - > A transport document issued or signed by a carrier evidencing a contract of carriage acknowledging receipt of cargo. This term is normally reserved for carriage by vessel (marine or ocean bill of lading) or multimodal transport. All B/Ls must indicate the date of issue, name of shipper and place of shipment, place of delivery, description of goods, whether the freight charges are prepaid or collected, and the carrier's signature. A bill of lading is, therefore, both a receipt for merchandise and a contract to deliver it as freight. (source: Olegario Llamazares: Dictionary Of International Trade, Key definitions of 2000 trade terms and acronyms). + - This sounds like something that could be a compute contract as well. + - https://w3c-ccg.github.io/traceability-vocab/openapi/components/schemas/common/BillOfLading.yml + - Beautiful, let's roll with this and modify it into something with less names and places and more DIDs. +- IPVM + - Meeting invite + - > Get up-to-date information at: https://lu.ma/event/evt-0op04xDSoAUBseQ?pk=g-JBsGh2GPRyVgKwn + > + > Click to join: https://lu.ma/join/g-JBsGh2GPRyVgKwn + > + > Event Information: + > + > This call is open to all, but is focused on implementers, following the IETF's rough "consensus and running code" ethos. + > The IPVM is an effort to add content-addressed computation to IPFS. This requires specifying calling convention, distributed scheduling, session receipts, mobile computing, and auto-upgradable IPFS internals.
+ > Links + > - Community Calls + > - GitHub Org + > - Discord Channel + > - IPFS þing '22 Slides + - https://fission.codes/blog/ipfs-thing-breaking-down-ipvm/ + - https://twitter.com/pdxjohnny/status/1574975274663706624 + - > FISSIONCodes: You've heard of +[@IPFS](https://mobile.twitter.com/IPFS), but what about IPVM? Fission is working on the Interplanetary Virtual Machine - a way to add content-addressed computation to IPFS. 🤯 With content-addressed computation we can work more efficiently and save time and compute power, all while operating in the decentralized web. + - > John: With regards to bindings and interface discussion. The Open Architecture currently is looking at software definition via manifests and data flows. Dynamic context aware overlays are then used to enable deployment specific analysis, synthesis, and runtime evaluation. This allows for decoupling from the underlying execution environment (i.e. WASM). Traversing metadata graphs on code from remote sources allows for orchestration sandboxing to be dynamic, context aware configurable, and negotiable for the execution of compute contract. This methodology is work in progress. Binding generation (syscalls, etc.) should follow the same overlay enabled pattern. Calling convention here is effectively the (Credential) Manifest. 
+ - https://github.com/intel/dffml/blob/alice/docs/arch/0009-Open-Architecture.rst + - https://intel.github.io/dffml/main/about.html#what-is-key-objective-of-dataflows + - [2022-11-07 Engineering Logs](https://github.com/intel/dffml/discussions/1406?sort=new#discussioncomment-4073154) + - @marc-hb Zephyr example + - Let's say I care about git version, toolchain version, various .config + - https://github.com/zephyrproject-rtos/zephyr/pull/51954#issuecomment-1302983454 + - I track those for reproducibility (and caching) information + - DID based content addressable solution possibility + - When I want to generate a content addressable build I take all those JSON files (which are the generic graph serialization of all the stuff you care about), concat, and checksum, which for a graph of DIDs is `did:merkle:`. + - Side note: The root of an Open Architecture upstream could be referenced as a `did:merkle:`. So Alice's state of the art value for upstream on `Architecting Alice: An Image` would be `upstream: "did:merkle:123"` + - [2022-11-02 @pdxjohnny Engineering Logs](https://github.com/intel/dffml/discussions/1406?sort=new#discussioncomment-4037309) + - Demo metric scan with SCITT receipt used to auth upload results to HTTP server (stream of consciousness / webhook server). Root trust in OIDC token similar to fulcio/sigstore github actions slsa demo. + - Future + - [ ] Demo to OpenSSF Metrics WG for collaboration on DB + - [ ] Do this for each `Input` + - [ ] Instead of HTTP server the content addressable registry + - [ ] Link via DWNs + - [ ] Hardware rooted keys + - [ ] Knit the above together with `I/L/R/OP/OPIMPNetwork`s for distributed compute + - [ ] Trust anchors of other than self support + - [ ] Caching + - Can we build a quick demo this morning on top of + https://github.com/imjasonh/kontain.me for discussion's sake?
+ - https://go.dev/learn/ + - https://go.dev/doc/install + - https://go.dev/doc/tutorial/getting-started + - https://go.dev/doc/modules/managing-dependencies#naming_module + +```console +$ git clone https://github.com/imjasonh/kontain.me +$ cd kontain.me/ +$ export GO111MODULE=on +$ export GOPROXY="${HTTPS_PROXY}" +``` + +- QUIC + - https://youtu.be/Dp6FwEfkBqQ + - https://youtu.be/wN9O1MnxIig +- MC Alice + - https://www.youtube.com/playlist?list=PLtzAOVTpO2jYzHkgXNjeyrPFO9lDxBJqi + +```console +$ youtube-dl --no-call-home --no-cache-dir -x --audio-format mp3 --add-metadata --audio-quality 0 --restrict-filenames --yes-playlist --ignore-errors "https://www.youtube.com/watch?v=Bzd3BjXHjZ0&list=PLtzAOVTpO2jYzHkgXNjeyrPFO9lDxBJqi" +``` + +- Aghin already got us started with webhooks! + - https://intel.github.io/dffml/main/examples/webhook/index.html + - > Aghin, one of our GSoC 2020 students, wrote operations and tutorials which allow users to receive web hooks from GitHub and re-deploy their containerized models and operations whenever their code is updated. + - https://github.com/intel/dffml/blob/alice/docs/tutorials/rolling_alice/0000_architecting_alice/0005_stream_of_consciousness.md +- TODO + - [ ] Update `Architecting Alice: Stream of Consciousness` using webhook demo as upstream.
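The "concat and checksum the JSON files" content-addressing idea from today's log can be sketched in a few lines. This is an assumption-laden illustration: the `did:merkle:` prefix and the exact canonicalization below are not a published spec, just one plausible shape of it.

```python
# Sketch: derive a single content address from a set of build inputs by
# hashing each input as canonical (sorted-key) JSON, then hashing the
# sorted concatenation of those leaf hashes. The "did:merkle:" prefix
# and this canonicalization are illustrative assumptions.
import hashlib
import json

def content_address(build_inputs: dict) -> str:
    leaf_hashes = sorted(
        hashlib.sha256(
            json.dumps({key: value}, sort_keys=True).encode()
        ).hexdigest()
        for key, value in build_inputs.items()
    )
    # Root over sorted leaves stands in for a merkle root
    root = hashlib.sha256("".join(leaf_hashes).encode()).hexdigest()
    return f"did:merkle:{root}"

build = {
    "git_version": "2.38.1",
    "toolchain_version": "zephyr-sdk-0.15.1",
    "config": {"CONFIG_DEBUG": "y"},
}
print(content_address(build))
```

The same inputs always yield the same identifier, and any changed input changes it, which is what makes it usable as a cache key for builds.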
\ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0080/reply_0001.md b/docs/discussions/alice_engineering_comms/0080/reply_0001.md new file mode 100644 index 0000000000..d2aef1e15a --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0080/reply_0001.md @@ -0,0 +1,128 @@ +## 2022-11-08 IPVM November Meeting Notes + +- Brooklyn Leading +- **TODO** Link recording +- Agenda + - Updates + - Convos in Lisbon + - Discussion +- Last month didn't happen due to busy-ness +- Lisbon + - Folks on this call were there in person for network labs week + - Talked about IPVM and other topics + - How to plug into other systems + - How it's different from other things + - IPVM got a grant, some funding, there is community faith + - First step is to work on invocation spec + - If we do a good job then in the next week or so it can serve as a basis for a few different projects + - BucketVM + - UCAN based invocation + - WarpForge + - Build system, sets up linux sandbox then does deterministic builds (not WASM) + - Goals: Build libc from source + - Possibly aligned + - Catalogs and formulas + - Optimine? + - Nondeterministic computation in docker containers + - Getting existing workloads running + - They have a golang based configuration + - IPVM is less interested in distributed algs and more interested in doing fast WASM +- How is interop being planned? + - IPVM wants to be fully deterministic, cached, verifiable + - Often need to resolve IPNS link, send email, etc., done "off chain" + - WASI is one way to do that + - That's not deterministic, you can do traced execution and read the stream in but you can't parallelize and compare results + - If you use a managed effect system, you leave all the impure stuff to the runtime + - Do you have access to run this? Yes? Just log a yes that you have access to run that effect.
+ - Effects incoming run before WASM, effects outgoing + - Sounds very similar to OA + - https://github.com/intel/dffml/blob/alice/docs/arch/0009-Open-Architecture.rst + - https://github.com/intel/dffml/blob/main/docs/about.rst#what-is-key-objective-of-dataflows + - Example Effect: Operation invocation manifest, it calls back in using the input effect. + - If there are chunks then they can call into IPVM and it can use the + - Effects are like input events in DFFML dataflows + - Affinity + - I already have this cached, you should send me these effects + - I have a GPU + - Related: EDEN - [2022-11-08 @pdxjohnny Engineering Logs]() + - Brooklyn has been laying out and thinking about what's reasonable + - Data pipelines, composable out of existing jobs + - Can tell it to run things concurrently + - Dataflows are nice for this, diamond validation came up as an example + - Issues: JSON due to DAG + - There is a draft PR in the repo which says let's just name all the jobs + - https://github.com/ipvm-wg/spec/pull/8 + - There might be a multi-value output + - This is static invocation, we know ahead of time this is the level of parallelism + - You might have an output which invokes more jobs +- Ideally, here's a UCAN, please do it + - There is already a place for authorizations + - In a UCAN, you have all the info you need to say please run this + - Sometimes people will add `invoke:true`, it's unclear if you should be able to delegate. + - Another approach is to put a thin wrapper, you can rip off the auth part and wrap a new one +- Irakli + - CID of WASM with data in, not invocation by CID, but invocation by mutable pointer? + - Brooklyn says ya we want multiple pointers? + - There is a before block in the invocation, do this effect as an input, then place that and that gets a name. + - How do we define interfaces?
+ - https://radu-matei.com/blog/intro-wasm-components/ might get into major interfaces soon + - Challenge of links outside of IPLD + - Need to have some native notion of "I'm reading 9TB data but I have to read in blocks" needs to read off of streams and emit streams + - Autocodec inside of IPVM usually makes sense + - Instead of baking in JSON and CBOR and protobuf and all these things, we just pass around WASM and say run this on these blocks of data, it's like ebpf, it's dynamic + - To get their webfilesystem to show in a gateway they had to do a bunch of hacks right now + - If you put it in IPVM then you can just reuse that as the distributed compute method +- What happens when a user creates one of these? How do we put syntactic sugar on top. + - How do we look at caching? +- Non-goal: Support WASI right off the bat + - WASM allows us to restrict what will be run with effects + - Putting all effects on outside then WASM always allows us to use + - They want to replace FaaS stuff with distributed compute **ALIGNED** + - Fission goals: Decentralized open functions as a service, small short deterministic data flow, simple image transformations, etc. +- Coming from erlang/elixir world + - What happens when there is an issue? How does the erlang supervision pattern apply, and failure cases / states for dags, how do we filter off into declarative specs based on locality + - Not sure if giving people the choice of supervisor pattern is the right choice + - We should come up with the secure by default (giving people the ability to modify supervision patterns has been a loss for erlang) + - With great power comes great responsibility, supervision is the correct concept, IPVM could be opinionated + - Affinity, this depends on that, defined failure modes with overlays?
+ - Look at k8s affinity and anti-affinity patterns + - Please go to another node + - WASM is a pure function with pure data (deterministic) + - People want things that look like objects or actors + - You can build that around this! + - It will look like eventual consistency or software transactional memory + - If you need locking then can use effects and so forth to land where you need +- IPVM we want an analysis step, I'm going to reorder, come up with the dependency tree, (then overlay failure modes possible?) + - Failure modes defined as effects? +- IPVM as a distributed scheduler + - Borrow VM and compiler tricks (if on a single threaded machine run that dispatch rest) + - Can look at "gas" costs (distributed compute cost, ref: Ethereum https://ethereum.org/en/developers/docs/gas/) +- Melanie: Microkernel + - From chat: There is always a minimal set of functions application code needs to communicate with the system; in our case we care about IPLD blocks. Is there a way to define affinity, so if a node has executed a command, loaded the IPFS in its cache, it's more likely to get the next job with same base data? Looks like it could be done outside Wasm.
I'd like to say IPVM host code is close-ish to a microkernel that ships with a kernel that can be pasted on modules when they get run to provide a better interface to the system calls + - Looking to have effectively this syscall style interface which you can reference for CID + - Works on filecoin VM, using WASM and microkernel approach has been useful +- Autocodec sounds similar to a WASM version of shim + - https://github.com/intel/dffml/pull/1273 + - here to replace dag-cbor, dag-cb, running over dags of different types + +--- + +Source: [docs/arch/alice/discussion/0023/reply_0044.md](https://github.com/intel/dffml/discussions/1369#discussioncomment-2778357) + +- https://hexdocs.pm/flow/Flow.html + - Elixir sends the function where the data is, so it takes care of scheduling based on locality + - Has comms at base layer + - OTP - erlang is a glorified supervision tree + - Can hook into this to issue commands to erlang VMs, gives you fault tolerance + - Can run this over web3 + - It can manage how it fails + - Backpressure is watching the infinite stream and it's monitoring and watching and detecting if it's oversubscribing the resources available + - People are using elixir with rust + - We deploy an elixir app + - We give a stream of data to the pipeline + - The producer plucks the head of the stream for the processes downstream to do their work and it will stitch the data back together.
It will partition the data in parallel and then + - If your process crashes, the supervision tree decides what to do (strategic plans) + - Model in elixir is crash, then supervisors break down + - Broadway is what is producing the events, flow is what + - Supervision tree could initiate fail fast patterns + - Discord uses elixir at the proxy and then rust for processing \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0081/index.md b/docs/discussions/alice_engineering_comms/0081/index.md new file mode 100644 index 0000000000..7bb05f43e5 --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0081/index.md @@ -0,0 +1,38 @@ + # 2022-11-09 Engineering Logs + +- Workstreams + - [ ] Knowledge graph sharing (basics) + - [ ] Provide queryable data via? JSON-LD static file serves to start? + - [ ] Implement initial dumps to chosen format via DFFML plugin patches for first integration. + - [ ] Query via GraphQL-LD (https://github.com/comunica/comunica) + - [ ] Data security from [SCITT](https://scitt.io) + - [ ] Identity from probably github.com/user.keys or keybase or QR code (HSM on phone) or other (overlayed?) methods. + - [ ] Distributed Execution + - [ ] Sandboxing + - [ ] Overlays (next phase parsers) for `policy.yml` to define what are acceptable sandboxing criteria (annotation to the chosen orchestrator, aka the sandboxing method / manager during execution). + - Overlays to parse more types of available sandboxing mechanisms and determine how much we like them or not. + - [ ] Reference implementation of content addressable compute contract execution using Decentralized Identifier, Verifiable Credential, and Decentralized Web Node based for layer 7/8?. + - [ ] Entity Analysis Trinity + - [ ] Static Analysis + - [ ] Need to understand dependencies + - [ ] Living Threat Models + - [ ] `THREATS.md` talks about and includes maintenance / lifecycle health (recommended community standards at minimum).
+ - Related: https://github.com/johnlwhiteman/living-threat-models/issues/1 + - [ ] Open Architecture + - [ ] Conceptual upleveling of dependencies into architecture via static overlay with architecture or overlay to synthesize. + - [ ] Feedback loop + - [ ] Stream of Consciousness + - #1315 + - https://github.com/w3c/websub + - https://youtu.be/B5kHx0rGkec + - 12 years, this has existed for 12 years, how am I just now finding out about this. + - we want this but callbacks supported as data flows / open architecture / use webrtc to call back. + - http://pubsubhubbub.appspot.com/ + - [ ] Implement Gatekeeper (`get_operations()`/`gather_inputs()`) + - [ ] Overlays / schema extensions for `policy.yml` which the prioritizer + understands how to leverage. + - [ ] Implement Prioritizer (`get_operations()`/`gather_inputs()`) + - [ ] Interfaces + - [ ] Keeping GitHub workflows up to date + - Usages of reusables are templated and updated on triggers from upstream, + the template, or in-context config modifications.
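The "JSON-LD static file serves to start" workstream above can be sketched with nothing but the standard library: dump a metric result as a JSON-LD document with an `@context`, so it can later be queried via GraphQL-LD/SPARQL. The vocabulary URL and field names here are invented placeholders, not an agreed-upon schema.

```python
# Hedged sketch: one metric result as a JSON-LD document. The @vocab URL
# and field names are hypothetical placeholders for illustration only.
import json

document = {
    "@context": {
        "@vocab": "https://example.org/alice/vocab#",
        "repo": "@id",
    },
    "repo": "https://github.com/intel/dffml",
    "metric": "recommended_community_standards",
    "value": 0.8,
}
# Written out as a static file, this is immediately consumable by
# JSON-LD-aware query engines such as Comunica
print(json.dumps(document, indent=2))
```

Serving a directory of such files over plain HTTP is the zero-infrastructure starting point the workstream describes; the DFFML plugin patches would just be the dump step.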
\ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0081/reply_0000.md b/docs/discussions/alice_engineering_comms/0081/reply_0000.md new file mode 100644 index 0000000000..f70044f3af --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0081/reply_0000.md @@ -0,0 +1,44 @@ +## 2022-11-09 @pdxjohnny Engineering Logs + +- https://github.com/w3c/websub/tree/master/implementation-reports + - https://github.com/marten-de-vries/Flask-WebSub + - Publisher client with Verifiable Credentials and Credential Manifests + - https://identity.foundation/credential-manifest/#credential-requirement-discovery + - A Verifiable Credential is then issued + - https://w3c-ccg.github.io/traceability-vocab/#BillOfLadingCredential + - https://w3c-ccg.github.io/traceability-vocab/openapi/components/schemas/credentials/BillOfLadingCredential.yml + - https://w3c-ccg.github.io/traceability-vocab/openapi/components/schemas/common/BillOfLading.yml + - QEMU, then firecracker, let's see how fast she'll roll +- https://hub.docker.com/r/exampleorg/uni-resolver-driver-did-example + - https://github.com/decentralized-identity/universal-resolver/pull/100/files + - https://github.com/decentralized-identity/universal-resolver/blob/main/docs/driver-development.md + - https://github.com/decentralized-identity/universal-resolver/blob/main/docker-compose.yml +- time is relative by locality + - clustering state of art / train of thought field it falls into grep twine threads +- https://github.com/ArtracID/ArtracID-DID-ART-Method + - Can we combine this with didme.me / SCITT? Art world has similar data provenance supply chain fundamentals of authenticity attestations. 
+ - `did:art:alice:` + - See "Architecting Alice: An Image" +- https://jena.apache.org/tutorials/sparql_data.html +- https://linkeddatafragments.org/software/#server +- https://github.com/benj-moreau/odmtp-tpf#sparql-queries-over-github-api +- TODO + - [ ] Modify BillOfLadingVC schema into something with less names and places and more DIDs. + - https://w3c-ccg.github.io/traceability-vocab/openapi/components/schemas/common/BillOfLading.yml + - [ ] Play with https://github.com/benj-moreau/odmtp-tpf#sparql-queries-over-github-api as backend and GraphQL-LD to query + - [2022-11-06 @pdxjohnny Engineering Logs](https://github.com/intel/dffml/discussions/1406?sort=new#discussioncomment-4068656) +- https://share.tube/videos/local + - https://joinpeertube.org/instances + - Does this work / exist for streaming? It seems more and more hybrid federated principles / web5 is looking like our web2 -> web5 bridge +- https://fission.codes/blog/webnative-app-template/ +- https://octodon.social/@cwebber/109307940669755800 +- https://www.w3.org/TR/activitypub/ + - This overview tutorial might be the right base for our POC of sharing data flow / knowledge graphs +- TODO + - [ ] https://www.w3.org/TR/activitypub/ (+DERP optionally maybe tunneled over webrtc) for stream of consciousness input network on "shared" exec + - [ ] Fix DFFML build pipelines and build a container to submit using HTTP service data flow endpoint config as DID resolver for `did:oa:` + - [ ] Let's maybe mess with https://github.com/mastodon/mastodon/blob/main/docker-compose.yml and see if we can start talking to Alice via that. + - [ ] Then we gradually add in DID, VC, etc.
to that + - [x] Install Linux on SSD + - [ ] Mouse's wheel is broken, need a new mouse + - It doesn't even do the drag to scroll anymore on fedora 36 \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0082/index.md b/docs/discussions/alice_engineering_comms/0082/index.md new file mode 100644 index 0000000000..286cc59c3b --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0082/index.md @@ -0,0 +1,6 @@ +# 2022-11-10 Engineering Logs + +- Tomorrow + - https://github.com/microsoft/scitt-api-emulator + - https://github.com/microsoft/scitt-ccf-ledger/blob/main/pyscitt/pyscitt/did.py + - https://atproto.com/guides/lexicon#schema-format \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0082/reply_0000.md b/docs/discussions/alice_engineering_comms/0082/reply_0000.md new file mode 100644 index 0000000000..0b8815dabd --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0082/reply_0000.md @@ -0,0 +1,64 @@ +## 2022-11-10 @pdxjohnny Engineering Logs + +- Current focus is around leveraging threat model and architecture information to engage in automated context informed proactive, reactive, or periodic (tech debt cleanup) mitigation activities. This is in pursuit of enabling decentralized gamification / continuous improvement of the security lifecycle / posture of open source projects. Enabling them to overlay their custom logic on upstream OSS analysis and policy evaluation will ideally increase helpfulness of static and dynamic analysis and automated remediation. 
+ - https://gist.github.com/pdxjohnny/07b8c7b4a9e05579921aa3cc8aed4866 + - https://github.com/intel/dffml/blob/alice/docs/tutorials/rolling_alice/0000_architecting_alice/README.md#rolling-alice-volume-0-introduction-and-context + - "Snapshot of System Context" here is content addressable execution + - [2022-11-08 @pdxjohnny Engineering Logs](https://github.com/intel/dffml/discussions/1406?sort=new#discussioncomment-4086860) +- https://github.com/TimothyClaeys/pycose +- https://medium.com/transmute-techtalk/neo4j-graph-data-science-with-verifiable-credential-data-98b806f2ad78 + - I saw this the other day and should have dug more +- https://w3c.github.io/sync-media-pub/ +- Poly repo pull model dev tooling rubric into issues into pull request review for inclusion in 2nd or 3rd party set (or any manifest or within any overlay, just change tracking but rubric assisted for distributed checking; see SCITT OpenSSF use case with mention of VEX/VDR/SBOM). +- https://github.com/decentralized-identity/credential-manifest/issues/125#issuecomment-1278620849 + - https://identity.foundation/presentation-exchange/#input-evaluation + - Similar to [2022-11-07 Engineering Logs](https://github.com/intel/dffml/discussions/1406?sort=new#discussioncomment-4073154) + - System Context + - Stumbled upon "valid system context" stuff (I/O must exist / be mapped) + - https://youtu.be/m0TO9IOqRfQ?t=3812&list=PLtzAOVTpO2jaHsS4o-sDzDyHEug-1KRbK + - https://github.com/intel/dffml/blob/1d4d6b2f817cd987ceff94b4984ce909b7aa3c7f/dffml/df/system_context/system_context.py#L101-L103 +- https://github.com/CycloneDX/bom-examples/tree/master/VEX/CISA-Use-Cases +- https://github.com/hadolint/hadolint +- https://github.com/sahlberg/fuse-nfs +- https://socialhub.activitypub.rocks/pub/guide-for-new-activitypub-implementers +- Let's just try implementing ATP + - https://atproto.com/guides/lexicon#schema-format + - ATP + SCITT! APT + SCITT!
**APT + SCITT!** +- XRPC looks similar to IPVM with effects + - https://atproto.com/specs/xrpc +- (websub + OA) + ATP (Data repos) + - SCITT becomes identity help (notary) and format of message encapsulated in ATP; in this case trust chains are established via context / content analysis of ATP message (maybe contains a jwk) +- https://github.com/w3c/activitystreams/blob/master/implementation-reports/activipy.md +- https://github.com/microsoft/unilm + - https://github.com/microsoft/unilm/tree/master/edgelm + - > We evaluate EdgeFormer on the benchmarks of three popular seq2seq tasks: CoNLL-14 for GEC, XSUM for Abstractive Summarization, and SQuAD-NQG for Question Generation. + - https://github.com/microsoft/unilm/tree/master/adalm + - https://github.com/microsoft/unilm/tree/master/layoutlmv3 + - Manifest->screenshot +- https://github.com/w3c/activitystreams/blob/master/implementation-reports/annotation-protocol-server.md + - Inventory-esque #1207 +- `curl --url-query name@file https://example.com` + - https://daniel.haxx.se/blog/2022/11/10/append-data-to-the-url-query/ +- https://activipy.readthedocs.io/en/latest/about.html#what-is-activitystreams-how-might-it-help-me + - > And simple is good, because let’s face it, most users of most web application APIs are like poor Billy Scripter, a kid who has some scripting language like Ruby or Python or Javascript and some JSON parser in a toolbox and that’s about it. Billy Scripter knows how to parse JSON pulled down from some endpoint, and that’s about all he knows how to do. Poor Billy Scripter! But it’s okay, because ActivityStreams is simple enough that Billy can make it by.
And because the [ActivityStreams Core](http://www.w3.org/TR/activitystreams-core/) serialization specifies that the [ActivityStreams Vocabulary](http://www.w3.org/TR/activitystreams-vocabulary/) is always implied and that those terms must always be available, Billy will always know what a [Like](http://www.w3.org/TR/activitystreams-vocabulary/#dfn-like) object or a [Note](http://www.w3.org/TR/activitystreams-vocabulary/#dfn-note) means. Horray for Billy! +- TODO + - [ ] John, it's VDR and VEX, don't overcomplicate it, you can reference via DID later, stop getting distracted by shiny DIDs + - Remember it was always the initial plan to use this as the stream interface, maybe add websub + - https://docs.oasis-open.org/csaf/csaf/v2.0/csaf-v2.0.html + - https://www.oasis-open.org/committees/tc_home.php?wg_abbrev=csaf + - CSAF is the overarching framework VEX fits into + - The SBOM almost acts like the `@context` for JSON-LD + - Do what you know, don't forget about `cve-bin-tool`, maybe find notes on prototyping that flow, maybe we should just do that based on binary analysis of project. + - Then use learnings to do Python packages / shouldi deptree + - Okay I forgot that might have also been the original plan, stick with the plan. + - [ ] VEX via simple HTTP service https://github.com/CycloneDX/bom-examples/tree/master/VEX/CISA-Use-Cases + - Future + - [ ] Updates via websub +- Future + - [ ] websub stream of consciousness to facilitate fetching new VEX/VDR + - [ ] websub over DIDComm callback exec via open architecture + - [ ] VEX/VDR/SBOM/SCITT via ATP + - [ ] https://github.com/sahlberg/fuse-nfs userspace (GitHub Actions) proxy + over DERP to NFS spun up via dispatch (communicate across multiple jobs).
+ - [ ] Check for updates to credential manifest thread: https://github.com/decentralized-identity/credential-manifest/issues/125#issuecomment-1310728595 + - [ ] [2022-11-10 SCITT API Emulator Spin Up](https://github.com/intel/dffml/discussions/1406?sort=new#discussioncomment-4110695) \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0082/reply_0001.md b/docs/discussions/alice_engineering_comms/0082/reply_0001.md new file mode 100644 index 0000000000..eea04900c4 --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0082/reply_0001.md @@ -0,0 +1,13 @@ +## 2022-11-10 SCITT Technical Meeting + +- https://armltd.zoom.us/j/95609091197?pwd=V3NndVF1WGZzNUJDUGUzcEVWckxOdz09 +- Software use case is one of many, came up many times in discussion in London. +- Lot of work got done over the weekend during hackathon. +- SCITT API emulator +- https://github.com/microsoft/scitt-api-emulator + - Also running confidential consortium ledger + - https://github.com/microsoft/scitt-ccf-ledger + - https://github.com/microsoft/scitt-ccf-ledger/tree/main/demo/github + - https://github.com/microsoft/scitt-ccf-ledger/blob/main/pyscitt/pyscitt/did.py + +![provenance_for_the_chaos_God](https://user-images.githubusercontent.com/5950433/201148302-325c58a6-166d-494b-b162-5feaea557d87.jpg) \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0082/reply_0002.md b/docs/discussions/alice_engineering_comms/0082/reply_0002.md new file mode 100644 index 0000000000..58bd482eb6 --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0082/reply_0002.md @@ -0,0 +1,87 @@ +## 2022-11-10 SCITT API Emulator Spin Up + +[The Alice thread continues!](https://mastodon.social/@pdxjohnny/109320563491316354) +We take one step further towards decentralization as we federate our way away from Twitter.
+ +Today we're playing with SCITT and ATProto: https://github.com/intel/dffml/discussions/1406?sort=new#discussioncomment-4104302 + +Prev: https://twitter.com/pdxjohnny/status/1585488415864557568 + +### SCITT (virtual) CCF Spin Up + +We have liftoff with virtual confidential ledger (not really using SGX). + +- https://github.com/microsoft/scitt-ccf-ledger +- https://github.com/microsoft/scitt-ccf-ledger/tree/main/demo/github +- https://github.com/microsoft/scitt-ccf-ledger/blob/main/pyscitt/pyscitt/did.py +- https://asciinema.org/a/536774 + +```console +$ unxz -d - < ~/asciinema/DESKTOP-3LLKECP-rec-2022-11-10T08:52:20-08:00.json.xz | tee /tmp/scitt-ccf-ledger.json +$ cat /tmp/scitt-ccf-ledger.json | python -m asciinema play -s 20 - +$ python -m asciinema upload /tmp/scitt-ccf-ledger.json +``` + +[![asciicast](https://asciinema.org/a/536709.svg)](https://asciinema.org/a/536709) + +### 2022-11-14 SCITT API Emulator Spin Up + +- References + - https://github.com/microsoft/scitt-api-emulator/blob/2502eda6b99936a7b28792ca3fd6ba9fbf97e7ba/README.md + +```console +$ git clone https://github.com/microsoft/scitt-api-emulator +$ cd scitt-api-emulator +$ git ls-files | xargs -I '{}' -- sed -i 's/python3.8/python3.10/g' '{}' +$ python -m rich.markdown README.md +$ ./scitt-emulator.sh server --workspace workspace/ --tree-alg CCF +Setting up Python virtual environment. +[notice] A new release of pip available: 22.2.2 -> 22.3.1 +[notice] To update, run: pip install --upgrade pip +Service private key written to workspace/storage/service_private_key.pem +Service parameters written to workspace/service_parameters.json +Service parameters: workspace/service_parameters.json + * Serving Flask app 'scitt_emulator.server' + * Debug mode: on +WARNING: This is a development server. Do not use it in a production deployment. Use a production WSGI server instead. 
+ * Running on all addresses (0.0.0.0) + * Running on http://127.0.0.1:8000 + * Running on http://192.168.1.115:8000 +Press CTRL+C to quit + * Restarting with stat +Service parameters: workspace/service_parameters.json + * Debugger is active! + * Debugger PIN: 000-000-000 +``` + +- Ran commands from `README.md` + +```console +$ ./scitt-emulator.sh server --workspace workspace/ --tree-alg CCF +$ ./scitt-emulator.sh client create-claim --issuer did:web:example.com --content-type application/json --payload '{"sun": "yellow"}' --out claim.cose +$ ./scitt-emulator.sh client submit-claim --claim claim.cose --out claim.receipt.cbor +$ ./scitt-emulator.sh client retrieve-claim --entry-id 1 --out claim.cose +$ ./scitt-emulator.sh client retrieve-receipt --entry-id 1 --out receipt.cbor +$ ./scitt-emulator.sh client verify-receipt --claim claim.cose --receipt claim.receipt.cbor --service-parameters workspace/service_parameters.json +``` + +- It works! + +> The `verify-receipt` command verifies a SCITT receipt given a SCITT claim and a service parameters file. This command can be used to verify receipts generated by other implementations. +> +> The `service_parameters.json` file gets created when starting a service using `./scitt-emulator.sh server`. The format of this file is not standardized and is currently: +> +> ```json +> { +> "serviceId": "emulator", +> "treeAlgorithm": "CCF", +> "signatureAlgorithm": "ES256", +> "serviceCertificate": "-----BEGIN CERTIFICATE-----..." +> } +> ``` + +- We upload `alice shouldi contribute` dataflow to SCITT and get a receipt! + - Friends, today is a great day. :railway_track: + - Next stop, serialization / federation with Alice / Open Architecture serialization data flow as SCITT service. 
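The `service_parameters.json` shape quoted above from the emulator README can be sanity-checked with a short script. This is a hypothetical helper, not part of the emulator; only the four key names come from the quoted example, everything else here is illustrative.

```python
# Hypothetical helper (not part of the SCITT API emulator): check that a
# loaded service_parameters.json dict has the keys quoted from the README.
EXPECTED_KEYS = {
    "serviceId",
    "treeAlgorithm",
    "signatureAlgorithm",
    "serviceCertificate",
}


def missing_service_parameter_keys(params: dict) -> list:
    """Return a sorted list of expected keys absent from params."""
    return sorted(EXPECTED_KEYS - set(params))


# Example matching the README excerpt above.
example = {
    "serviceId": "emulator",
    "treeAlgorithm": "CCF",
    "signatureAlgorithm": "ES256",
    "serviceCertificate": "-----BEGIN CERTIFICATE-----...",
}
print(missing_service_parameter_keys(example))  # []
print(missing_service_parameter_keys({"serviceId": "emulator"}))
```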
+ +[![asciicast](https://asciinema.org/a/537643.svg)](https://asciinema.org/a/537643) \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0083/index.md b/docs/discussions/alice_engineering_comms/0083/index.md new file mode 100644 index 0000000000..d280f57d52 --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0083/index.md @@ -0,0 +1 @@ +# 2022-11-11 Engineering Logs \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0083/reply_0000.md b/docs/discussions/alice_engineering_comms/0083/reply_0000.md new file mode 100644 index 0000000000..a10b252b30 --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0083/reply_0000.md @@ -0,0 +1,121 @@ +## 2022-11-11 @pdxjohnny Engineering Logs + +- https://fluxcd.io/flux/guides/image-update/ + - Possible `FROM` rebuild chain helper +- https://github.com/Xubuntu/lightdm-gtk-greeter-settings + - https://github.com/Xubuntu/lightdm-gtk-greeter-settings/issues/4#issuecomment-1312059288 + - Same on Fedora 37 + - Root cause was a permissions issue: the file needs to be world readable, and + all parent directories need to be world readable as + well. Moved the file from `/root` to `/opt/wallpapers/` and ensured + permissions were correct.
+ - ![reproduced-on-fedora-37-launchpad-lightdm-gtk-greeter-settings-bug-1593986](https://user-images.githubusercontent.com/5950433/201404906-c7f5d800-a803-4005-bfbf-129c2f45a096.png) + +```console +$ sudo mkdir /opt/wallpapers/ +$ sudo stat /opt/wallpapers/ + File: /opt/wallpapers/ + Size: 27 Blocks: 0 IO Block: 4096 directory +Device: 253,1 Inode: 9450093 Links: 2 +Access: (0755/drwxr-xr-x) Uid: ( 0/ root) Gid: ( 0/ root) +Context: unconfined_u:object_r:usr_t:s0 +Access: 2022-11-11 10:30:55.826849997 -0800 +Modify: 2022-11-11 10:30:52.989865945 -0800 +Change: 2022-11-11 10:30:52.989865945 -0800 + Birth: 2022-11-11 10:30:32.291982299 -0800 +$ sudo cp /root/wallpaper.jpg /opt/wallpapers/ +$ file /opt/wallpapers/wallpaper.jpg +/opt/wallpapers/wallpaper.jpg: JPEG image data, JFIF standard 1.01, aspect ratio, density 218x218, segment length 16, Exif Standard: [TIFF image data, big-endian, direntries=7, orientation=upper-left, xresolution=98, yresolution=106, resolutionunit=2, software=Pixelmator Pro 2.1.3, datetime=2013:07:16 13:17:42], baseline, precision 8, 6016x3384, components 3 +$ stat /opt/wallpapers/wallpaper.jpg + File: /opt/wallpapers/wallpaper.jpg + Size: 2187975 Blocks: 4280 IO Block: 4096 regular file +Device: 253,1 Inode: 9752102 Links: 1 +Access: (0644/-rw-r--r--) Uid: ( 0/ root) Gid: ( 0/ root) +Context: unconfined_u:object_r:usr_t:s0 +Access: 2022-11-11 10:31:06.320791009 -0800 +Modify: 2022-11-11 10:30:52.989865945 -0800 +Change: 2022-11-11 10:30:52.989865945 -0800 + Birth: 2022-11-11 10:30:52.989865945 -0800 +``` + +- Resize root LUKS partition on new fedora install. 
+ - https://www.golinuxcloud.com/resize-luks-partition-shrink-extend-decrypt/#Resize_LUKS_Partition + +```console +$ df -h +Filesystem Size Used Avail Use% Mounted on +devtmpfs 4.0M 0 4.0M 0% /dev +tmpfs 7.8G 101M 7.7G 2% /dev/shm +tmpfs 3.1G 1.9M 3.1G 1% /run +/dev/mapper/fedora_fedora-root 15G 15G 754M 96% / +tmpfs 7.8G 3.6M 7.8G 1% /tmp +/dev/sdc3 1.1G 296M 751M 29% /boot +/dev/sdc2 575M 6.2M 569M 2% /boot/efi +tmpfs 1.6G 168K 1.6G 1% /run/user/1000 +$ sudo blkid -t TYPE=crypto_LUKS -o device +/dev/sdc4 +$ lsblk +NAME MAJ:MIN RM SIZE RO TYPE MOUNTPOINTS +sdc 8:32 0 232.9G 0 disk +├─sdc1 8:33 0 16M 0 part +├─sdc2 8:34 0 576M 0 part /boot/efi +├─sdc3 8:35 0 1G 0 part /boot +└─sdc4 8:36 0 231.2G 0 part + └─luks-18013279-e995-45bc-bcb8-83dda718da78 253:0 0 231.2G 0 crypt + └─fedora_fedora-root 253:1 0 15G 0 lvm / +zram0 252:0 0 8G 0 disk [SWAP] +$ sudo cryptsetup status fedora_fedora-root +/dev/mapper/fedora_fedora-root is active and is in use. + type: n/a +$ sudo cryptsetup status luks-18013279-e995-45bc-bcb8-83dda718da78 +/dev/mapper/luks-18013279-e995-45bc-bcb8-83dda718da78 is active and is in use. + type: LUKS2 + cipher: aes-xts-plain64 + keysize: 512 bits + key location: keyring + device: /dev/sdc4 + sector size: 512 + offset: 32768 sectors + size: 484860697 sectors + mode: read/write + flags: discards +``` + +- Reboot to live image of fedora server 36 + - Run `lvextend` and `xfs_growfs` on `/dev/mapper/fedora_fedora-root`, grow + by unused space size, around +216.1G. + +```console +$ lsblk +$ cryptsetup luksOpen /dev/sdc4 luks +$ cryptsetup status luks +$ lvextend -L +216.1G /dev/mapper/fedora_fedora-root +$ mount /dev/mapper/fedora_fedora-root /mnt +$ xfs_growfs /dev/mapper/fedora_fedora-root +``` + +- Boot and check new disk space, 216G available. 
+ +```console +$ df -h +Filesystem Size Used Avail Use% Mounted on +devtmpfs 4.0M 0 4.0M 0% /dev +tmpfs 7.8G 93M 7.7G 2% /dev/shm +tmpfs 3.1G 1.9M 3.1G 1% /run +/dev/mapper/fedora_fedora-root 232G 16G 216G 7% / +tmpfs 7.8G 3.5M 7.8G 1% /tmp +/dev/sdc3 1.1G 296M 751M 29% /boot +/dev/sdc2 575M 6.2M 569M 2% /boot/efi +tmpfs 1.6G 168K 1.6G 1% /run/user/1000 +``` + +- https://github.com/decentralized-identity/credential-manifest/blob/main/spec/spec.md + - https://github.com/decentralized-identity/credential-manifest/pull/131/files#diff-c4795c497b83a8c03e33535caf0fb0e1512cecd8cb448f62467326277c152afeR379 + - https://github.com/decentralized-identity/credential-manifest/blob/main/spec/spec.md#credential-response + - > // NOTE: VP, OIDC, DIDComm, or CHAPI outer wrapper properties would be at outer layer +- https://github.com/decentralized-identity/credential-manifest/blob/main/test/credential-manifest/test.js +- TODO + - [x] Resize LUKS fedora root to use full SSD attached via USB 3.1 :P it's fast! + - [ ] "We need to consider automation too to make this work in the CI/CD pipeline. We use the open-source Data Flow Facilitator for Machine Learning (DFFML) framework to establish a bidirectional data bridge between the LTM and source code. When a new pull request is created, an audit-like scan is initiated to check to see if the LTM needs to be updated. For example, if a scan detects that new cryptography has been added to the code, but the existing LTM doesn’t know about it, then a warning is triggered. Project teams can triage the issue to determine whether it is a false positive or not, just like source code scans." 
[John L Whiteman] + - [Rolling Alice: Progress Report 6: Living Threat Models Are Better Than Dead Threat Models](https://gist.github.com/pdxjohnny/07b8c7b4a9e05579921aa3cc8aed4866#file-rolling_alice_progress_report_0006_living_threat_models_are_better_than_dead_threat_models-md) + - [ ] Investigate https://github.com/BishopFox/sliver for comms \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0084/index.md b/docs/discussions/alice_engineering_comms/0084/index.md new file mode 100644 index 0000000000..c5e4c81d35 --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0084/index.md @@ -0,0 +1 @@ +# 2022-11-12 Engineering Logs \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0084/reply_0000.md b/docs/discussions/alice_engineering_comms/0084/reply_0000.md new file mode 100644 index 0000000000..35bfffa72e --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0084/reply_0000.md @@ -0,0 +1,39 @@ +## 2022-11-12 @pdxjohnny Engineering Logs + +- 🛼 security 🤔 + - Twitter conversation with Dan resulted only in roller coaster bogie (boh-gee) lock idea. + - Roller skate security play on words? + - Roll’r fast, roll’r tight, roll’r clean, secure rolling releases with Alice. + - Content addressable with context aware caching + - See recent + - Minimal attack surface + - See unikernel in thread + - No vulns or policy violations + - Development future aligned with strategic principles + - *Gif of Alice on roller skates throwing a bowling ball which is a software vuln, strike, she frontflips throwing knife style throws the pins into pull requests. We zoom out and see her just doing this over and over again around the Entity Analysis Trinity. Intent/LTM is where the throwing board is. Bowling alley is static analysis and the end of the bowling alley where she frontflips over (through hoop of CI/CD fire?)
is where she picks up the pins and throws them as pull request (titles and numbers maybe, pulls/1401 style maybe?) knives into the board at the top which is the LTM and codebase. Then from top, LTM to static analysis where bowling alley starts, she's in the lab, cooking up the vuln or maybe out looking for it. Or maybe refactoring after pull requests!* +- https://arstechnica.com/gadgets/2022/10/everything-we-know-about-the-white-houses-iot-security-labeling-effort/ +- https://github.com/shirayu/whispering + - Couldn’t make it work + +```console +$ sudo dnf install -y portaudio-devel +$ pip install -U git+https://github.com/shirayu/whispering.git@v0.6.4 +$ whispering --language en --model medium +Using cache found in /home/pdxjohnny/.cache/torch/hub/snakers4_silero-vad_master +[2022-11-14 07:23:58,140] cli.transcribe_from_mic:56 INFO -> Ready to transcribe +Analyzing/home/pdxjohnny/.local/lib/python3.10/site-packages/torch/nn/modules/module.py:1130: UserWarning: operator() profile_node %668 : int[] = prim::profile_ivalue(%666) + does not have profile information (Triggered internally at ../torch/csrc/jit/codegen/cuda/graph_fuser.cpp:104.)
+ return forward_call(*input, **kwargs) +``` + +```console +$ set -x; for file in $(ls If*.m4a); do python -uc 'import sys, whisper; print(whisper.load_model("medium.en").transcribe(sys.argv[-1])["text"])' "${file}" 2>&1 | tee "${file}.log"; done +``` + +- TODO + - [ ] https://github.com/CycloneDX/bom-examples/tree/master/OBOM/Example-1-Decoupled + - this as system context inputs for validity check + - [ ] VDR + - [ ] VEX + - Payload (system context, see did as service endpoint architecting alice streams) goes in `detail` + - https://github.com/CycloneDX/bom-examples/blob/83248cbf7cf0d915acf0d50b12bac75b50ad9081/VEX/Use-Cases/Case-1/vex.json#L47 \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0084/reply_0001.md b/docs/discussions/alice_engineering_comms/0084/reply_0001.md new file mode 100644 index 0000000000..a22be29435 --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0084/reply_0001.md @@ -0,0 +1,20 @@ +# If You Give A Python A Computer + +Moved to: https://github.com/intel/dffml/blob/alice/docs/tutorials/rolling_alice/0000_architecting_alice/0002_shes_ariving_when.md + +If you give a Python a computer, they're going to want to write a script. If they want to write a script, they're probably going to want to call another script. If they're going to call a script, they're going to want to read the output. If they read the output, they're going to want to write it somewhere else. So if they write the script, that's the first operation. If they read the output, now that gets into the importance of the... Okay. If they write a script, that becomes the operation. Now they want to execute another command. Okay. So that's another operation. Now, if... Now, reading the output. So... Now, reading the output, it comes in an event-based way. Because you need to wait for the return code, and you want to read line by line, and you want to do all that at the same time. Right. 
So you're going to end up with, you know, what amounts to the... execute some process, but being run in a dataflow will have to show that. And then, okay... pass that script. They're going to want to write it. Call another script. If they're going to call a script, they're going to want to read the output. Okay. If they're going to read the output, they're probably going to want to do something with the output. Or they're probably going to want to write it somewhere else. If they're going to want to write it somewhere else, that means that they need to use the network. If they're going to use the network, they should probably be using asyncio. Okay. So, now what happens after you've written it somewhere else? Okay. Probably running something remotely. Okay, what do you usually do? Yeah, you're going to want to do something remote. You're going to want to write it somewhere else. Okay, well, where are you going to... If you want to write it somewhere else, you probably want a web service to receive it. You probably want to write something to receive it. Yeah, you want to... If you're going to write it somewhere else, you probably need to write something to receive it somewhere else. Okay. And now that's the first time where we've got... The first operation is the script. It executes the subprocess, which is in the same machine, and then it wants to write it somewhere else. So now you can have the implementation of the script is on one machine, and now we can show how the input moves to the other machine using the execution environment. Okay, it's going to want to write something to receive it. Now, if you write something to receive... What is he going to want to do? He's probably going to want to run that on another computer. Okay. He's probably going to want to run it on another computer. And when he runs it on another computer, he's probably going to need to deploy... He's probably going to... 
If he wants to run it on another computer, then he's going to need to build it. + +He's gonna want to run on our computer. He's going to want to build it. No, he's going to want to build it. He's going to build it. And then this is where we get into something where it's like, uh, synthesis. Where we can basically say, hey, so we're sending from... Okay, so basically we're running the script on one machine. We're sending to the other machine. So, the other machine, and we send it to the other machine, we're doing that via probably an operation implementation network, which exposes the open API. Or which hits its server, which exposes the open API. So then we need to go synthesize the server that builds the open API. So, the implementation is seen by the side. The implementation is seen by the side that runs the script is the open API client. Now, when you take the same data flow and you render it like you can take the same data, so you can be executing the data flow, or you can take the data flow and you could do like a build, essentially. And when you do the build, the implementation, you see, yeah, when you do the build, it's essentially, it's essentially using an orchestrator to do the build. Is it using an orchestrator to use the build? I think no, I think it might just be like a run data flow. And the run data flow handles putting things together. So it might see this operation that says, you know, what does the operation say? It says it's to receive, you know, receive client, receive something operation. Right. OK. And I really like it's the log, you know, it's the write to log file. OK, it's right to log file. Right. Write to file. No, update Web page. Update Web page. OK. And then we can see a Web page that just shows the updated value. All right. So. OK. And then we can just run the output and pull and refresh the Web page. OK, so. OK, so. OK, so now you're going to synthesize this thing. So how would you do that? 
Basically, ideally, you would describe it as a data flow or you would describe it. Maybe you describe it as a what you're going to describe as a data flow. So how do you describe it? So maybe your run data flow here is something like. Some kind of. You know, it's a it's a synthesis run data flow. Very cool. So it's some kind of synthesis run data flow instead of instead of actually it's it's. OK, so how are you swapping that out? Well, you're swapping out the operation implementation when you do the execution. So you swap out the operation. So you swap out. OK, well. So. Do the execution when you do the execution. So you have essentially have multiple. Multiple. OK, so you have multiple. You might actually select a different. So you have selected the operation implementation for you essentially have like a client and a server. And so somewhere in the data flow, you say for client. For client. Then choose the operation implementation network, like each operation instance has a preferred implementation network for each deployment method. And so when you synthesize your server, you say my preferred method is OK. You say my preferred method is essentially the synthesize thing. And then. Yeah, it's like a build. Your preferred method is actually build. And what even like does it even matter that you have the inputs there? No, probably not, because you're probably going to say you're probably going to say pass data flow to the build, which you're probably going to pass the data flow to the build, which will. You're going to pass the data flow to the build in the builds config, which means that you need to configs specific to deployment as well. And so you need configs that are specific to deployment as well. So. Can fix specific to. Yeah. OK, so then. So you can fix this specific to. So you need to config specific for build and I can fix specific for deploy. OK, so in the build. + +In the build specific configs you have a data flow. 
In that data flow it probably contains, for example, say we were doing this. Say we wanted to build a fast API. We're going to build this fast API thing. We're actually going to synthesize one of the roots. We'll synthesize one of the roots. We'll output a Python file that runs the data flow where the input is one of those model things, and the model will take the inputs as shown from the parent data flow, whatever the inputs to the operation were. Basically, you run data flow with the build option set. With the build, your target is build. So you run data flow, your target is build. Now your operation implementation knows what its inputs are. It's going to take those inputs and their definitions. Because you're saying, I am a builder, you're probably going to inject the inputs to your own. You're probably going to take the operation that you're building for and you're going to add it as an input to the network itself, like the operation itself, so that then the data flow that does the build would say, because you're basically saying the build, you're executing run data flow. On the client, you're going to end up with an operation implementation which calls the open API spec, like the open API spec endpoint. You're going to end up with an operation implementation that calls the open API spec. When you do the build, the build says, like server build for example, you would pass server build says, prefer an operation implementation. When you run data flow server build, the other one is a NOP. Essentially, you NOP the client stuff. You have NOPs, the client specific implementations are NOPs. The client specific implementations are NOPs and you end up doing actually run data flow without any inputs. It's kicked off whenever the data flow went on initialization, whatever that flow was, whatever that code path was through the orchestrator. It kicks off the operations that don't have any inputs. 
It'll kick off this operation because this operation is actually run data flow and the original one that was running the script is actually a NOP in the server build version. It's run data flow and the script is NOP. Now we need to build, but it's run data flow. If we run data flow, we're going to say add the operation itself as the... We might need a specific version of run data flow for this because I don't know if this is something that we would add in the config to run data flow. It seems a little bit specific to a build process type of run, but we might be a separate operation is what I mean. Basically, what you end up is not really an implementation specific over preference. I think that probably comes somewhere else. You probably have an operation implementation preference for each operation at run time where you would prefer... You have two things. You basically have deployment specific overrides. You have a deployment specific override and then you have a at execution time deployment implementation preference per deployment. You run the build. It adds the input to the network and it specifies and you've given it the data flow. The data flow you've given it says write me a... Write out a file that is an open API server or a fast API server. It writes out the file that's a fast API server. It uses the input definitions to create the model and the result is a built fast API app. Now you have the deploy phase and then you might take that and you might turn it into a container. Now you would have the deploy data flow. You would run the deploy step on the same data flow and you would say... You would run the deploy step on the same data flow and it would then take the built application and you would run the deploy phase on the same data flow and it would take the built application. Then if you give up Python... If he wanted to write the... If he wanted to read the logs then he wanted to write the logs. 
If he wanted to write the logs he's probably going to want to write them to his server. If he wants to write the logs that's where we say the part about AsyncIO. If he wants to write them to his server then now we need to figure out, okay, how is he going to write his server? What is his server? That's where we get into the synthesis and the build version of the data flow. Now if he's going to want to write the summary he's probably going to need a server. If he's going to need a server he's going to write a service... Yeah, he's going to need a service. If he's going to write a service he's going to need to deploy a service. Now we get into deployment. Now we need to think somehow about the artifacts that were created in the build process. How do we communicate the fact that there are outputs from one stage? Because it almost becomes... It is a different data flow really. Where are we picking up those outputs? That stuff is probably all in config. We probably have... Yeah, so we've probably configured... We've probably configured... Yeah, that stuff is all in config. For example, those data flows, the build data flow, the one that we're supplying to the run data flow when we override for the build phase, which means configs needs to have an option for deployment specific stuff. When we do that for build phase we're going to write out... The data flow will take in its configuration, the places where it writes things. Then the deployment data flow will just configure with the same credentials or the same whatever or the same output path so that it understands. We're not facilitating... Do we need to facilitate that? If you wanted to do that you would write one data flow that would call both of them and then pass the outputs between them. Yeah, you could have a data flow that does build and deploy. You could run the build stage and you can run the deploy stage or you could have a build and deploy data flow. 
The build and deploy data flow would say: okay, run the build data flow. When you run the build data flow you need to tell it where you're building; you need to configure it. Does that need to be configuration or inputs? Most runtime things are inputs; configuration is locked at data flow definition. I would say that you can override that data flow. For example, you wanted to build this server and it comes out as a container, and now you want to push that container somewhere. You built it and now you want to push it. Say you do the build and it's entirely in memory somehow; then you push an in-memory image as an input to an operation which does something with it, probably pushing it to a registry. You could potentially just swap out that operation, and in that case the registry is probably part of its config. Remember, we can take anything that's configured and make it an input, and we can take anything that's an input and make it configured, if you wanted to. You could have re-scoping operations: essentially that little operation we talked about that can wrap any operation and change the scope of its inputs. Okay. Now, deployment artifacts versus build artifacts: where do you separate that? Is the build build-and-push, or just build? If it's just build, then you end up with this image and you're like: what do I do with the image? You probably need to push it somewhere, so from that perspective you need an operation in the data flow that's going to do that push. Now, how do you communicate where it was pushed to the other thing? Well, when you run that data flow, you either need to have configured the operations or you need to be passing the locations as inputs. That's really up to you.
If you configure them, then you can always wipe that out with an override and make it an input instead: operations that used to take it as config are overridden to take it as an input. Now that you have that, okay, so you've built and pushed, then you run the deploy. The deploy is a data flow that's just run data flow stage build, run data flow stage deploy, and then that would be build and deploy. If you give a dev Python, and he wants to write a service, he's going to want to deploy a service. If he's going to want to deploy a service, it's the same flow as the build; you just show him it again. Now if he's going to deploy a service, he's going to want to deploy it from his CI/CD. And if he's going to want CI/CD, do we go into the whole build server process? I'm not sure. Maybe.
+
+And if he wants to deploy a service, he's going to want some logs. And if he wants some logs... oh, wait, no, we can't go there yet. We have to finish out. If he's going to deploy a service, he's going to want some logs. Okay. And then we talk about the integration with the open lineage stuff.
+
+Scratch the logs for now. Alright, in that order: if he's gonna write a service, he's going to have to configure it. And if he's gonna deploy a service, he's going to need some secrets. Okay, and now we talk about the whole secrets thing and the input transforms, that whole thing. We'll talk about that whole thing. And if he's going to manage his secrets, he's going to need to do his security process.
If he's going to do his security process, okay, and when he does his security process, here's the auditability and the visibility throughout the entire thing, in an automated way. And if he's going to do his security process, then he's going to need some logs. And if he's going to need some logs, then we do the whole open lineage thing, right. And if he's gonna have some logs, then he's gonna look for bugs. What is he gonna do with the logs? He's gonna look for bugs in the logs. Okay, but how do we get into the thing where you have the real-time updates throughout the entire thing? So: security, and then he's got logs, and with the logs we get into the open lineage thing. We can look at the data flow as it's running and do analysis on what is happening as it's happening, and we can potentially even report that information all the way back through the client. Have we covered everything? I think we have. Perfect.
+
+Okay, and if you're gonna find some bugs, you're gonna fix some bugs. If you're gonna fix some bugs, you're gonna want some CI/CD. And if you want some CI/CD, then we tell the story about kube control (kubectl) and so on. And I think we wrap all the way back around the whole circle of development. I think we've covered every single part, unless we have not. What else might we need to cover? We covered building the app, deploying the app across platforms, running it across platforms, events, logging, bugs, bug fixing, security. Alright, okay.
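The earlier point about moving values between config and inputs could be sketched as follows — a hypothetical plain-Python illustration (the operation name and registry URLs are made up, and this is not the DFFML API): anything locked in as config can be re-scoped to arrive as a runtime input, and vice versa.

```python
from functools import partial

def push_image(image: bytes, *, registry: str) -> str:
    # An operation whose registry is configuration: fixed when the
    # deployment-specific data flow is assembled.
    return f"pushed {len(image)} bytes to {registry}"

# Deployment-specific override: lock the registry in as config.
push_to_prod = partial(push_image, registry="registry.example.com")

def push_image_rescoped(image: bytes, registry: str) -> str:
    # Re-scoping wrapper: the same registry value now arrives as a
    # runtime input instead of configuration.
    return push_image(image, registry=registry)

print(push_to_prod(b"\x00" * 4))
print(push_image_rescoped(b"\x00" * 4, "localhost:5000"))
```

Swapping out the push operation for a different deployment then becomes either a config override or a different input, whichever scope the flow author chose to expose.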
+
+
+So, if you synthesize a data flow, you may lose things like event emissions of inputs between operations. So we need a way to say what events are expected: the data flow should declare what events it's expecting to yield as an allow list.
+
+Added (2022-11-14): If you give Monty Python a computer, they’ll want to search for the Holy Grail. If they want to search for the Holy Grail, they might find the system context. If they find the system context, they’ll know that the Holy Grail is the Trinity is the system context: the upstream, the overlay, and the orchestrator. ;)
\ No newline at end of file
diff --git a/docs/discussions/alice_engineering_comms/0085/index.md b/docs/discussions/alice_engineering_comms/0085/index.md
new file mode 100644
index 0000000000..5baf0bf7ba
--- /dev/null
+++ b/docs/discussions/alice_engineering_comms/0085/index.md
@@ -0,0 +1 @@
+# 2022-11-13 Engineering Logs
\ No newline at end of file
diff --git a/docs/discussions/alice_engineering_comms/0085/reply_0000.md b/docs/discussions/alice_engineering_comms/0085/reply_0000.md
new file mode 100644
index 0000000000..71b5351e98
--- /dev/null
+++ b/docs/discussions/alice_engineering_comms/0085/reply_0000.md
@@ -0,0 +1,20 @@
+## 2022-11-13 @pdxjohnny Engineering Logs
+
+> - The following mermaid diagram became: https://github.com/intel/dffml/commit/fbcbc86b5c52932bccf4cd6321f4e79f60ad3023
+> - https://github.com/intel/dffml/blob/alice/docs/tutorials/rolling_alice/0000_architecting_alice/0002_shes_ariving_when.md#system-context
+> - Original: ![2022-11-13-Alice-ASAP-System-Context-Sketch](https://user-images.githubusercontent.com/5950433/201754772-0b326492-69ea-4518-90be-6a850d960688.jpeg)

```mermaid
graph TD
    subgraph system_context[System Context]
	upstream[Upstream]
	overlay[Overlay]
	orchestrator[Orchestrator]
    end
```

- There's a Poet's Beach poem that goes with this.
+  - “timeless”, the one from Athena/Minerva
+
+![E35628A2-B9F3-4A29-88C8-F773A7A9F9C9](https://user-images.githubusercontent.com/5950433/201529807-c7e63b48-6f41-4686-98be-bb73484df83f.jpeg)
+
diff --git a/docs/discussions/alice_engineering_comms/0086/index.md b/docs/discussions/alice_engineering_comms/0086/index.md
new file mode 100644
index 0000000000..b8e7d19fcf
--- /dev/null
+++ b/docs/discussions/alice_engineering_comms/0086/index.md
@@ -0,0 +1 @@
+# 2022-11-14 Engineering Logs
\ No newline at end of file
diff --git a/docs/discussions/alice_engineering_comms/0086/reply_0000.md b/docs/discussions/alice_engineering_comms/0086/reply_0000.md
new file mode 100644
index 0000000000..a88deaf6bf
--- /dev/null
+++ b/docs/discussions/alice_engineering_comms/0086/reply_0000.md
@@ -0,0 +1,340 @@
+## 2022-11-14 @pdxjohnny Engineering Logs
+
+- https://qwik.builder.io/docs/getting-started/
+  - Serialization of cached flow via overlay to inputs to qwik cache resume
+  - https://qwik.builder.io/docs/concepts/resumable/
+  - https://qwik.builder.io/docs/advanced/qrl/
+- https://www.intel.com/content/www/us/en/newsroom/news/intel-introduces-real-time-deepfake-detector.html#gs.isnpod
+  - ActivityPub (mastodon) “follow” post metrics / SCITT receipt of analysis if video is deepfake as reply.
+- Architecting Alice: An Image: ActivityPub posts with YAML body content and image attached with post quantum jwk or SCITT receipt, or maybe content address of SCITT receipt?
+- https://twitter.com/pippellia/status/1592184568345509888
+  - Central planning and chaos
+  - This is why we focus on equilibrium
+- https://arxiv.org/abs/2211.01724
+  - > We formulate learning for control as an inverse problem -- inverting a dynamical system to give the actions which yield desired behavior.
The key challenge in this formulation is a distribution shift -- the learning agent only observes the forward mapping (its actions' consequences) on trajectories that it can execute, yet must learn the inverse mapping for inputs-outputs that correspond to a different, desired behavior. We propose a general recipe for inverse problems with a distribution shift that we term iterative inversion -- learn the inverse mapping under the current input distribution (policy), then use it on the desired output samples to obtain new inputs, and repeat. As we show, iterative inversion can converge to the desired inverse mapping, but under rather strict conditions on the mapping itself. + > + > We next apply iterative inversion to learn control. Our input is a set of demonstrations of desired behavior, given as video embeddings of trajectories, and our method iteratively learns to imitate trajectories generated by the current policy, perturbed by random exploration noise. We find that constantly adding the demonstrated trajectory embeddings as input to the policy when generating trajectories to imitate, a-la iterative inversion, steers the learning towards the desired trajectory distribution. To the best of our knowledge, this is the first exploration of learning control from the viewpoint of inverse problems, and our main advantage is simplicity -- we do not require rewards, and only employ supervised learning, which easily scales to state-of-the-art trajectory embedding techniques and policy representations. With a VQ-VAE embedding, and a transformer-based policy, we demonstrate non-trivial continuous control on several tasks. We also report improved performance on imitating diverse behaviors compared to reward based methods. 
+- Search compressed asciinema recordings

```console
$ (for file in $(ls ~/asciinema); do unxz -d - < ~/asciinema/$file; done) | grep -i /1421
```

- [Mark Foster’s Linked Data User Experience Notes](https://docs.google.com/document/d/17n8hfdPfqfpbPj4ss-ep4nCkpp9ZBoy6U2Q1t7j-knI/edit)
  - https://futureinternet.io
  - https://twitter.com/mfosterio/status/1591580950752002048
    - > I’ve been looking for ways to access toots in JSON-LD Activity Streams I can return my profile by passing the header Accept application/ld+json on https://mas.to/@mfoster/ but my toots are in JSON https://mas.to/api/v1/accounts/109254208668258721/statuses
  - I haven’t been following Mark for long (3 months?), but he seems extremely capable. Everyone in SCITT, everyone in GUAC for sure, and of course the Chainguard folks (Dan, starts with an A? The main Wolfi maintainer) know what's up, and the DIF crew as the leaders. It's a good sign if he is playing with the same stuff we were thinking about: ActivityPub and current federation technology as a bridge / base to build up from and integrate infra toward full decentralization.
  - ^ Strategic mapping (Wardley maps) of train of thought (supply chain security) activity (life), open/internal implementation/spec research/definition; see early explainer videos on doing depth-of-field mapping of the state of the art; see recent threat model example on avoiding engagement with unaligned research communities.
  - Mastodon SCITT review on data provenance (attached as knowledge graph link, reply, source: GitHub docs? Content exact match?
Add reply with SCITT receipt as body; integrate into Mastodon to show these types of replies integrated into the UI — an HTTPS CA view-from-browser style check with detail expand, but in HTML. Just as a UI example, parse out the fields and display them nicely.
+- https://datatracker.ietf.org/doc/html/draft-birkholz-scitt-architecture-02#section-7
+- “Maybe it’s a dream?” Sequence - 2022-09-21
+- https://mermaid-js.github.io/mermaid-live-editor/

```mermaid
sequenceDiagram
    BobSCITT->>+Bob: Generate did:pkg:bobsoftware serialized federated ledger claim / receipt
    Alice->>+AliceSCITT: Generate did:oa:merkleofshouldicontribute serialized federated ledger claim / receipt
```

- Cross referencing is fun
  - Graphs are fun
  - https://en.wikipedia.org/wiki/Knowledge_graph
- Unfortunately, GitHub reworks links which include the `#discussioncomment-4131964` part on display, which results in jumping to the top of the thread.

![image](https://user-images.githubusercontent.com/5950433/201763045-e69ce8b2-df40-487a-8b91-bb28691889c2.png)

- Podman oddities
- No time for SELinux policies currently, but we should integrate them in the future
  (`JobKubernetesOrchestrator`?)
  - https://github.com/containers/udica#creating-selinux-policy-for-container

```console
$ sudo setenforce 0
$ sudo dnf install -y aardvark-dns podman-compose
```

- Spinning up mastodon
  - What do you call N instances of Alice communicating via the
    Thought Communication Protocol?
+    - A Mastodon server full of toots
+- References
+  - https://github.com/containers/podman-compose
+  - https://github.com/mastodon/mastodon
+  - https://docs.joinmastodon.org/admin/setup/
+  - https://github.com/mastodon/mastodon/commit/b17202ca0f19b83beb25afdba7e713a0f9329ffa
+- If `podman-compose` asks which registry for images, choose `docker.io`
+- Getting name resolution failures (DNS)
+  - Fixed by installing aardvark-dns
+- Ruby projects usually have an initial database population
+  - This must be done on first load to perform database "migrations", setting up the DB.
+  - `FATAL: role "mastodon" does not exist`
+    - https://github.com/mastodon/mastodon/issues/18113
+    - https://github.com/mastodon/mastodon/pull/16947
+  - `FATAL: database "mastodon_production" does not exist`
+    - https://hub.docker.com/_/postgres
+    - > `POSTGRES_DB`
+      >
+      > This optional environment variable can be used to define a different name for the default database that is created when the image is first started. If it is not specified, then the value of `POSTGRES_USER` will be used.
+- On `podman-compose up` it still complains + - `2022-11-15 05:41:58.177 UTC [90] FATAL: database "mastodon_production" does not exist` + - `2022-11-15 05:42:02.256 UTC [91] FATAL: role "mastodon" does not exist` + +```console +$ git clone https://github.com/mastodon/mastodon +$ cd mastodon +$ git checkout v4.0.2 +$ git log +commit 03b0f3ac83edfc46d304bfca1539ca6000e36fc3 (HEAD, tag: v4.0.2, main) +Author: Eugen Rochko +Date: Tue Nov 15 03:57:18 2022 +0100 + + Bump version to 4.0.2 (#20725) +$ podman-compose run web bundle rake mastodon:webpush:generate_vapid_key +VAPID_PRIVATE_KEY=djDWtpmK3CD9SUu_UedWOyOGBA-Fg5r5MWiXVhZHZbo= +VAPID_PUBLIC_KEY=BOVhs2nJ4MpjdaHAVu7UdlPlNjzMX2pKFyKgOxvYO7LX8eh_H3TA_O_Ebc2asJPhDoqImE-3Xz0BmaeM_EucIr0= +$ podman-compose run web bundle rake secret +6ece0cfc0772308479f5cd6155cfc282defab20307a185b399dd6cf2f9b4dc3a81691406c368905c64ccafa56e05473371dccb3b948001369b18be57cfefa9f4 +$ podman-compose run web bundle rake secret +e2fdd51aef896d5c8c647dbbf6b77426d3df59a2817181738afc0ae8ab9e34a413ac5f21ef9aed41f38260075ff6a327f29e717f03c66296dfc0838402851714 +$ cat > .env.production <<'EOF' +# This is a sample configuration file. You can generate your configuration +# with the `rake mastodon:setup` interactive setup wizard, but to customize +# your setup even further, you'll need to edit it manually. This sample does +# not demonstrate all available configuration options. Please look at +# https://docs.joinmastodon.org/admin/config/ for the full documentation. + +# Note that this file accepts slightly different syntax depending on whether +# you are using `docker-compose` or not. In particular, if you use +# `docker-compose`, the value of each declared variable will be taken verbatim, +# including surrounding quotes. 
+# See: https://github.com/mastodon/mastodon/issues/16895 + +# Federation +# ---------- +# This identifies your server and cannot be changed safely later +# ---------- +LOCAL_DOMAIN=example.com + +# Redis +# ----- +# REDIS_HOST=localhost +REDIS_HOST=redis +REDIS_PORT=6379 + +# PostgreSQL +# ---------- +# DB_HOST=/var/run/postgresql +DB_HOST=db +DB_USER=mastodon +DB_NAME=mastodon_production +DB_PASS=mastodon +DB_PORT=5432 + +# Elasticsearch (optional) +# ------------------------ +# ES_ENABLED=true +# ES_HOST=localhost +# ES_PORT=9200 +# Authentication for ES (optional) +# ES_USER=elastic +# ES_PASS=password + +# Secrets +# ------- +# Make sure to use `podman-compose run web bundle rake secret` to generate secrets +# ------- +SECRET_KEY_BASE=6ece0cfc0772308479f5cd6155cfc282defab20307a185b399dd6cf2f9b4dc3a81691406c368905c64ccafa56e05473371dccb3b948001369b18be57cfefa9f4 +OTP_SECRET=e2fdd51aef896d5c8c647dbbf6b77426d3df59a2817181738afc0ae8ab9e34a413ac5f21ef9aed41f38260075ff6a327f29e717f03c66296dfc0838402851714 + +# Web Push +# -------- +# Generate with `podman-compose run web bundle rake mastodon:webpush:generate_vapid_key` +# -------- +VAPID_PRIVATE_KEY=djDWtpmK3CD9SUu_UedWOyOGBA-Fg5r5MWiXVhZHZbo= +VAPID_PUBLIC_KEY=BOVhs2nJ4MpjdaHAVu7UdlPlNjzMX2pKFyKgOxvYO7LX8eh_H3TA_O_Ebc2asJPhDoqImE-3Xz0BmaeM_EucIr0= + +# Sending mail +# ------------ +# SMTP_SERVER=smtp.mailgun.org +# SMTP_PORT=587 +# SMTP_LOGIN= +# SMTP_PASSWORD= +# SMTP_FROM_ADDRESS=notifications@example.com + +# File storage (optional) +# ----------------------- +# S3_ENABLED=true +# S3_BUCKET=files.example.com +# AWS_ACCESS_KEY_ID= +# AWS_SECRET_ACCESS_KEY= +# S3_ALIAS_HOST=files.example.com + +# IP and session retention +# ----------------------- +# Make sure to modify the scheduling of ip_cleanup_scheduler in config/sidekiq.yml +# to be less than daily if you lower IP_RETENTION_PERIOD below two days (172800). 
+# ----------------------- +IP_RETENTION_PERIOD=31556952 +SESSION_RETENTION_PERIOD=31556952 +EOF +$ head -n 16 docker-compose.yml +version: '3' +services: + db: + restart: always + image: postgres:14-alpine + shm_size: 256mb + networks: + - internal_network + healthcheck: + test: ['CMD', 'pg_isready', '-U', 'postgres'] + volumes: + - ./postgres14:/var/lib/postgresql/data + environment: + - 'POSTGRES_DB=mastodon_production' + - 'POSTGRES_USER=mastodon' + - 'POSTGRES_PASSWORD=mastodon' +$ podman-compose down +$ sudo rm -rf postgres14/ +$ time podman-compose run web bundle exec rake mastodon:setup +$ podman-compose up +podman start -a mastodon_db_1 +podman start -a mastodon_redis_1 +podman start -a mastodon_web_1 +podman start -a mastodon_streaming_1 +podman start -a mastodon_sidekiq_1 +WARN Starting streaming API server master with 3 workers +=> Booting Puma +=> Rails 6.1.7 application starting in production +=> Run `bin/rails server --help` for more startup options +WARN Starting worker 3 +WARN Starting worker 2 +WARN Worker 3 now listening on 0.0.0.0:4000 +WARN Worker 2 now listening on 0.0.0.0:4000 +WARN Starting worker 1 +WARN Worker 1 now listening on 0.0.0.0:4000 +2022-11-15T05:55:05.712Z pid=2 tid=53y WARN: `config.options[:key] = value` is deprecated, use `config[:key] = value`: ["/opt/mastodon/lib/mastodon/redis_config.rb:38:in `'", "/opt/mastodon/config/application.rb:53:in `require_relative'"] +2022-11-15T05:55:06.117Z pid=2 tid=53y INFO: Booting Sidekiq 6.5.7 with Sidekiq::RedisConnection::RedisAdapter options {:driver=>:hiredis, :url=>"redis://redis:6379/0", :namespace=>nil} +[4] Puma starting in cluster mode... 
+[4] * Puma version: 5.6.5 (ruby 3.0.4-p208) ("Birdie's Version") +[4] * Min threads: 5 +[4] * Max threads: 5 +[4] * Environment: production +[4] * Master PID: 4 +[4] * Workers: 2 +[4] * Restarts: (✔) hot (✖) phased +[4] * Preloading application +[4] * Listening on http://0.0.0.0:3000 +[4] Use Ctrl-C to stop +[4] - Worker 0 (PID: 10) booted in 0.01s, phase: 0 +[4] - Worker 1 (PID: 11) booted in 0.0s, phase: 0 +2022-11-15 05:55:07.954 UTC [233] FATAL: role "postgres" does not exist +2022-11-15T05:55:09.222Z pid=2 tid=53y INFO: Booted Rails 6.1.7 application in production environment +2022-11-15T05:55:09.222Z pid=2 tid=53y INFO: Running in ruby 3.0.4p208 (2022-04-12 revision 3fa771dded) [x86_64-linux] +2022-11-15T05:55:09.222Z pid=2 tid=53y INFO: See LICENSE and the LGPL-3.0 for licensing details. +2022-11-15T05:55:09.222Z pid=2 tid=53y INFO: Upgrade to Sidekiq Pro for more features and support: https://sidekiq.org +2022-11-15T05:55:09.227Z pid=2 tid=53y INFO: Loading Schedule +2022-11-15T05:55:09.227Z pid=2 tid=53y INFO: Scheduling scheduled_statuses_scheduler {"every"=>"5m", "class"=>"Scheduler::ScheduledStatusesScheduler", "queue"=>"scheduler"} +2022-11-15T05:55:09.228Z pid=2 tid=53y INFO: Scheduling trends_refresh_scheduler {"every"=>"5m", "class"=>"Scheduler::Trends::RefreshScheduler", "queue"=>"scheduler"} +2022-11-15T05:55:09.231Z pid=2 tid=53y INFO: Scheduling trends_review_notifications_scheduler {"every"=>"6h", "class"=>"Scheduler::Trends::ReviewNotificationsScheduler", "queue"=>"scheduler"} +2022-11-15T05:55:09.232Z pid=2 tid=53y INFO: Scheduling indexing_scheduler {"every"=>"5m", "class"=>"Scheduler::IndexingScheduler", "queue"=>"scheduler"} +2022-11-15T05:55:09.234Z pid=2 tid=53y INFO: Scheduling vacuum_scheduler {"cron"=>"59 5 * * *", "class"=>"Scheduler::VacuumScheduler", "queue"=>"scheduler"} +2022-11-15T05:55:09.237Z pid=2 tid=53y INFO: Scheduling follow_recommendations_scheduler {"cron"=>"44 8 * * *", 
"class"=>"Scheduler::FollowRecommendationsScheduler", "queue"=>"scheduler"} +2022-11-15T05:55:09.239Z pid=2 tid=53y INFO: Scheduling user_cleanup_scheduler {"cron"=>"2 5 * * *", "class"=>"Scheduler::UserCleanupScheduler", "queue"=>"scheduler"} +2022-11-15T05:55:09.240Z pid=2 tid=53y INFO: Scheduling ip_cleanup_scheduler {"cron"=>"13 4 * * *", "class"=>"Scheduler::IpCleanupScheduler", "queue"=>"scheduler"} +2022-11-15T05:55:09.242Z pid=2 tid=53y INFO: Scheduling pghero_scheduler {"cron"=>"0 0 * * *", "class"=>"Scheduler::PgheroScheduler", "queue"=>"scheduler"} +2022-11-15T05:55:09.245Z pid=2 tid=53y INFO: Scheduling instance_refresh_scheduler {"cron"=>"0 * * * *", "class"=>"Scheduler::InstanceRefreshScheduler", "queue"=>"scheduler"} +2022-11-15T05:55:09.247Z pid=2 tid=53y INFO: Scheduling accounts_statuses_cleanup_scheduler {"interval"=>"1 minute", "class"=>"Scheduler::AccountsStatusesCleanupScheduler", "queue"=>"scheduler"} +2022-11-15T05:55:09.248Z pid=2 tid=53y INFO: Scheduling suspended_user_cleanup_scheduler {"interval"=>"1 minute", "class"=>"Scheduler::SuspendedUserCleanupScheduler", "queue"=>"scheduler"} +2022-11-15T05:55:09.249Z pid=2 tid=53y INFO: Schedules Loaded +2022-11-15T05:55:09.255Z pid=2 tid=53y uniquejobs=upgrade_locks INFO: Already upgraded to 7.1.27 +2022-11-15T05:55:09.256Z pid=2 tid=53y uniquejobs=reaper INFO: Starting Reaper +2022-11-15T05:55:09.262Z pid=2 tid=2dsy uniquejobs=reaper INFO: Nothing to delete; exiting. +2022-11-15T05:55:09.265Z pid=2 tid=2dsy uniquejobs=reaper INFO: Nothing to delete; exiting. 
+[09ee11d4-25e1-4330-9f65-b642ae6a3732] Chewy request strategy is `mastodon` +[09ee11d4-25e1-4330-9f65-b642ae6a3732] method=HEAD path=/health format=*/* controller=HealthController action=show status=200 duration=2.07 view=1.45 +2022-11-15 05:55:38.155 UTC [288] FATAL: role "postgres" does not exist +[ActionDispatch::HostAuthorization::DefaultResponseApp] Blocked host: 0.0.0.0 +[ActionDispatch::HostAuthorization::DefaultResponseApp] Blocked host: localhost +ERR! fc8ec631-1ade-4713-a8c8-6125ba6cf87c Error: Access token does not cover required scopes +ERR! 17f0501f-de79-45f8-93cb-e5b8bb7178f7 Error: Access token does not cover required scopes +[40016351-367b-4d43-be62-e2340fde46de] method=HEAD path=/health format=*/* controller=HealthController action=show status=200 duration=0.26 view=0.13 +[ActionDispatch::HostAuthorization::DefaultResponseApp] Blocked host: localhost +2022-11-15 05:56:08.911 UTC [346] FATAL: role "postgres" does not exist +2022-11-15T05:56:09.297Z pid=2 tid=2dv6 INFO: queueing Scheduler::AccountsStatusesCleanupScheduler (accounts_statuses_cleanup_scheduler) +2022-11-15T05:56:09.301Z pid=2 tid=2dvq class=Scheduler::AccountsStatusesCleanupScheduler jid=030c3bd88689321e9097003a INFO: start +2022-11-15T05:56:09.304Z pid=2 tid=2dyi INFO: queueing Scheduler::SuspendedUserCleanupScheduler (suspended_user_cleanup_scheduler) +2022-11-15T05:56:09.306Z pid=2 tid=2dz2 class=Scheduler::SuspendedUserCleanupScheduler jid=03293e9712b7020c368c02bc INFO: start +2022-11-15T05:56:09.341Z pid=2 tid=2dvq class=Scheduler::AccountsStatusesCleanupScheduler jid=030c3bd88689321e9097003a elapsed=0.04 INFO: done +2022-11-15T05:56:09.356Z pid=2 tid=2dz2 class=Scheduler::SuspendedUserCleanupScheduler jid=03293e9712b7020c368c02bc elapsed=0.051 INFO: done +$ curl -v http://localhost:3000/ +* Trying 127.0.0.1:3000... 
+* Connected to localhost (127.0.0.1) port 3000 (#0) +> GET / HTTP/1.1 +> Host: localhost:3000 +> User-Agent: curl/7.85.0 +> Accept: */* +> +* Mark bundle as not supporting multiuse +< HTTP/1.1 403 Forbidden +< Content-Type: text/html; charset=UTF-8 +< Content-Length: 0 +< +* Connection #0 to host localhost left intact +``` + +- TODO + - [ ] SCITT help make no new HTTP headers, SCITT as DID method? SCITT via ATP probably. Prototype as Data Repository. + - [x] [SCITT API Emulator Bring Up](https://github.com/intel/dffml/discussions/1406?sort=new#discussioncomment-4110695) + - [ ] ActivityPub (Mastodon) bring up + - [ ] Spin up and configure Atuin client / server https://github.com/ellie/atuin/blob/main/docs/server.md + - [x] Update `Architecting Alice: She's Arriving When?` to include a start at some content + we'd planned and drafted here and there related to the system context. + - [docs: tutorials: rolling alice: architecting alice: she's arriving when?: Mermaid diagram for pattern with stream of consciousness and SCITT](https://github.com/intel/dffml/commit/fbcbc86b5c52932bccf4cd6321f4e79f60ad3023) + - In this we only implement in memory and serialized SCITT for a + single entity, Alice, no Bob yet. In `Architecting Alice: Stream of Consciousness`, + we implement Alice and Bob comms on top of SBOM, VEX, VDR. + - [ ] Ping https://github.com/ipvm-wg/spec/pull/8/files with She's Arriving When? and + Our Open Source Guide to illustrate dataflow and provenance. + - [ ] Explain how [https://gist.github.com/pdxjohnny/57b049c284e58f51d0a0d35d05d03d4a](https://github.com/intel/dffml/discussions/1406?sort=new#discussioncomment-4141183) hopes to illustrate chaining effects. + - [ ] Update `Architecting Alice: Stream of Consciousness` to include notes on + building off of `Architecting Alice: She's Arriving When?` to now communicate + between Alice and Bob via SBOM, VEX, VDR, etc. rolled in. 
+    - First just do a simple OpenSSL demo where the package is distributed as a binary wheel
+      via static PyPI, SBOM deployment.
+      - https://github.com/tpm2-software/tpm2-pytss (this should ldd to openssl)
+      - Tracking via: https://github.com/intel/dffml/issues/1421
+      - https://github.com/CycloneDX/cyclonedx-bom-exchange-api
+  - [ ] Alice CLI command to start working an issue
+    - `alice please context switch -overlay log_work_to_github_issue https://github.com/intel/dffml/issues/1421`
+    - Pretty print issue body
+    - Start logging work to thread
+  - [ ] Check later today for movement on https://github.com/decentralized-identity/credential-manifest/issues/125#issuecomment-1310728595
+  - [ ] Simple `python -m http.server --cgi ` based implementation of an upload server
+    - Ideally this updates the directory structure of a static PyPI registry (future: OCI image registry)
+    - Require SCITT receipt with manifest of artifact SHA and OIDC token
+      - We can self issue to start
+  - [ ] Reach out to devs of https://githubnext.com/projects/ai-for-pull-requests/ about abstraction layer / intermediate representation.
+  - [ ] Mastodon / ActivityPub as Inventory (see PR) à la meta package repo / stream of consciousness / release notifications and metadata (our ATProto precursor)
+  - [ ] Figure out how to do periodic follow-on scanning with CVE-Bin-Tool
+    - Could just be ensuring there are GitHub Actions workflows on schedule to scan
+    - https://github.com/intel/dffml/blob/alice/docs/arch/alice/discussion/0023/reply_0022.md
+      - > Create first distinct ancestor of Alice by creating ongoing validation flows to re-check CVEs when new vulns come in. Show how this is trivial by adding those contexts to the chain, which are picked up and executed by agents. Agents just look for any contexts that have been issued but not executed.
The prioritizer also prioritizes “reminder threads” which remind the prioritizer to re-broadcast the train of thought on a periodic cycle if not scheduled for execution, with frequency based on priority. Agents coming online need only look at the chain for tasks
+  - [ ] Put "I'm a sign not a cop" somewhere, seems like there is content to be organized
+    - https://github.com/intel/dffml/blob/alice/docs/arch/alice/discussion/0036/reply_0022.md
+  - [ ] Find a place for more background on the mental model, and perhaps tie in the InnerSource example as how we determine whether Alice is working on the right stuff (aligned with her strategic principles) when she is the org and she's running multiple engagements. (System context? Or is that overloaded? Probably the tie-in with the InnerSource stuff here becomes its own tutorial.)
+    - https://github.com/intel/dffml/blob/alice/docs/arch/alice/discussion/0036/reply_0062.md
+    - https://github.com/intel/dffml/issues/1287
+  - [ ] Work on teaching Alice to use the shell / capture context https://github.com/intel/dffml/blob/alice/docs/tutorials/rolling_alice/0000_architecting_alice/0003_a_shell_for_a_ghost.md
+    - Future
+      - [ ] Reference: current content on capturing shell commands and context might be better off in Coach Alice, where we want to record, analyze, and detect failure patterns across sessions / devs so that we do not work down known bad paths.
+      - Revisit data flows from bash line analysis; tie in with consoletest (that refactor stalled out :grimacing:)
+        - https://github.com/tmux-python/tmuxp
+  - [ ] Alice, please summarize meeting notes
+    - [ ] and send as toot to Mastodon thread
+      - Context awareness overlays for
+        - Mastodon
+          - server
+          - handle
+          - password or token
\ No newline at end of file
diff --git a/docs/discussions/alice_engineering_comms/0086/reply_0001.md b/docs/discussions/alice_engineering_comms/0086/reply_0001.md
new file mode 100644
index 0000000000..d642b2b9a1
--- /dev/null
+++ b/docs/discussions/alice_engineering_comms/0086/reply_0001.md
@@ -0,0 +1,148 @@
+## 2022-11-14 SCITT Meeting Notes
+
+- https://docs.google.com/document/d/1vf-EliXByhg5HZfgVbTqZhfaJFCmvMdQuZ4tC-Eq6wg/edit#heading=h.214jg0n2xjhp
+- From Hannes Tschofenig to Everyone 08:02 AM
+  - > - IoT device onboarding
+    > - https://fidoalliance.org/specs/FDO/FIDO-Device-Onboard-PS-v1.1-20220419/FIDO-Device-Onboard-PS-v1.1-20220419.html
+    > - http://www.openmobilealliance.org/release/LightweightM2M/V1_2-20201110-A/HTML-Version/OMA-TS-LightweightM2M_Core-V1_2-20201110-A.html
+    > - http://www.openmobilealliance.org/release/LightweightM2M/V1_2-20201110-A/HTML-Version/OMA-TS-LightweightM2M_Transport-V1_2-20201110-A.html
+- NED IS HERE TODAY WOOHOO!!! He replied on the mailing list yesterday. John
+  was stoked about that too. His involvement coming from IETF RATS to align on
+  terminology is a good thing, since he's engaging in this train of thought.
+  - See depth of field mapping.
+- Neil
+  - Involved in Internet identity conference
+  - Interested in ways to get firm attestations from people about documents
+  - Worked at Bell Labs and was involved in the IETF security area in the 90s
+- Some refactoring needed on various docs
+- Hannes's use case document used as a good example for what we are trying to do
+  - Need more problem statement before going into solution space.
+ - Recommendation: Use layman's terms; do not use solution terminology within + use case docs, requirements, architecture, and threat model. + - There are some overloaded terms in the architecture terminology. + - Some attestation endorsements (signed statements about the item or asset) + - Some overlap in terms of whether it is an endorsement or something different. + - What is the value add, given that attestation is already a starting point? + - If the use case was already written to assume the attestation use case. + - 3rd party attestation is an endorsement in RATS + - https://www.rfc-editor.org/rfc/rfc7744 + - Use Cases for Authentication and Authorization in Constrained Environments + - > ``` + > Table of Contents + > + > 1. Introduction ....................................................4 + > 1.1. Terminology ................................................4 + > 2. Use Cases .......................................................5 + > 2.1. Container Monitoring .......................................5 + > 2.1.1. Bananas for Munich ..................................6 + > 2.1.2. Authorization Problems Summary ......................7 + > 2.2. Home Automation ............................................8 + > 2.2.1. Controlling the Smart Home Infrastructure ...........8 + > 2.2.2. Seamless Authorization ..............................8 + > 2.2.3. Remotely Letting in a Visitor .......................9 + > 2.2.4. Selling the House ...................................9 + > 2.2.5. Authorization Problems Summary ......................9 + > 2.3. Personal Health Monitoring ................................10 + > 2.3.1. John and the Heart Rate Monitor ....................11 + > 2.3.2. Authorization Problems Summary .....................12 + > 2.4. Building Automation .......................................13 + > 2.4.1. Device Life Cycle ..................................13 + > 2.4.1.1. Installation and Commissioning ............13 + > 2.4.1.2. 
Operational ...............14 + > 2.4.1.3. Maintenance ...............................15 + > 2.4.1.4. Recommissioning ...........................16 + > 2.4.1.5. Decommissioning ...........................16 + > 2.4.2. Public Safety ......................................17 + > 2.4.2.1. A Fire Breaks Out .........................17 + > 2.4.3. Authorization Problems Summary .....................18 + > 2.5. Smart Metering ............................................19 + > 2.5.1. Drive-By Metering ..................................19 + > 2.5.2. Meshed Topology ....................................20 + > 2.5.3. Advanced Metering Infrastructure ...................20 + > 2.5.4. Authorization Problems Summary .....................21 + > 2.6. Sports and Entertainment ..................................22 + > 2.6.1. Dynamically Connecting Smart Sports Equipment ......22 + > 2.6.2. Authorization Problems Summary .....................23 + > 2.7. Industrial Control Systems ................................23 + > 2.7.1. Oil Platform Control ...............................23 + > 2.7.2. Authorization Problems Summary .....................24 + > 3. Security Considerations ........................................24 + > 3.1. Attacks ...................................................25 + > 3.2. Configuration of Access Permissions .......................26 + > 3.3. Authorization Considerations ..............................26 + > 3.4. Proxies ...................................................28 + > 4. Privacy Considerations .........................................28 + > 5. Informative References .........................................28 + > Acknowledgments ...................................................29 + > Authors' Addresses ................................................30 + > ``` +- We need to address Ned's concern: clearly define the scope of the difference + between SCITT and what IETF RATS attestation offers. 
+- Sylvan + - Concrete scenario using confidential compute + - Using hardware attestation reports about CCF running in the cloud + - Say you're running a workload and you are running it in the cloud + - Confidential containers, which covers the VMs, hostdata, MRconfig, the policy used to say what you can run on that utility VM; it has a hardware attestation story and follows the RATS spec. + - This can be passed out to anyone to verify and validate that the workload is what it was, based on measurement + - Now you don't want a precise hash of MRENCLAVE on TDX; it's fine to run whatever as long as it's signed by a specific issuer, given endorsements. I might be handed a signature on an image; the provider might give a different signed image to someone else. What SCITT does (verifier policy): the UVM hash precedes SCITT receipt validation, and the feed for the UVM is the feed that identifies its purpose; from this parent on this __ of this SCITT image, expect the payload in COSE_Sign1 to be the hash that you would find measured from the TPM + - I want to be able to attest a container application, webapps, whatever + - Can the attestation not report on that? + - Sylvan has a particular view on how you report on the confidential container and how it attests to the workload + - If we want to talk just about the base VM boot image, how do I make sure my provider can give me version 1.2 without breaking my workload, while I get transparency over every workload that can run (policy) and I have receipts of it happening + - As a verifier you only want to use a service if it's running code I care about + - If I trust an arbitrary signer, then I can rely on the signature alone + - But if I want SCITT, it's because I want auditable evidence; then I want a receipt that is universally verifiable, you can hand it off to customers to prove that I ran in a confidential compute environment. + - Ned says: why do I need that? I have the precise hashes? 
+ - We have a post-hoc auditability guarantee because it's transparent +- RATS + - Reference value providers + - Verifier's job is to do a mapping between the two + - The endorsement is a claim that is true if the reference value is matched to evidence + - Those claims might be added to the accepted set if there was some Notary-specific data + - Verifier has a policy that says I trust this endorser and I don't trust that one. + - SCITT Transparency is one thing we layer on top + - By binding endorsers through the audit log we allow for online detection, but most people might do it after the fact (post-hoc, ad-hoc post?) +- Attestation letter will be defined by CISA; this will become an artifact the customer will receive + - How could they do that using RATS? + - An attestation letter in this case sounds more like an endorsement (I'm a good guy, trust me, these are the things that I do) + - SW vendor makes a claim; customer needs to be able to verify the trustworthiness of that claim + - ISO9000: hey, I'm following this process; are there auditors to make sure you're following it? + - Customers might go to SCITT and say: has anyone looked at this thing? Is it trustworthy? + - This is why DFFML cares about SCITT: adding data about development lifecycle processes acts as a self-audit capability for people to run different static analysis (meta static analysis) + - Is there a blocking process to get on the registry? No! 
(federation, DID based flat file, we can build all this offline and join disparate roots later, this is why we like SCITT) + - Other parties can endorse and make transparent their endorsements (notary step) + - Registration policy controls what signed statements can be made transparent; it can also say who can put signed statements in (OIDC) and make them transparent via this instance + - We want to enable additional auditors to audit each other; they make additional statements, sign those statements, and make them transparent via the SCITT log they submit to + - This allows us to go N levels and N links deep on the graph in terms of what set we want to define as "trusted" +- SW vendor produces a package + - 3rd party produces an endorsement about the produced package (a 2nd witness of that claim) + - Ned says this is possible with RATS; the thing it doesn't try to define is that you + have to have that. They would call that an "appraisal policy": that you have to have + this second entity (Alice? ;) doing the tests. + - SCITT is saying those claims have to be published somewhere (even self, as in the Alice + offline case). + - What value do those additional witnesses bring? + - Existence of a receipt is proof that signed claims were made, and made in a + specific order; they are tamperproof (rather than just tamper evident). + - With transparency I can accept an update and know I can check later; + if they lie, I can go find out that they lied. +- TODO + - [ ] Section on Federation (8) + - [SCITT API Emulator Bring Up](https://github.com/intel/dffml/discussions/1406?sort=new#discussioncomment-4110695) + - We upload `alice shouldi contribute` dataflow to SCITT and get a receipt! + - Friends, today is a great day. :railway_track: + - Next stop, serialization / federation with Alice / Open Architecture serialization data flow as SCITT service. 
+ - Started with mermaid added in https://github.com/intel/dffml/commit/fbcbc86b5c52932bccf4cd6321f4e79f60ad3023 to https://github.com/intel/dffml/blob/alice/docs/tutorials/rolling_alice/0000_architecting_alice/0002_shes_ariving_when.md + - [ ] Use case documents + - [ ] OpenSSF Metrics + - Use Microsoft SCITT API Emulator (MIT) as upstream / reference + implementation. Talk about how to use the data provenance on the workflow + (`alice shouldi contribute`). + - We can then start doing the `did:merkle:` what-do-I-care-about intermediate + representation to do cross platform (Jenkins, GitHub Actions, etc.) caching + / analysis of caching / please-contribute streamlining. + - Play with OIDC and SCITT + - Later show overlaid flow on top of upstream (OpenSSF Metrics or something; + ideally the upstream would be defining these flows, probably, in most cases). + - Need to patch dataflows to include `upstream` as the flows / system context + it came from if overlaid. \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0086/reply_0002.md b/docs/discussions/alice_engineering_comms/0086/reply_0002.md new file mode 100644 index 0000000000..5faddbad7f --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0086/reply_0002.md @@ -0,0 +1,1130 @@ +# Alice, should I contribute? Data Flow + +Cross post: https://gist.github.com/pdxjohnny/57b049c284e58f51d0a0d35d05d03d4a +Cross post: https://github.com/intel/dffml/discussions/1382#discussioncomment-4141177 +Cross post: https://github.com/intel/dffml/discussions/1406?sort=new#discussioncomment-4141183 +Upstream: https://github.com/intel/dffml/tree/8847989eb4cc9f6aa484285ba9c11ff920113ed3 + +```console +$ export TITLE="Alice, should I contribute? 
Data Flow (upstream: https://github.com/intel/dffml/tree/8847989eb4cc9f6aa484285ba9c11ff920113ed3)"; +$ (echo "${TITLE}" \ + && echo \ + && python -um dffml service dev export alice.cli:ALICE_COLLECTOR_DATAFLOW > alice_shouldi_contribute.json \ + && echo '```mermaid' \ + && python -um dffml dataflow diagram -stage processing -configloader json alice_shouldi_contribute.json \ + && echo '```' \ + && echo \ + && echo '```yaml' \ + && python -c "import sys, pathlib, json, yaml; print(yaml.dump(json.load(sys.stdin)))" < alice_shouldi_contribute.json \ + && echo '```' \ + && echo) \ + | gh gist create --public --desc "${TITLE}" -f ALICE_SHOULDI_CONTRIBUTE_THREATS.md - +``` + +```mermaid +graph TD +subgraph d3ec0ac85209a7256c89d20f758f09f4[check_if_valid_git_repository_URL] +style d3ec0ac85209a7256c89d20f758f09f4 fill:#fff4de,stroke:#cece71 +f577c71443f6b04596b3fe0511326c40[check_if_valid_git_repository_URL] +7440e73a8e8f864097f42162b74f2762(URL) +7440e73a8e8f864097f42162b74f2762 --> f577c71443f6b04596b3fe0511326c40 +8e39b501b41c5d0e4596318f80a03210(valid) +f577c71443f6b04596b3fe0511326c40 --> 8e39b501b41c5d0e4596318f80a03210 +end +subgraph af8da22d1318d911f29b95e687f87c5d[clone_git_repo] +style af8da22d1318d911f29b95e687f87c5d fill:#fff4de,stroke:#cece71 +155b8fdb5524f6bfd5adbae4940ad8d5[clone_git_repo] +eed77b9eea541e0c378c67395351099c(URL) +eed77b9eea541e0c378c67395351099c --> 155b8fdb5524f6bfd5adbae4940ad8d5 +8b5928cd265dd2c44d67d076f60c8b05(ssh_key) +8b5928cd265dd2c44d67d076f60c8b05 --> 155b8fdb5524f6bfd5adbae4940ad8d5 +4e1d5ea96e050e46ebf95ebc0713d54c(repo) +155b8fdb5524f6bfd5adbae4940ad8d5 --> 4e1d5ea96e050e46ebf95ebc0713d54c +6a44de06a4a3518b939b27c790f6cdce{valid_git_repository_URL} +6a44de06a4a3518b939b27c790f6cdce --> 155b8fdb5524f6bfd5adbae4940ad8d5 +end +subgraph d367039fa2c485f55058105e7e0c0b6b[count_authors] +style d367039fa2c485f55058105e7e0c0b6b fill:#fff4de,stroke:#cece71 +70c47962ba601f0df1890f4c72ae1b54[count_authors] 
+0637dcbe07cd05b96d0a6a2dfbb0c5ff(author_lines) +0637dcbe07cd05b96d0a6a2dfbb0c5ff --> 70c47962ba601f0df1890f4c72ae1b54 +e1d1567e6b3a3e5d899b9543c693a66f(authors) +70c47962ba601f0df1890f4c72ae1b54 --> e1d1567e6b3a3e5d899b9543c693a66f +end +subgraph 7c3ab755010b5134c7c3c5be9fed1f1c[dffml_feature_git.feature.operations:git_grep] +style 7c3ab755010b5134c7c3c5be9fed1f1c fill:#fff4de,stroke:#cece71 +7155c0a875a889898d6d6e0c7959649b[dffml_feature_git.feature.operations:git_grep] +1fc5390b128a11a95280a89ad371a5ae(repo) +1fc5390b128a11a95280a89ad371a5ae --> 7155c0a875a889898d6d6e0c7959649b +cc134251a8bdd1d0944ea69eafc239a4(search) +cc134251a8bdd1d0944ea69eafc239a4 --> 7155c0a875a889898d6d6e0c7959649b +8b7a73c5b4f92ff7fb362de5d8e90b3e(found) +7155c0a875a889898d6d6e0c7959649b --> 8b7a73c5b4f92ff7fb362de5d8e90b3e +end +subgraph 2863a5f2869f0187864ff7a8afcbc2f5[dffml_operations_innersource.cli:ensure_tokei] +style 2863a5f2869f0187864ff7a8afcbc2f5 fill:#fff4de,stroke:#cece71 +a7fe94e6e97c131edebbf73cca7b8852[dffml_operations_innersource.cli:ensure_tokei] +3f6fe14c9392820b8562f809c7e2b8b4(result) +a7fe94e6e97c131edebbf73cca7b8852 --> 3f6fe14c9392820b8562f809c7e2b8b4 +end +subgraph 1f8d333356c8981dfc553c7eb00bf366[dffml_operations_innersource.cli:github_repo_id_to_clone_url] +style 1f8d333356c8981dfc553c7eb00bf366 fill:#fff4de,stroke:#cece71 +859feff15e5487fdad83ec4c42c506e7[dffml_operations_innersource.cli:github_repo_id_to_clone_url] +d2bc011260868bff46d1a206c404a549(repo_id) +d2bc011260868bff46d1a206c404a549 --> 859feff15e5487fdad83ec4c42c506e7 +1f6ba749c4b65c55218b968bf308e4e2(result) +859feff15e5487fdad83ec4c42c506e7 --> 1f6ba749c4b65c55218b968bf308e4e2 +end +subgraph f2b87480bbba5729364d76ad2fd5ef17[dffml_operations_innersource.operations:action_yml_files] +style f2b87480bbba5729364d76ad2fd5ef17 fill:#fff4de,stroke:#cece71 +4de0ba6484f92eba7073404d21fb3598[dffml_operations_innersource.operations:action_yml_files] +847cd99cca177936d533aaa4918c6699(repo) 
+847cd99cca177936d533aaa4918c6699 --> 4de0ba6484f92eba7073404d21fb3598 +7fa0f9133dfd9f00a90383b38c2ec840(result) +4de0ba6484f92eba7073404d21fb3598 --> 7fa0f9133dfd9f00a90383b38c2ec840 +end +subgraph 98179e1c9444a758d9565431f371b232[dffml_operations_innersource.operations:code_of_conduct_present] +style 98179e1c9444a758d9565431f371b232 fill:#fff4de,stroke:#cece71 +fb772128fdc785ce816c73128e0afd4d[dffml_operations_innersource.operations:code_of_conduct_present] +f333b126c62bdbf832dddf105278d218(repo) +f333b126c62bdbf832dddf105278d218 --> fb772128fdc785ce816c73128e0afd4d +1233aac886e50641252dcad2124003c9(result) +fb772128fdc785ce816c73128e0afd4d --> 1233aac886e50641252dcad2124003c9 +end +subgraph d03657cbeff4a7501071526c5227d605[dffml_operations_innersource.operations:contributing_present] +style d03657cbeff4a7501071526c5227d605 fill:#fff4de,stroke:#cece71 +8da2c8a3eddf27e38838c8b6a2cd4ad1[dffml_operations_innersource.operations:contributing_present] +2a1ae8bcc9add3c42e071d0557e98b1c(repo) +2a1ae8bcc9add3c42e071d0557e98b1c --> 8da2c8a3eddf27e38838c8b6a2cd4ad1 +52544c54f59ff4838d42ba3472b02589(result) +8da2c8a3eddf27e38838c8b6a2cd4ad1 --> 52544c54f59ff4838d42ba3472b02589 +end +subgraph 3ac62bbb02d944121299b756fc806782[dffml_operations_innersource.operations:get_current_datetime_as_git_date] +style 3ac62bbb02d944121299b756fc806782 fill:#fff4de,stroke:#cece71 +913421183cb3f7803fb82a12e4ee711f[dffml_operations_innersource.operations:get_current_datetime_as_git_date] +e17cbcbbf2d11ed5ce43603779758076(result) +913421183cb3f7803fb82a12e4ee711f --> e17cbcbbf2d11ed5ce43603779758076 +end +subgraph 5827679f9c689590302b3f46277551ec[dffml_operations_innersource.operations:github_workflows] +style 5827679f9c689590302b3f46277551ec fill:#fff4de,stroke:#cece71 +160833350a633bb60ee3880fb824189e[dffml_operations_innersource.operations:github_workflows] +caaae91348f7c892daa1d05fbd221352(repo) +caaae91348f7c892daa1d05fbd221352 --> 160833350a633bb60ee3880fb824189e 
+882be05f5b4ede0846177f68fc70cfd4(result) +160833350a633bb60ee3880fb824189e --> 882be05f5b4ede0846177f68fc70cfd4 +end +subgraph f1a14368132c9536201d6260d7fc6b63[dffml_operations_innersource.operations:groovy_files] +style f1a14368132c9536201d6260d7fc6b63 fill:#fff4de,stroke:#cece71 +d86d2384b02c75979f3a21818187764e[dffml_operations_innersource.operations:groovy_files] +37b63c13bc63cddeaba57cee5dc3f613(repo) +37b63c13bc63cddeaba57cee5dc3f613 --> d86d2384b02c75979f3a21818187764e +6e31b041bad7c24fa5b0a793ff20890b(result) +d86d2384b02c75979f3a21818187764e --> 6e31b041bad7c24fa5b0a793ff20890b +end +subgraph 49272b4d054d834d0dfd08d62360a489[dffml_operations_innersource.operations:jenkinsfiles] +style 49272b4d054d834d0dfd08d62360a489 fill:#fff4de,stroke:#cece71 +a31545bdef7e66159d0b56861e4a4fa3[dffml_operations_innersource.operations:jenkinsfiles] +449ec8a512ad1a002c5bbbd0fc8294e9(repo) +449ec8a512ad1a002c5bbbd0fc8294e9 --> a31545bdef7e66159d0b56861e4a4fa3 +4963673c5f8ef045573769c58fc54a77(result) +a31545bdef7e66159d0b56861e4a4fa3 --> 4963673c5f8ef045573769c58fc54a77 +end +subgraph 3ab6f933ff2c5d1c31f5acce50ace507[dffml_operations_innersource.operations:readme_present] +style 3ab6f933ff2c5d1c31f5acce50ace507 fill:#fff4de,stroke:#cece71 +ae6634d141e4d989b0f53fd3b849b101[dffml_operations_innersource.operations:readme_present] +4d289d268d52d6fb5795893363300585(repo) +4d289d268d52d6fb5795893363300585 --> ae6634d141e4d989b0f53fd3b849b101 +65fd35d17d8a7e96c9f7e6aaedb75e3c(result) +ae6634d141e4d989b0f53fd3b849b101 --> 65fd35d17d8a7e96c9f7e6aaedb75e3c +end +subgraph da39b149b9fed20f273450b47a0b65f4[dffml_operations_innersource.operations:security_present] +style da39b149b9fed20f273450b47a0b65f4 fill:#fff4de,stroke:#cece71 +c8921544f4665e73080cb487aef7de94[dffml_operations_innersource.operations:security_present] +e682bbcfad20caaab15e4220c81e9239(repo) +e682bbcfad20caaab15e4220c81e9239 --> c8921544f4665e73080cb487aef7de94 +5d69c4e5b3601abbd692ade806dcdf5f(result) 
+c8921544f4665e73080cb487aef7de94 --> 5d69c4e5b3601abbd692ade806dcdf5f +end +subgraph 062b8882104862540d584516edc60008[dffml_operations_innersource.operations:support_present] +style 062b8882104862540d584516edc60008 fill:#fff4de,stroke:#cece71 +5cc75c20aee40e815abf96726508b66d[dffml_operations_innersource.operations:support_present] +f0e4cd91ca4f6b278478180a188a2f5f(repo) +f0e4cd91ca4f6b278478180a188a2f5f --> 5cc75c20aee40e815abf96726508b66d +46bd597a57e034f669df18ac9ae0a153(result) +5cc75c20aee40e815abf96726508b66d --> 46bd597a57e034f669df18ac9ae0a153 +end +subgraph 208d072a660149b8e7b7e55de1b6d4dd[git_commits] +style 208d072a660149b8e7b7e55de1b6d4dd fill:#fff4de,stroke:#cece71 +90b953c5527ed3a579912eea8b02b1be[git_commits] +e0d40a3d87e4946fdf517eaa40848e39(branch) +e0d40a3d87e4946fdf517eaa40848e39 --> 90b953c5527ed3a579912eea8b02b1be +44051d3d0587f293a2f36fb2fca3986e(repo) +44051d3d0587f293a2f36fb2fca3986e --> 90b953c5527ed3a579912eea8b02b1be +80b9ea20367299aca462989eb0356ccf(start_end) +80b9ea20367299aca462989eb0356ccf --> 90b953c5527ed3a579912eea8b02b1be +f75e51a2fca4258c207b5473f62e53e0(commits) +90b953c5527ed3a579912eea8b02b1be --> f75e51a2fca4258c207b5473f62e53e0 +end +subgraph a6fadf4f2f5031106e26cfc42fa08fcd[git_repo_author_lines_for_dates] +style a6fadf4f2f5031106e26cfc42fa08fcd fill:#fff4de,stroke:#cece71 +0afa2b3dbc72afa67170525d1d7532d7[git_repo_author_lines_for_dates] +3396a58cd186eda4908308395f2421c4(branch) +3396a58cd186eda4908308395f2421c4 --> 0afa2b3dbc72afa67170525d1d7532d7 +5ca6153629c6af49e61eb6d5c95c64f2(repo) +5ca6153629c6af49e61eb6d5c95c64f2 --> 0afa2b3dbc72afa67170525d1d7532d7 +fef3455ecf4fc7a993cb14c43d4d345f(start_end) +fef3455ecf4fc7a993cb14c43d4d345f --> 0afa2b3dbc72afa67170525d1d7532d7 +3bf05667f7df95bb2ae3b614ea998cff(author_lines) +0afa2b3dbc72afa67170525d1d7532d7 --> 3bf05667f7df95bb2ae3b614ea998cff +end +subgraph 2a6fb4d7ae016ca95fcfc061d3d1b8ab[git_repo_checkout] +style 2a6fb4d7ae016ca95fcfc061d3d1b8ab fill:#fff4de,stroke:#cece71 
+02de40331374616f64ba4a92fbb33edd[git_repo_checkout] +2b82220f7c12c2e39d2dd6330ec875bd(commit) +2b82220f7c12c2e39d2dd6330ec875bd --> 02de40331374616f64ba4a92fbb33edd +95dc6c133455588bd30b1116c857b624(repo) +95dc6c133455588bd30b1116c857b624 --> 02de40331374616f64ba4a92fbb33edd +c762e289fa4f1cd4c4d96b57422f2a81(repo) +02de40331374616f64ba4a92fbb33edd --> c762e289fa4f1cd4c4d96b57422f2a81 +end +subgraph d9401f19394958bb1ad2dd4dfc37fa79[git_repo_commit_from_date] +style d9401f19394958bb1ad2dd4dfc37fa79 fill:#fff4de,stroke:#cece71 +7bbb97768b34f207c34c1f4721708675[git_repo_commit_from_date] +ba10b1d34771f904ff181cb361864ab2(branch) +ba10b1d34771f904ff181cb361864ab2 --> 7bbb97768b34f207c34c1f4721708675 +13e4349f6f7f4c9f65ae38767fab1bd5(date) +13e4349f6f7f4c9f65ae38767fab1bd5 --> 7bbb97768b34f207c34c1f4721708675 +0c19b6fe88747ef09defde05a60e8d84(repo) +0c19b6fe88747ef09defde05a60e8d84 --> 7bbb97768b34f207c34c1f4721708675 +4941586112b4011d0c72c6264b816db4(commit) +7bbb97768b34f207c34c1f4721708675 --> 4941586112b4011d0c72c6264b816db4 +end +subgraph d3d91578caf34c0ae944b17853783406[git_repo_default_branch] +style d3d91578caf34c0ae944b17853783406 fill:#fff4de,stroke:#cece71 +546062a96122df465d2631f31df4e9e3[git_repo_default_branch] +181f1b33df4d795fbad2911ec7087e86(repo) +181f1b33df4d795fbad2911ec7087e86 --> 546062a96122df465d2631f31df4e9e3 +57651c1bcd24b794dfc8d1794ab556d5(branch) +546062a96122df465d2631f31df4e9e3 --> 57651c1bcd24b794dfc8d1794ab556d5 +5ed1ab77e726d7efdcc41e9e2f8039c6(remote) +546062a96122df465d2631f31df4e9e3 --> 5ed1ab77e726d7efdcc41e9e2f8039c6 +4c3cdd5f15b7a846d291aac089e8a622{no_git_branch_given} +4c3cdd5f15b7a846d291aac089e8a622 --> 546062a96122df465d2631f31df4e9e3 +end +subgraph f9155f693f3d5c1dd132e4f9e32175b8[git_repo_release] +style f9155f693f3d5c1dd132e4f9e32175b8 fill:#fff4de,stroke:#cece71 +f01273bde2638114cff25a747963223e[git_repo_release] +a5df26b9f1fb4360aac38ee7ad6c5041(branch) +a5df26b9f1fb4360aac38ee7ad6c5041 --> 
f01273bde2638114cff25a747963223e +84255574141c7ee6735c88c70cb4dc35(repo) +84255574141c7ee6735c88c70cb4dc35 --> f01273bde2638114cff25a747963223e +b2e4d6aa4a5bfba38584dc028dfc35b8(start_end) +b2e4d6aa4a5bfba38584dc028dfc35b8 --> f01273bde2638114cff25a747963223e +2cd7c2339d5e783198a219f02af0240a(present) +f01273bde2638114cff25a747963223e --> 2cd7c2339d5e783198a219f02af0240a +end +subgraph b121cc70dccc771127b429709d55d6d5[lines_of_code_by_language] +style b121cc70dccc771127b429709d55d6d5 fill:#fff4de,stroke:#cece71 +ef6d613ca7855a13865933156c79ddea[lines_of_code_by_language] +0b781c240b2945323081606938fdf136(repo) +0b781c240b2945323081606938fdf136 --> ef6d613ca7855a13865933156c79ddea +e51defd3debc1237bf64e6ae611595f7(lines_by_language) +ef6d613ca7855a13865933156c79ddea --> e51defd3debc1237bf64e6ae611595f7 +f5eb786f700f1aefd37023db219961a1{str} +f5eb786f700f1aefd37023db219961a1 --> ef6d613ca7855a13865933156c79ddea +end +subgraph 35551a739c7d12be0fed88e1d92a296c[lines_of_code_to_comments] +style 35551a739c7d12be0fed88e1d92a296c fill:#fff4de,stroke:#cece71 +b6e1f853d077365deddea22b2fdb890d[lines_of_code_to_comments] +669759049f3ac6927280566ef45cf980(langs) +669759049f3ac6927280566ef45cf980 --> b6e1f853d077365deddea22b2fdb890d +850cdec03e4988f119a67899cbc5f311(code_to_comment_ratio) +b6e1f853d077365deddea22b2fdb890d --> 850cdec03e4988f119a67899cbc5f311 +end +subgraph 00b5efb50d0353b48966d833eabb1757[make_quarters] +style 00b5efb50d0353b48966d833eabb1757 fill:#fff4de,stroke:#cece71 +7f20bd2c94ecbd47ab6bd88673c7174f[make_quarters] +89dd142dfced4933070ebf4ffaff2630(number) +89dd142dfced4933070ebf4ffaff2630 --> 7f20bd2c94ecbd47ab6bd88673c7174f +224e033ecd73401fc95efaa7d7fa799b(quarters) +7f20bd2c94ecbd47ab6bd88673c7174f --> 224e033ecd73401fc95efaa7d7fa799b +end +subgraph 87b1836daeb62eee5488373bd36b0c48[quarters_back_to_date] +style 87b1836daeb62eee5488373bd36b0c48 fill:#fff4de,stroke:#cece71 +9dc9f9feff38d8f5dd9388d3a60e74c0[quarters_back_to_date] 
+00bf6f65f7fa0d1ffce8e87585fae1b5(date) +00bf6f65f7fa0d1ffce8e87585fae1b5 --> 9dc9f9feff38d8f5dd9388d3a60e74c0 +8a2fb544746a0e8f0a8984210e6741dc(number) +8a2fb544746a0e8f0a8984210e6741dc --> 9dc9f9feff38d8f5dd9388d3a60e74c0 +cf114d5eea4795cef497592d0632bad7(date) +9dc9f9feff38d8f5dd9388d3a60e74c0 --> cf114d5eea4795cef497592d0632bad7 +9848c2c8981da29ca1cbce32c1a4e457(start_end) +9dc9f9feff38d8f5dd9388d3a60e74c0 --> 9848c2c8981da29ca1cbce32c1a4e457 +end +subgraph 6d61616898ab2c6024fd2a04faba8e02[work] +style 6d61616898ab2c6024fd2a04faba8e02 fill:#fff4de,stroke:#cece71 +67e92c8765a9bc7fb2d335c459de9eb5[work] +91794b0e2b5307720bed41f22724c339(author_lines) +91794b0e2b5307720bed41f22724c339 --> 67e92c8765a9bc7fb2d335c459de9eb5 +8fd602a64430dd860b0a280217d8ccef(work) +67e92c8765a9bc7fb2d335c459de9eb5 --> 8fd602a64430dd860b0a280217d8ccef +end +1f6ba749c4b65c55218b968bf308e4e2 --> 7440e73a8e8f864097f42162b74f2762 +7ec43cbbf66e6d893180645d5e929bb4(seed
URL) +style 7ec43cbbf66e6d893180645d5e929bb4 fill:#f6dbf9,stroke:#a178ca +7ec43cbbf66e6d893180645d5e929bb4 --> 7440e73a8e8f864097f42162b74f2762 +1f6ba749c4b65c55218b968bf308e4e2 --> eed77b9eea541e0c378c67395351099c +7ec43cbbf66e6d893180645d5e929bb4(seed
URL) +style 7ec43cbbf66e6d893180645d5e929bb4 fill:#f6dbf9,stroke:#a178ca +7ec43cbbf66e6d893180645d5e929bb4 --> eed77b9eea541e0c378c67395351099c +a6ed501edbf561fda49a0a0a3ca310f0(seed
git_repo_ssh_key) +style a6ed501edbf561fda49a0a0a3ca310f0 fill:#f6dbf9,stroke:#a178ca +a6ed501edbf561fda49a0a0a3ca310f0 --> 8b5928cd265dd2c44d67d076f60c8b05 +8e39b501b41c5d0e4596318f80a03210 --> 6a44de06a4a3518b939b27c790f6cdce +3bf05667f7df95bb2ae3b614ea998cff --> 0637dcbe07cd05b96d0a6a2dfbb0c5ff +4e1d5ea96e050e46ebf95ebc0713d54c --> 1fc5390b128a11a95280a89ad371a5ae +0690fdb25283b1e0a09016a28aa08c08(seed
git_grep_search) +style 0690fdb25283b1e0a09016a28aa08c08 fill:#f6dbf9,stroke:#a178ca +0690fdb25283b1e0a09016a28aa08c08 --> cc134251a8bdd1d0944ea69eafc239a4 +090b151d70cc5b37562b42c64cb16bb0(seed
GitHubRepoID) +style 090b151d70cc5b37562b42c64cb16bb0 fill:#f6dbf9,stroke:#a178ca +090b151d70cc5b37562b42c64cb16bb0 --> d2bc011260868bff46d1a206c404a549 +c762e289fa4f1cd4c4d96b57422f2a81 --> 847cd99cca177936d533aaa4918c6699 +c762e289fa4f1cd4c4d96b57422f2a81 --> f333b126c62bdbf832dddf105278d218 +c762e289fa4f1cd4c4d96b57422f2a81 --> 2a1ae8bcc9add3c42e071d0557e98b1c +c762e289fa4f1cd4c4d96b57422f2a81 --> caaae91348f7c892daa1d05fbd221352 +c762e289fa4f1cd4c4d96b57422f2a81 --> 37b63c13bc63cddeaba57cee5dc3f613 +c762e289fa4f1cd4c4d96b57422f2a81 --> 449ec8a512ad1a002c5bbbd0fc8294e9 +c762e289fa4f1cd4c4d96b57422f2a81 --> 4d289d268d52d6fb5795893363300585 +c762e289fa4f1cd4c4d96b57422f2a81 --> e682bbcfad20caaab15e4220c81e9239 +c762e289fa4f1cd4c4d96b57422f2a81 --> f0e4cd91ca4f6b278478180a188a2f5f +57651c1bcd24b794dfc8d1794ab556d5 --> e0d40a3d87e4946fdf517eaa40848e39 +4e1d5ea96e050e46ebf95ebc0713d54c --> 44051d3d0587f293a2f36fb2fca3986e +9848c2c8981da29ca1cbce32c1a4e457 --> 80b9ea20367299aca462989eb0356ccf +57651c1bcd24b794dfc8d1794ab556d5 --> 3396a58cd186eda4908308395f2421c4 +4e1d5ea96e050e46ebf95ebc0713d54c --> 5ca6153629c6af49e61eb6d5c95c64f2 +9848c2c8981da29ca1cbce32c1a4e457 --> fef3455ecf4fc7a993cb14c43d4d345f +4941586112b4011d0c72c6264b816db4 --> 2b82220f7c12c2e39d2dd6330ec875bd +4e1d5ea96e050e46ebf95ebc0713d54c --> 95dc6c133455588bd30b1116c857b624 +57651c1bcd24b794dfc8d1794ab556d5 --> ba10b1d34771f904ff181cb361864ab2 +cf114d5eea4795cef497592d0632bad7 --> 13e4349f6f7f4c9f65ae38767fab1bd5 +4e1d5ea96e050e46ebf95ebc0713d54c --> 0c19b6fe88747ef09defde05a60e8d84 +4e1d5ea96e050e46ebf95ebc0713d54c --> 181f1b33df4d795fbad2911ec7087e86 +2334372b57604cd06ceaf611e1c4a458(no_git_branch_given) +2334372b57604cd06ceaf611e1c4a458 --> 4c3cdd5f15b7a846d291aac089e8a622 +57651c1bcd24b794dfc8d1794ab556d5 --> a5df26b9f1fb4360aac38ee7ad6c5041 +4e1d5ea96e050e46ebf95ebc0713d54c --> 84255574141c7ee6735c88c70cb4dc35 +9848c2c8981da29ca1cbce32c1a4e457 --> b2e4d6aa4a5bfba38584dc028dfc35b8 
+c762e289fa4f1cd4c4d96b57422f2a81 --> 0b781c240b2945323081606938fdf136 +3c4eda0137cefa5452a87052978523ce --> f5eb786f700f1aefd37023db219961a1 +176c8001e30dae223370012eeb537711 --> f5eb786f700f1aefd37023db219961a1 +3f6fe14c9392820b8562f809c7e2b8b4 --> f5eb786f700f1aefd37023db219961a1 +e51defd3debc1237bf64e6ae611595f7 --> 669759049f3ac6927280566ef45cf980 +a8b3d979c7c66aeb3b753408c3da0976(seed
quarters) +style a8b3d979c7c66aeb3b753408c3da0976 fill:#f6dbf9,stroke:#a178ca +a8b3d979c7c66aeb3b753408c3da0976 --> 89dd142dfced4933070ebf4ffaff2630 +e17cbcbbf2d11ed5ce43603779758076 --> 00bf6f65f7fa0d1ffce8e87585fae1b5 +224e033ecd73401fc95efaa7d7fa799b --> 8a2fb544746a0e8f0a8984210e6741dc +3bf05667f7df95bb2ae3b614ea998cff --> 91794b0e2b5307720bed41f22724c339 +``` + +
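The exported `alice_shouldi_contribute.json` is plain JSON, so it can be inspected without DFFML installed. A minimal sketch: the small inline `dataflow` dict below is a hypothetical stand-in that mirrors the export's layout (top-level `definitions` and `flow`, where each operation's inputs map input names to a list of origins, either the literal string `seed` or a `{producing_operation: output_name}` pair), and the helper lists which operations consume seed inputs.

```python
import json

# Hypothetical stand-in for the exported dataflow JSON; the real file comes
# from `dffml service dev export alice.cli:ALICE_COLLECTOR_DATAFLOW`.
dataflow = json.loads("""
{
  "definitions": {
    "URL": {"name": "URL", "primitive": "str"}
  },
  "flow": {
    "clone_git_repo": {
      "conditions": [{"check_if_valid_git_repository_URL": "valid"}],
      "inputs": {"URL": ["seed"], "ssh_key": ["seed"]}
    },
    "count_authors": {
      "inputs": {
        "author_lines": [{"git_repo_author_lines_for_dates": "author_lines"}]
      }
    }
  }
}
""")

def operations_taking_seed(flow: dict) -> list:
    """Names of operations with at least one input sourced from seed."""
    return sorted(
        name
        for name, config in flow.items()
        if any("seed" in origins for origins in config.get("inputs", {}).values())
    )

print(operations_taking_seed(dataflow["flow"]))  # → ['clone_git_repo']
```

The same walk over the real export would show which inputs (repo URL / GitHub repo ID, grep search, quarters) an overlay or caller is expected to supply as seed values.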
+Full dataflow + +```yaml +configs: + dffml_operations_innersource.cli:ensure_tokei: + cache_dir: .tools/open-architecture/innersource/.cache/tokei + platform_urls: + Darwin: + expected_hash: 8c8a1d8d8dd4d8bef93dabf5d2f6e27023777f8553393e269765d7ece85e68837cba4374a2615d83f071dfae22ba40e2 + url: https://github.com/XAMPPRocky/tokei/releases/download/v10.1.1/tokei-v10.1.1-x86_64-apple-darwin.tar.gz + Linux: + expected_hash: 22699e16e71f07ff805805d26ee86ecb9b1052d7879350f7eb9ed87beb0e6b84fbb512963d01b75cec8e80532e4ea29a + url: https://github.com/XAMPPRocky/tokei/releases/download/v10.1.1/tokei-v10.1.1-x86_64-unknown-linux-gnu.tar.gz +definitions: + ActionYAMLFileWorkflowUnixStylePath: + links: + - - - name + - str + - - primitive + - str + name: ActionYAMLFileWorkflowUnixStylePath + primitive: str + CICDLibrary: + links: + - - - name + - dict + - - primitive + - map + name: CICDLibrary + primitive: dict + FileCodeOfConductPresent: + links: + - - - name + - bool + - - primitive + - bool + name: FileCodeOfConductPresent + primitive: bool + FileContributingPresent: + links: + - - - name + - bool + - - primitive + - bool + name: FileContributingPresent + primitive: bool + FileReadmePresent: + links: + - - - name + - bool + - - primitive + - bool + name: FileReadmePresent + primitive: bool + FileSecurityPresent: + links: + - - - name + - bool + - - primitive + - bool + name: FileSecurityPresent + primitive: bool + FileSupportPresent: + links: + - - - name + - bool + - - primitive + - bool + name: FileSupportPresent + primitive: bool + GitHubActionsWorkflowUnixStylePath: + links: + - - - name + - str + - - primitive + - str + name: GitHubActionsWorkflowUnixStylePath + primitive: str + GitHubRepoID: + links: + - - - name + - str + - - primitive + - str + name: GitHubRepoID + primitive: str + GroovyFileWorkflowUnixStylePath: + links: + - - - name + - str + - - primitive + - str + name: GroovyFileWorkflowUnixStylePath + primitive: str + IsCICDGitHubActionsLibrary: + links: + - 
- - name + - bool + - - primitive + - bool + name: IsCICDGitHubActionsLibrary + primitive: bool + IsCICDJenkinsLibrary: + links: + - - - name + - bool + - - primitive + - bool + name: IsCICDJenkinsLibrary + primitive: bool + JenkinsfileWorkflowUnixStylePath: + links: + - - - name + - str + - - primitive + - str + name: JenkinsfileWorkflowUnixStylePath + primitive: str + URL: + links: + - - - name + - str + - - primitive + - str + name: URL + primitive: str + author_count: + name: author_count + primitive: int + author_line_count: + name: author_line_count + primitive: Dict[str, int] + bool: + name: bool + primitive: bool + commit_count: + name: commit_count + primitive: int + date: + name: date + primitive: string + date_pair: + name: date_pair + primitive: List[date] + git_branch: + links: + - - - name + - str + - - primitive + - str + name: git_branch + primitive: str + git_commit: + name: git_commit + primitive: string + git_grep_found: + name: git_grep_found + primitive: string + git_grep_search: + name: git_grep_search + primitive: string + git_remote: + links: + - - - name + - str + - - primitive + - str + name: git_remote + primitive: str + git_repo_ssh_key: + default: null + name: git_repo_ssh_key + primitive: string + git_repository: + lock: true + name: git_repository + primitive: Dict[str, str] + spec: + defaults: + URL: null + name: GitRepoSpec + types: + URL: str + directory: str + subspec: false + git_repository_checked_out: + lock: true + name: git_repository_checked_out + primitive: Dict[str, str] + spec: + defaults: + URL: null + commit: null + name: GitRepoCheckedOutSpec + types: + URL: str + commit: str + directory: str + subspec: false + group_by_output: + name: group_by_output + primitive: Dict[str, List[Any]] + group_by_spec: + name: group_by_spec + primitive: Dict[str, Any] + language_to_comment_ratio: + name: language_to_comment_ratio + primitive: int + lines_by_language_count: + name: lines_by_language_count + primitive: Dict[str, Dict[str, 
int]] + no_git_branch_given: + name: no_git_branch_given + primitive: boolean + quarter: + name: quarter + primitive: int + quarter_start_date: + name: quarter_start_date + primitive: int + quarters: + name: quarters + primitive: int + release_within_period: + name: release_within_period + primitive: bool + str: + name: str + primitive: str + valid_git_repository_URL: + name: valid_git_repository_URL + primitive: boolean + work_spread: + name: work_spread + primitive: int +flow: + alice.shouldi.contribute.cicd:cicd_action_library: + inputs: + action_file_paths: + - dffml_operations_innersource.operations:action_yml_files: result + alice.shouldi.contribute.cicd:cicd_jenkins_library: + inputs: + groovy_file_paths: + - dffml_operations_innersource.operations:groovy_files: result + alice.shouldi.contribute.cicd:cicd_library: + inputs: + cicd_action_library: + - alice.shouldi.contribute.cicd:cicd_action_library: result + cicd_jenkins_library: + - alice.shouldi.contribute.cicd:cicd_jenkins_library: result + check_if_valid_git_repository_URL: + inputs: + URL: + - dffml_operations_innersource.cli:github_repo_id_to_clone_url: result + - seed + cleanup_git_repo: + inputs: + repo: + - clone_git_repo: repo + clone_git_repo: + conditions: + - check_if_valid_git_repository_URL: valid + inputs: + URL: + - dffml_operations_innersource.cli:github_repo_id_to_clone_url: result + - seed + ssh_key: + - seed + count_authors: + inputs: + author_lines: + - git_repo_author_lines_for_dates: author_lines + dffml_feature_git.feature.operations:git_grep: + inputs: + repo: + - clone_git_repo: repo + search: + - seed + dffml_operations_innersource.cli:ensure_tokei: + inputs: {} + dffml_operations_innersource.cli:github_repo_id_to_clone_url: + inputs: + repo_id: + - seed + dffml_operations_innersource.operations:action_yml_files: + inputs: + repo: + - git_repo_checkout: repo + dffml_operations_innersource.operations:badge_maintained: + conditions: + - 
dffml_operations_innersource.operations:maintained: result + - dffml_operations_innersource.operations:unmaintained: result + inputs: {} + dffml_operations_innersource.operations:badge_unmaintained: + conditions: + - dffml_operations_innersource.operations:maintained: result + - dffml_operations_innersource.operations:unmaintained: result + inputs: {} + dffml_operations_innersource.operations:code_of_conduct_present: + inputs: + repo: + - git_repo_checkout: repo + dffml_operations_innersource.operations:contributing_present: + inputs: + repo: + - git_repo_checkout: repo + dffml_operations_innersource.operations:get_current_datetime_as_git_date: + inputs: {} + dffml_operations_innersource.operations:github_workflows: + inputs: + repo: + - git_repo_checkout: repo + dffml_operations_innersource.operations:groovy_files: + inputs: + repo: + - git_repo_checkout: repo + dffml_operations_innersource.operations:jenkinsfiles: + inputs: + repo: + - git_repo_checkout: repo + dffml_operations_innersource.operations:maintained: + inputs: + results: + - group_by: output + dffml_operations_innersource.operations:readme_present: + inputs: + repo: + - git_repo_checkout: repo + dffml_operations_innersource.operations:security_present: + inputs: + repo: + - git_repo_checkout: repo + dffml_operations_innersource.operations:support_present: + inputs: + repo: + - git_repo_checkout: repo + dffml_operations_innersource.operations:unmaintained: + inputs: + results: + - group_by: output + git_commits: + inputs: + branch: + - git_repo_default_branch: branch + repo: + - clone_git_repo: repo + start_end: + - quarters_back_to_date: start_end + git_repo_author_lines_for_dates: + inputs: + branch: + - git_repo_default_branch: branch + repo: + - clone_git_repo: repo + start_end: + - quarters_back_to_date: start_end + git_repo_checkout: + inputs: + commit: + - git_repo_commit_from_date: commit + repo: + - clone_git_repo: repo + git_repo_commit_from_date: + inputs: + branch: + - 
git_repo_default_branch: branch + date: + - quarters_back_to_date: date + repo: + - clone_git_repo: repo + git_repo_default_branch: + conditions: + - seed + inputs: + repo: + - clone_git_repo: repo + git_repo_release: + inputs: + branch: + - git_repo_default_branch: branch + repo: + - clone_git_repo: repo + start_end: + - quarters_back_to_date: start_end + group_by: + inputs: + spec: + - seed + lines_of_code_by_language: + conditions: + - dffml_operations_innersource.operations:badge_maintained: result + - dffml_operations_innersource.operations:badge_unmaintained: result + - dffml_operations_innersource.cli:ensure_tokei: result + inputs: + repo: + - git_repo_checkout: repo + lines_of_code_to_comments: + inputs: + langs: + - lines_of_code_by_language: lines_by_language + make_quarters: + inputs: + number: + - seed + quarters_back_to_date: + inputs: + date: + - dffml_operations_innersource.operations:get_current_datetime_as_git_date: result + number: + - make_quarters: quarters + work: + inputs: + author_lines: + - git_repo_author_lines_for_dates: author_lines +linked: true +operations: + alice.shouldi.contribute.cicd:cicd_action_library: + inputs: + action_file_paths: ActionYAMLFileWorkflowUnixStylePath + name: alice.shouldi.contribute.cicd:cicd_action_library + outputs: + result: IsCICDGitHubActionsLibrary + retry: 0 + stage: output + alice.shouldi.contribute.cicd:cicd_jenkins_library: + inputs: + groovy_file_paths: GroovyFileWorkflowUnixStylePath + name: alice.shouldi.contribute.cicd:cicd_jenkins_library + outputs: + result: IsCICDJenkinsLibrary + retry: 0 + stage: output + alice.shouldi.contribute.cicd:cicd_library: + inputs: + cicd_action_library: IsCICDGitHubActionsLibrary + cicd_jenkins_library: IsCICDJenkinsLibrary + name: alice.shouldi.contribute.cicd:cicd_library + outputs: + result: CICDLibrary + retry: 0 + stage: output + check_if_valid_git_repository_URL: + inputs: + URL: URL + name: check_if_valid_git_repository_URL + outputs: + valid: 
valid_git_repository_URL + retry: 0 + stage: processing + cleanup_git_repo: + inputs: + repo: git_repository + name: cleanup_git_repo + outputs: {} + retry: 0 + stage: cleanup + clone_git_repo: + conditions: + - valid_git_repository_URL + inputs: + URL: URL + ssh_key: git_repo_ssh_key + name: clone_git_repo + outputs: + repo: git_repository + retry: 0 + stage: processing + count_authors: + inputs: + author_lines: author_line_count + name: count_authors + outputs: + authors: author_count + retry: 0 + stage: processing + dffml_feature_git.feature.operations:git_grep: + inputs: + repo: git_repository + search: git_grep_search + name: dffml_feature_git.feature.operations:git_grep + outputs: + found: git_grep_found + retry: 0 + stage: processing + dffml_operations_innersource.cli:ensure_tokei: + inputs: {} + name: dffml_operations_innersource.cli:ensure_tokei + outputs: + result: str + retry: 0 + stage: processing + dffml_operations_innersource.cli:github_repo_id_to_clone_url: + inputs: + repo_id: GitHubRepoID + name: dffml_operations_innersource.cli:github_repo_id_to_clone_url + outputs: + result: URL + retry: 0 + stage: processing + dffml_operations_innersource.operations:action_yml_files: + expand: + - result + inputs: + repo: git_repository_checked_out + name: dffml_operations_innersource.operations:action_yml_files + outputs: + result: ActionYAMLFileWorkflowUnixStylePath + retry: 0 + stage: processing + dffml_operations_innersource.operations:badge_maintained: + conditions: + - bool + inputs: {} + name: dffml_operations_innersource.operations:badge_maintained + outputs: + result: str + retry: 0 + stage: output + dffml_operations_innersource.operations:badge_unmaintained: + conditions: + - bool + inputs: {} + name: dffml_operations_innersource.operations:badge_unmaintained + outputs: + result: str + retry: 0 + stage: output + dffml_operations_innersource.operations:code_of_conduct_present: + inputs: + repo: git_repository_checked_out + name: 
dffml_operations_innersource.operations:code_of_conduct_present + outputs: + result: FileCodeOfConductPresent + retry: 0 + stage: processing + dffml_operations_innersource.operations:contributing_present: + inputs: + repo: git_repository_checked_out + name: dffml_operations_innersource.operations:contributing_present + outputs: + result: FileContributingPresent + retry: 0 + stage: processing + dffml_operations_innersource.operations:get_current_datetime_as_git_date: + inputs: {} + name: dffml_operations_innersource.operations:get_current_datetime_as_git_date + outputs: + result: quarter_start_date + retry: 0 + stage: processing + dffml_operations_innersource.operations:github_workflows: + expand: + - result + inputs: + repo: git_repository_checked_out + name: dffml_operations_innersource.operations:github_workflows + outputs: + result: GitHubActionsWorkflowUnixStylePath + retry: 0 + stage: processing + dffml_operations_innersource.operations:groovy_files: + expand: + - result + inputs: + repo: git_repository_checked_out + name: dffml_operations_innersource.operations:groovy_files + outputs: + result: GroovyFileWorkflowUnixStylePath + retry: 0 + stage: processing + dffml_operations_innersource.operations:jenkinsfiles: + expand: + - result + inputs: + repo: git_repository_checked_out + name: dffml_operations_innersource.operations:jenkinsfiles + outputs: + result: JenkinsfileWorkflowUnixStylePath + retry: 0 + stage: processing + dffml_operations_innersource.operations:maintained: + inputs: + results: group_by_output + name: dffml_operations_innersource.operations:maintained + outputs: + result: bool + retry: 0 + stage: output + dffml_operations_innersource.operations:readme_present: + inputs: + repo: git_repository_checked_out + name: dffml_operations_innersource.operations:readme_present + outputs: + result: FileReadmePresent + retry: 0 + stage: processing + dffml_operations_innersource.operations:security_present: + inputs: + repo: git_repository_checked_out + 
name: dffml_operations_innersource.operations:security_present + outputs: + result: FileSecurityPresent + retry: 0 + stage: processing + dffml_operations_innersource.operations:support_present: + inputs: + repo: git_repository_checked_out + name: dffml_operations_innersource.operations:support_present + outputs: + result: FileSupportPresent + retry: 0 + stage: processing + dffml_operations_innersource.operations:unmaintained: + inputs: + results: group_by_output + name: dffml_operations_innersource.operations:unmaintained + outputs: + result: bool + retry: 0 + stage: output + git_commits: + inputs: + branch: git_branch + repo: git_repository + start_end: date_pair + name: git_commits + outputs: + commits: commit_count + retry: 0 + stage: processing + git_repo_author_lines_for_dates: + inputs: + branch: git_branch + repo: git_repository + start_end: date_pair + name: git_repo_author_lines_for_dates + outputs: + author_lines: author_line_count + retry: 0 + stage: processing + git_repo_checkout: + inputs: + commit: git_commit + repo: git_repository + name: git_repo_checkout + outputs: + repo: git_repository_checked_out + retry: 0 + stage: processing + git_repo_commit_from_date: + inputs: + branch: git_branch + date: date + repo: git_repository + name: git_repo_commit_from_date + outputs: + commit: git_commit + retry: 0 + stage: processing + git_repo_default_branch: + conditions: + - no_git_branch_given + inputs: + repo: git_repository + name: git_repo_default_branch + outputs: + branch: git_branch + remote: git_remote + retry: 0 + stage: processing + git_repo_release: + inputs: + branch: git_branch + repo: git_repository + start_end: date_pair + name: git_repo_release + outputs: + present: release_within_period + retry: 0 + stage: processing + group_by: + inputs: + spec: group_by_spec + name: group_by + outputs: + output: group_by_output + retry: 0 + stage: output + lines_of_code_by_language: + conditions: + - str + inputs: + repo: git_repository_checked_out + name: 
lines_of_code_by_language + outputs: + lines_by_language: lines_by_language_count + retry: 0 + stage: processing + lines_of_code_to_comments: + inputs: + langs: lines_by_language_count + name: lines_of_code_to_comments + outputs: + code_to_comment_ratio: language_to_comment_ratio + retry: 0 + stage: processing + make_quarters: + expand: + - quarters + inputs: + number: quarters + name: make_quarters + outputs: + quarters: quarter + retry: 0 + stage: processing + quarters_back_to_date: + expand: + - date + - start_end + inputs: + date: quarter_start_date + number: quarter + name: quarters_back_to_date + outputs: + date: date + start_end: date_pair + retry: 0 + stage: processing + work: + inputs: + author_lines: author_line_count + name: work + outputs: + work: work_spread + retry: 0 + stage: processing +seed: +- definition: quarters + origin: seed + value: 10 +- definition: no_git_branch_given + origin: seed + value: true +- definition: group_by_spec + origin: seed + value: + ActionYAMLFileWorkflowUnixStylePath: + by: quarter + group: ActionYAMLFileWorkflowUnixStylePath + nostrict: true + FileCodeOfConductPresent: + by: quarter + group: FileCodeOfConductPresent + nostrict: true + FileContributingPresent: + by: quarter + group: FileContributingPresent + nostrict: true + FileReadmePresent: + by: quarter + group: FileReadmePresent + nostrict: true + FileSecurityPresent: + by: quarter + group: FileSecurityPresent + nostrict: true + FileSupportPresent: + by: quarter + group: FileSupportPresent + nostrict: true + GitHubActionsWorkflowUnixStylePath: + by: quarter + group: GitHubActionsWorkflowUnixStylePath + nostrict: true + GroovyFileWorkflowUnixStylePath: + by: quarter + group: GroovyFileWorkflowUnixStylePath + nostrict: true + JenkinsfileWorkflowUnixStylePath: + by: quarter + group: JenkinsfileWorkflowUnixStylePath + nostrict: true + author_line_count: + by: quarter + group: author_line_count + nostrict: true + commit_shas: + by: quarter + group: git_commit + nostrict: 
true + release_within_period: + by: quarter + group: release_within_period + nostrict: true + +``` + +
\ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0087/index.md b/docs/discussions/alice_engineering_comms/0087/index.md new file mode 100644 index 0000000000..eb6e369fe4 --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0087/index.md @@ -0,0 +1,4 @@ +# 2022-11-15 Engineering Logs + +- Exemplary docs + - https://cve-bin-tool.readthedocs.io/en/latest/CONTRIBUTING.html#running-tests \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0087/reply_0000.md b/docs/discussions/alice_engineering_comms/0087/reply_0000.md new file mode 100644 index 0000000000..9b7a140aff --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0087/reply_0000.md @@ -0,0 +1,254 @@ + ## 2022-11-15 @pdxjohnny Engineering Logs + +- https://docs.joinmastodon.org/spec/activitypub/ +- https://docs.joinmastodon.org/dev/setup/ + - > In the development environment, Mastodon will use PostgreSQL as the currently signed-in Linux user using the `ident` method, which usually works out of the box. The one command you need to run is rails `db:setup` which will create the databases `mastodon_development` and `mastodon_test`, load the schema into them, and then create seed data defined in `db/seed.rb` in `mastodon_development`. The only seed data is an admin account with the credentials `admin@localhost:3000` / `mastodonadmin`. + - We'll change the `.env.production` user to match +- https://github.com/felx/mastodon-documentation/blob/master/Running-Mastodon/Docker-Guide.md + +**.env.production** + +```bash +# Generated with mastodon:setup on 2022-11-15 14:37:27 UTC + +# Some variables in this file will be interpreted differently whether you are +# using docker-compose or not. 
+ +LOCAL_DOMAIN=localhost +SINGLE_USER_MODE=false +SECRET_KEY_BASE=1c60ddccf21afd66e355a85621767feb1ffe47d1b9ac9e8bab5ef283a0fa6c1cc9e7015409bb645551ef7ab4b9f09aed90069640e91500f0009887509d2e1f4f +OTP_SECRET=376e8655790cc05d973d6d427e1e37f98cee9ebc91f6c33eda6243b650fd8f8531a34a43d4c0d62940db6064ea8bdce581d11ff7a22e4ec81f7ffedaad0ad26f +VAPID_PRIVATE_KEY=M7FtL40N4rJ2BtbtyWFHN9b1jaWD4x8p2Pab-FGGb3M= +VAPID_PUBLIC_KEY=BP_BPQEpiSuv0Qri0XWSr54MC0ug5hHb905PPRLufPhu13QCF3D86cW3ReFnZ411VoDB5lDfuntBmYU0Ku65oVs= +DB_HOST=db +DB_PORT=5432 +DB_NAME=mastodon_development +DB_USER=admin +DB_PASS=mastodonadmin +REDIS_HOST=redis +REDIS_PORT=6379 +REDIS_PASSWORD= +SMTP_SERVER=localhost +SMTP_PORT=25 +SMTP_AUTH_METHOD=none +SMTP_OPENSSL_VERIFY_MODE=none +SMTP_ENABLE_STARTTLS=auto +SMTP_FROM_ADDRESS=Mastodon +``` + +```console +$ grep POSTGRES_ docker-compose.yml + - 'POSTGRES_DB=mastodon_development' + - 'POSTGRES_USER=admin' + - 'POSTGRES_PASSWORD=mastodonadmin' +$ time podman-compose run -e DISABLE_DATABASE_ENVIRONMENT_CHECK=1 web rails db:setup +... ... +$ time podman-compose run web bundle exec rake db:migrate +$ podman-compose up +$ curl -H "Host: https://localhost:3000/" -v http://localhost:3000/ +* Trying 127.0.0.1:3000... +* Connected to localhost (127.0.0.1) port 3000 (#0) +> GET / HTTP/1.1 +> Host: https://localhost:3000/ +> User-Agent: curl/7.85.0 +> Accept: */* +> +* Mark bundle as not supporting multiuse +< HTTP/1.1 403 Forbidden +< Content-Type: text/html; charset=UTF-8 +< Content-Length: 0 +< +* Connection #0 to host localhost left intact +``` + +```console +$ podman-compose run web -e RAILS_ENV=production bin/tootctl accounts modify alice --role Owner +$ podman-compose run web -e RAILS_ENV=production bin/tootctl accounts create \ + alice \ + --email alice@chadig.com \ + --confirmed \ + --role Owner +``` + +- Okay giving up on Mastodon spin up, RSS feeds (+websub) probably best for SBOM and VEX + streams anyway. 
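One possible culprit for the 403 above: the curl invocation passes a full URL as the `Host` header value, while Rails host authorization (and Mastodon's `LOCAL_DOMAIN` handling) expects a bare hostname. A corrected request, assuming the compose stack above is still up, might look like:

```console
$ curl -H "Host: localhost:3000" -v http://localhost:3000/
```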
+- References + - https://github.com/BasixKOR/awesome-activitypub + - https://github.com/dariusk/rss-to-activitypub + - https://www.w3schools.com/xml/xml_rss.asp + - https://github.com/chainfeeds/RSSAggregatorforWeb3 + - Here's a possible basis for our web2 -> web3/5 + - https://github.com/RoCry/feedcooker/releases/tag/latest + - https://github.com/RoCry/feedcooker/releases/download/latest/Rust_News.xml + - https://github.com/RoCry/feedcooker/issues/1 + - This is a nice aggregator we could use in the future + - https://github.com/actionsflow/actionsflow-workflow-default + - GitHub Actions workflows can trigger from RSS feeds via this third-party framework; + not clear if it polls or not. websub and publish / serialize / configloader for + `dffml dataflow run records set` output as RSS feed? + - https://actionsflow.github.io/ + - https://mastodon.social/@pdxjohnny.rss + - Example posted below + - https://twit.social/@jr/109348004478960008 + - https://twit.social/tags/android.rss + - Very cool: Mastodon will serve RSS feeds for tags. + - This would allow us to reply to tweets with given tags + and then automatically determine provenance (see deepfake detection), + and reply with estimated provenance via SBOM / VEX with SCITT + receipts encoded into a (didme.me) image in response (or if + we can put the CBOR in a JWK claim maybe that would serialize + to a stupidly long string, then encode that to an image?) + - https://mastodon.social/tags/scitt.rss + - It would be nice if there was a multi-tag URL.
+ - Example: https://mastodon.social/tags/alice,scitt,vex.rss + - Example: https://mastodon.social/tags/scitt,vex.rss + - Example: https://mastodon.social/tags/scitt,sbom.rss + +```xml + + + + John + Public posts from @pdxjohnny@mastodon.social + https://mastodon.social/@pdxjohnny + + https://files.mastodon.social/accounts/avatars/000/032/591/original/9c6c698d572049b4.jpeg + John + https://mastodon.social/@pdxjohnny + + Tue, 15 Nov 2022 16:18:15 +0000 + https://files.mastodon.social/accounts/avatars/000/032/591/original/9c6c698d572049b4.jpeg + Mastodon v4.0.2 + + https://mastodon.social/@pdxjohnny/109348722777644811 + https://mastodon.social/@pdxjohnny/109348722777644811 + Tue, 15 Nov 2022 16:18:15 +0000 + <p>RSS VEX feeds?</p><p><a href="https://twit.social/@jr/109345573865828477" target="_blank" rel="nofollow noopener noreferrer"><span class="invisible">https://</span><span class="ellipsis">twit.social/@jr/10934557386582</span><span class="invisible">8477</span></a></p><p>2022-11-15 Engineering Logs: <a href="https://github.com/intel/dffml/discussions/1406?sort=new#discussioncomment-4146655" target="_blank" rel="nofollow noopener noreferrer"><span class="invisible">https://</span><span class="ellipsis">github.com/intel/dffml/discuss</span><span class="invisible">ions/1406?sort=new#discussioncomment-4146655</span></a></p> + + + https://mastodon.social/@pdxjohnny/109320563491316354 + https://mastodon.social/@pdxjohnny/109320563491316354 + Thu, 10 Nov 2022 16:56:58 +0000 + <p>The Alice thread continues!</p><p>We take one step further towards decentralization as we federate our way away from Twitter.</p><p>Today we&#39;re playing with SCITT and ATProto: <a href="https://github.com/intel/dffml/discussions/1406?sort=new#discussioncomment-4104302" target="_blank" rel="nofollow noopener noreferrer"><span class="invisible">https://</span><span class="ellipsis">github.com/intel/dffml/discuss</span><span 
class="invisible">ions/1406?sort=new#discussioncomment-4104302</span></a></p><p>Prev: <a href="https://twitter.com/pdxjohnny/status/1585488415864557568" target="_blank" rel="nofollow noopener noreferrer"><span class="invisible">https://</span><span class="ellipsis">twitter.com/pdxjohnny/status/1</span><span class="invisible">585488415864557568</span></a></p> + + + +``` + +- We could also httptest NIST API + - https://github.com/intel/cve-bin-tool/issues/2334 + - Looks like 7 days ago the cve-bin-tool community themselves (Terri in this case :) highlighted a similar need! + - Trying to run tests + - Need `NVD_API_KEY` + - Request via email activation flow https://nvd.nist.gov/developers/request-an-api-key + - Link in email to activation page (10 minute email websub rss? -> ATP) + - Grab the UUID (the token) off the page + +```console +$ nvd_api_key=$NVD_API_KEY LONG_TESTS=1 python -um pytest -v --log-level=DEBUG --log-cli-level=DEBUG test/test_nvd_api.py 2>&1 | gh gist create -p -d 'Failure to launch NVD API tests: https://github.com/intel/cve-bin-tool/issues/2334' +``` + +- Output of above command: https://gist.github.com/pdxjohnny/dcfaecadd743e773c8aed3e1d323e0bd + - `$ REC_TITLE="httptest NIST API: 2022-11-15 @pdxjohnny Engineering Logs: https://github.com/intel/dffml/discussions/1406?sort=new#discussioncomment-4146655" exec bash` + - https://github.com/pdxjohnny/dotfiles/blob/ccccfe8f55729bab6f00573a0b3c0358a3a77cf9/.asciinema_source + - `$ unxz -d < ~/asciinema/fedora-rec-2022-11-15T10:05:02-08:00.json.xz | python -m asciinema upload /dev/stdin` + - `$ unxz -d < $(ls ~/asciinema/$(hostname)-rec-* | tail -n 1) | python -m asciinema upload /dev/stdin` + +[![asciicast-of-failure-to-run-test_nvd_api](https://asciinema.org/a/537871.svg)](https://asciinema.org/a/537871) + +[![asciicast](https://asciinema.org/a/537888.svg)](https://asciinema.org/a/537888) + +- Got the NVD tests parameterized to versions 1 and 2.
+ +```diff +diff --git a/cve_bin_tool/nvd_api.py b/cve_bin_tool/nvd_api.py +index 6245c56..d151cd1 100644 +--- a/cve_bin_tool/nvd_api.py ++++ b/cve_bin_tool/nvd_api.py +@@ -139,7 +139,7 @@ class NVD_API: + + if self.invalid_api: + self.logger.warning( +- f'Unable to access NVD using provided API key: {self.params["apiKey"]}' ++ f'Unable to access NVD using provided API key: {self.params.get("apiKey", "NO_API_KEY_GIVEN")}' + ) + else: + if time_of_last_update: +diff --git a/test/test_nvd_api.py b/test/test_nvd_api.py +index 29f14e9..109815c 100644 +--- a/test/test_nvd_api.py ++++ b/test/test_nvd_api.py +@@ -8,6 +8,7 @@ from datetime import datetime, timedelta + from test.utils import LONG_TESTS + + import pytest ++import aiohttp + + from cve_bin_tool.cvedb import CVEDB + from cve_bin_tool.data_sources import nvd_source +@@ -42,14 +43,24 @@ class TestNVD_API: + LONG_TESTS() != 1 or not os.getenv("nvd_api_key"), + reason="NVD tests run only in long tests", + ) +- async def test_total_results_count(self): ++ @pytest.mark.parametrize( ++ "api_version, feed", ++ [ ++ ("1.0", None), ++ ("2.0", None), ++ ], ++ ) ++ async def test_total_results_count(self, api_version, feed): + """Total results should be greater than or equal to the current fetched cves""" +- nvd_api = NVD_API(api_key=os.getenv("nvd_api_key") or "") +- await nvd_api.get_nvd_params( +- time_of_last_update=datetime.now() - timedelta(days=2) +- ) +- await nvd_api.get() +- assert len(nvd_api.all_cve_entries) >= nvd_api.total_results ++ async with aiohttp.ClientSession() as session: ++ nvd_api = NVD_API(api_key=os.getenv("nvd_api_key") or "", ++ session=session) ++ nvd_api.logger.info("api_version: %s, feed: %s", api_version, feed) ++ await nvd_api.get_nvd_params( ++ time_of_last_update=datetime.now() - timedelta(days=2) ++ ) ++ await nvd_api.get() ++ assert len(nvd_api.all_cve_entries) >= nvd_api.total_results + + @pytest.mark.asyncio + @pytest.mark.skipif( +``` + 
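The diff above parameterizes the tests across NVD API versions 1.0 and 2.0; both versions page their results with the same contract, which an httptest-style mock server has to reproduce. A minimal sketch (the field names `startIndex`, `resultsPerPage`, and `totalResults` come from the public NVD API; the toy in-memory server and client below are illustrative, not cve-bin-tool code):

```python
# Illustrative sketch of NVD-style pagination. Field names follow the public
# NVD API; paginate/fetch_all are hypothetical stand-ins, not cve-bin-tool code.
def paginate(entries, start_index=0, results_per_page=2000):
    # Server side: slice one page out of the full result set.
    page = entries[start_index:start_index + results_per_page]
    return {
        "startIndex": start_index,
        "resultsPerPage": len(page),
        "totalResults": len(entries),
        "vulnerabilities": page,
    }

def fetch_all(entries, results_per_page=2000):
    # Client side: advance startIndex by resultsPerPage until totalResults.
    collected, start = [], 0
    while True:
        response = paginate(entries, start, results_per_page)
        collected.extend(response["vulnerabilities"])
        start += response["resultsPerPage"]
        if start >= response["totalResults"] or response["resultsPerPage"] == 0:
            break
    return collected
```

The `>=` assertion in the parameterized test mirrors the client-side loop here: the sum of pages should cover `totalResults`.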
+[![asciicast](https://asciinema.org/a/537921.svg)](https://asciinema.org/a/537921) + +[![asciicast](https://asciinema.org/a/537925.svg)](https://asciinema.org/a/537925) + +[![asciicast-stash-p](https://asciinema.org/a/537931.svg)](https://asciinema.org/a/537931) + +- Reverse engineering NIST API by dumping request response + +[![asciicast](https://asciinema.org/a/537936.svg)](https://asciinema.org/a/537936) + +```console +$ gh gist create -p -d 'intel/cve-bin-tool: tests: add tests for NVD 2.0 API: https://github.com/intel/cve-bin-tool/issues/2334#issuecomment-1315643093: https://github.com/intel/dffml/discussions/1406?sort=new#discussioncomment-4146655' /tmp/feed-f232077c4b0644a8f77acb0c63c3d30bb59eff3be774e3e37d00c7b15cfe95079d8d80b48fede725a2f0f19cba0c9496-params.json /tmp/feed-f232077c4b0644a8f77acb0c63c3d30bb59eff3be774e3e37d00c7b15cfe95079d8d80b48fede725a2f0f19cba0c9496.json /tmp/stats.json /tmp/feed-e459d6f8805bad4c8f3097dd5071732478d08e2a6ad50c734199bc24983f49c2d1567ea11bbf2993de662af4736113c4-params.json /tmp/feed-e459d6f8805bad4c8f3097dd5071732478d08e2a6ad50c734199bc24983f49c2d1567ea11bbf2993de662af4736113c4.json /tmp/validate-283492d554c095740c199f739dd4944bfab86a6db800993e16494209c1420061fe7c0e174570715ff7bd9132d26e9b47* +``` + +- Dumped request response format: https://gist.github.com/pdxjohnny/599b453dffc799f1c4dd8d8024b0f60e +- Started on https://github.com/pdxjohnny/httptest server + +[![asciicast](https://asciinema.org/a/537938.svg)](https://asciinema.org/a/537938) + +- TODO + - [ ] ~~Spin up Mastodon~~ + - [ ] Investigate https://docs.joinmastodon.org/spec/webfinger/#example + - [ ] NIST vuln feed as VEX/VDR API via httptest then integrate as additional vuln feed to cve-bin-tool then publish to via another project (pytss) then to rss then rss-to-activitypub and then see if that integrates with Mastodon then rss to web3/5 + - If we can get something federated working then Alice can send SBOM and VEX updates + - 
https://github.com/intel/cve-bin-tool/pull/1698 +- Future + - [ ] Reuse ephemeral ssh server spun up across data flows running on different hosts + - [ ] Document asciicast-stash-p https://asciinema.org/a/537931 as refactoring method + - [ ] Multi context logging (multiple Sources? in output query / method / data flow as class?) + - Examples + - Posting updates on status of CVE Bin Tool VEX via NVD API style feed + as well as https://github.com/intel/cve-bin-tool/issues/2334#issuecomment-1315643093 \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0088/index.md b/docs/discussions/alice_engineering_comms/0088/index.md new file mode 100644 index 0000000000..c1902f354d --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0088/index.md @@ -0,0 +1 @@ +# 2022-11-16 Engineering Logs \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0088/reply_0000.md b/docs/discussions/alice_engineering_comms/0088/reply_0000.md new file mode 100644 index 0000000000..bcee618db5 --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0088/reply_0000.md @@ -0,0 +1,136 @@ +## 2022-11-16 @pdxjohnny Engineering Logs + +- NVD API style as first way to distribute VEX. + - ActivityPub publish as well + - Websub for new notifications? Look up how Mastodon does. +- Working on cve-bin-tool https://github.com/intel/cve-bin-tool/issues/2334#issuecomment-1315643093 + - We're reverse engineering the NIST NVD API to serve VEX. + - The following logs/recordings can be useful in learning how to reverse + engineer an HTTP based protocol to implement a similar server. + - This becomes the base layer for communication in our decentralized CI/CD + aka DFFML plugin land, aka poly repo land, aka the real world, aka Wonderland. 
+ - https://github.com/intel/dffml/tree/alice/docs/tutorials/rolling_alice/0000_architecting_alice#what-is-alice + - [service: sw: src: change: notify: Service to facilitate poly repo pull model dev tooling #1315](https://github.com/intel/dffml/issues/1315#issuecomment-1066814280) + - Vuln management is a MUST-implement channel we can use for patch submission + and comms for alignment between entities. + - We're hitting this open issue while we're at it. +- Got basic stats response saved from cache working + - Cache: https://gist.github.com/pdxjohnny/599b453dffc799f1c4dd8d8024b0f60e + - Got serving the feed working; the same page requested repeatedly fails the test (as it should, since paging is broken currently; fixing that is next). + - [gist: Python example pagination client and server](https://gist.github.com/pdxjohnny/47a6ddcd122a8f693ef346153708525a) +- Side note: This asciinema was 12 MB uncut, so I had to trim it up a bit + +[![asciicast](https://asciinema.org/a/538130.svg)](https://asciinema.org/a/538130) + +- httptest NIST API single CVE import working + +[![asciicast](https://asciinema.org/a/538136.svg)](https://asciinema.org/a/538136) + +[![asciicast](https://asciinema.org/a/538143.svg)](https://asciinema.org/a/538143) + +- Pagination asciicast (too big, 12 MB decompressed) + - [nvd-pagenation.json.txt](https://github.com/intel/dffml/files/10023980/nvd-pagenation.json.txt) + +```console +$ unxz -d < $(ls ~/asciinema/fedora-rec-* | tail -n 1) | dd if=/dev/stdin of=/dev/null status=progress +24117+1 records in +24117+1 records out +12348069 bytes (12 MB, 12 MiB) copied, 0.0500872 s, 247 MB/s +``` + +- Basic server seems to be working for the v1 API +- Added CLI command `alice threats vulns serve nvdstyle` + - https://github.com/intel/dffml/commit/cb2c09ead795ba0046cb5911bcd6e939419058d8 + +https://github.com/intel/dffml/blob/4101595a800e74f57cec5537ea2c65680135b71a/entities/alice/alice/threats/vulns/serve/nvdstyle.py#L1-L241 + +- 
https://www.darkreading.com/dr-tech/cybersecurity-nutrition-labels-still-a-work-in-progress + - https://www.whitehouse.gov/briefing-room/statements-releases/2022/10/20/statement-by-nsc-spokesperson-adrienne-watson-on-the-biden-harris-administrations-effort-to-secure-household-internet-enabled-devices/ + - > Yesterday, the White House convened leaders from the private sector, academic institutions, and the U.S. Government to advance a national cybersecurity labeling program for Internet-of-Things (IoT) devices. The Biden-Harris Administration has made it a priority to strengthen our nation’s cybersecurity, and a key part of that effort is ensuring the devices that have become a commonplace in the average American household – like baby monitors or smart home appliances – are protected from cyber threats. A labeling program to secure such devices would provide American consumers with the peace of mind that the technology being brought into their homes is safe, and incentivize manufacturers to meet higher cybersecurity standards and retailers to market secure devices. + > + > Yesterday’s dialogue focused on how to best implement a national cybersecurity labeling program, drive improved security standards for Internet-enabled devices, and generate a globally recognized label. Government and industry leaders discussed the importance of a trusted program to increase security across consumer devices that connect to the Internet by equipping devices with easily recognized labels to help consumers make more informed cybersecurity choices (e.g., an “EnergyStar” for cyber). These conversations build on the foundational work that has been pioneered by the private sector and the National Institute of Standards and Technology (NIST) to help build more secure Internet-connected devices. 
It also follows President Biden’s Executive Order on Improving the Nation’s Cybersecurity, which highlighted the need for improved IoT security and tasked NIST, in partnership with the Federal Trade Commission, to advance improved cybersecurity standards and standardized product labels for these devices. + - Related: `$ grep DNA` +- https://csrc.nist.gov/publications/detail/white-paper/2022/11/09/implementing-a-risk-based-approach-to-devsecops/final + - > DevOps brings together software development and operations to shorten development cycles, allow organizations to be agile, and maintain the pace of innovation while taking advantage of cloud-native technology and practices. Industry and government have fully embraced and are rapidly implementing these practices to develop and deploy software in operational environments, often without a full understanding and consideration of security. Also, most software today relies on one or more third-party components, yet organizations often have little or no visibility into and understanding of how these components are developed, integrated, deployed, and maintained, as well as the practices used to ensure the components’ security. To help improve the security of DevOps practices, the NCCoE is planning a DevSecOps project that will focus initially on developing and documenting an applied risk-based approach and recommendations for secure DevOps and software supply chain practices consistent with the Secure Software Development Framework (SSDF), Cybersecurity Supply Chain Risk Management (C-SCRM), and other NIST, government, and industry guidance. This project will apply these DevSecOps practices in proof-of-concept use case scenarios that will each be specific to a technology, programming language, and industry sector. Both closed source (proprietary) and open source technology will be used to demonstrate the use cases. This project will result in a freely available NIST Cybersecurity Practice Guide. 
+- https://www.intel.com/content/www/us/en/newsroom/news/2022-intel-innovation-day-2-livestream-replay.html#gs.djq36o + - Similar to the software labeling, with Alice we are trying to cross these streams + - Datasheets for Datasets + - https://arxiv.org/abs/1803.09010 + - > The machine learning community currently has no standardized process for documenting datasets, which can lead to severe consequences in high-stakes domains. To address this gap, we propose datasheets for datasets. In the electronics industry, every component, no matter how simple or complex, is accompanied with a datasheet that describes its operating characteristics, test results, recommended uses, and other information. By analogy, we propose that every dataset be accompanied with a datasheet that documents its motivation, composition, collection process, recommended uses, and so on. Datasheets for datasets will facilitate better communication between dataset creators and dataset consumers, and encourage the machine learning community to prioritize transparency and accountability. 
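The datasheets-for-datasets idea above lends itself to a machine-readable form that could travel alongside a dataset artifact. A minimal sketch in Python; the class and field names here are illustrative, not from any standard schema:

```python
from dataclasses import dataclass, field, asdict

# Hypothetical sketch: a "datasheet" carried alongside a dataset, mirroring
# the sections proposed in the paper (motivation, composition, collection
# process, recommended uses). Not a standardized format.
@dataclass
class Datasheet:
    motivation: str
    composition: str
    collection_process: str
    recommended_uses: list = field(default_factory=list)

    def to_manifest(self) -> dict:
        """Serialize for attachment to a dataset artifact (e.g. as JSON)."""
        return asdict(self)

sheet = Datasheet(
    motivation="Benchmark dependency health metrics",
    composition="One record per repository, fields for community files present",
    collection_process="Collected via data flow operations, cached with provenance",
    recommended_uses=["training", "policy evaluation"],
)
print(sheet.to_manifest()["recommended_uses"])
```

A manifest like this is the kind of artifact that could itself be registered for provenance, the same way SBOMs and VDRs are discussed below.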
+ +> Slide from Andrew Ng's Intel Innovation 2022 Luminary Keynote +> Source: https://www.intel.com/content/www/us/en/newsroom/news/2022-intel-innovation-day-2-livestream-replay.html#gs.iex8mr +> ![image](https://user-images.githubusercontent.com/5950433/193330714-4bcceea4-4402-468f-82a9-51882939452c.png) + +- Possible alignment with Andrew's "Data-Centric AI" + - is the discipline of systematically engineering the data used to build an AI system + - This is what we're doing with Alice +- Possible alignment with Andrew's "The iterative process of ML development" + - https://github.com/intel/dffml/tree/alice/docs/tutorials/rolling_alice/0000_architecting_alice#entity-analysis-trinity + - Intent / Train model + - Establish correlations between threat model intent and collected data / errors (telemetry or static analysis, policy, failures) + - Dynamic analysis / Improve data + - We tweak the code to make it do different things to see different data. The application of overlays. Think over time. + - Static / Error analysis + - There might be async debug initiated here, but this maps pretty nicely conceptually since we'd think of this as a static process; we already have some errors to analyze if we're at this step. 
+ +![Entity Analysis Trinity](https://user-images.githubusercontent.com/5950433/188203911-3586e1af-a1f6-434a-8a9a-a1795d7a7ca3.svg) + +- Gist for v2 API call cached: https://gist.github.com/pdxjohnny/ab1bf170dce272cecdd317eae55d1174 +- TODO + - [ ] Clean up SCITT OpenSSF Use Case + - https://github.com/pdxjohnny/use-cases/blob/openssf_metrics/openssf_metrics.md + - https://mailarchive.ietf.org/arch/msg/scitt/cxRvcTEUNEhlxE_AJyspdx9y06w/ + - [ ] Get back to Kate + - [ ] SCITT for NVD style feed data + - [ ] Patch CVE Bin Tool to support validation + - See Dick Brooks's email: https://mailarchive.ietf.org/arch/msg/scitt/cxRvcTEUNEhlxE_AJyspdx9y06w/ + - > Ray’s statement: “I can't imagine that you could ask some other + > entity other than the mfr that created the device + > to provide the reference, and attest to it's validity.” + > + > This is also true for software vulnerabilities. Only the software product developer has access to the source code needed to answer the question, “Is my software product vulnerable to exploitation by CVE-XYZ?” + > + > This is what a NIST VDR provides – a vulnerability disclosure report from a software owner to a customer indicating the vulnerability status of their product at the SBOM component level; + > - https://energycentral.com/c/pip/what-nist-sbom-vulnerability-disclosure-report-vdr + > + > Software vendors provide links to attestations using a Vendor Response File (VRF), which is yet another artifact that needs to be checked for trustworthiness: + > + > - https://energycentral.com/c/pip/advice-software-vendors-prepare-omb-m-22-18-requirements + > + > The VDR and VRF are both considered artifacts, which the author is making a statement of trustworthiness, that needs to be vetted by a trusted party, resulting in a claim that gets placed into a trusted registry becoming a “transparent claim” in a SCITT registry. 
+ > + > A consumer should be able to query the trustworthiness of the VDR and VRF artifacts using a SCITT Transparency Service, having nothing more than the original VDR and VRF artifacts in their possession. + - SCITT is awesome because it supports this offline verification + which is important for us with Alice because we will be running + in parallel/concurrently across many instances of her. These will + sometimes compute fully offline (offline RL?). Therefore we want to + be able to check validity of data before handing off to EDEN nodes + which might lose connection. This enables them to verify offline + the data pushed to update their cache. This allows entities to act in + accordance with strategic principles by validating data on entry, + producing receipts offline, and then rejoining those to the other + nodes receiving those input streams. They need to have these offline + receipts when they produce receipts for new input to maintain provenance + chains (collecting data for inference within a flow running across multiple + EDEN nodes doing active learning based on perceived trustworthiness of inputs). + - [ ] Buy fully working mouse + - [ ] Buy mousepad + - [ ] Practice on ergonomic keyboard + - [ ] gif of AOE1 install building for github.com/pdxjohnny/pdxjohnny/README.md + - [ ] Communicate to Alice she MUST stop creating double issues with todos command + - Fix the bug + - [ ] SBOM, VEX, etc. feeds to ActivityPub, websub, RSS, web5 (ATP Data Repositories or if W3C or DIF has something) + - [ ] Rebuild on trigger +- Future + - [ ] Auto sync asciinema recs / stream to https://github.com/asciinema/asciinema-server + - [ ] Conversion to SBOM, VEX, etc. feeds + - [ ] Coder demo / templates + - Workspace / template as server + - [ ] Pull request Atuin to not change the way the up arrow works + - [ ] Respond to https://mailarchive.ietf.org/arch/msg/scitt/fg6_z2HauVl5d6mklUnMQivE57Y/ + and see if we can collaborate. 
+ - [ ] Auto sync Atuin https://github.com/ellie/atuin/blob/main/docs/server.md + - [ ] Conversion to SBOM, VEX, etc. feeds + - [ ] Coder demo / templates + - Workspace / template as server \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0088/reply_0001.md b/docs/discussions/alice_engineering_comms/0088/reply_0001.md new file mode 100644 index 0000000000..96a6e97083 --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0088/reply_0001.md @@ -0,0 +1,22 @@ +## 2022-11-16 Portland Linux Kernel November meetup + +- https://www.meetup.com/portland-linux-kernel-meetup/events/289592627/ +- Talked to Andy most of the time (x86, kvm nested) + - Asked him what he's excited about + - He's stoked on profiling and perf counters, good stuff to be stoked on. + - Mentioned ptrace, instruction count per cycle I think, can't quite remember. + - Told him we'll circle back once we get to retriggering for regressions. +- Semantic grep +- https://www.kernel.org/doc/html/v6.0/dev-tools/coccinelle.html + - Idea is to infer what the input to coccinelle is (figure out appropriate semantic patch) +- Gave example of three developers working on different branches in different repos. + Yes we aren't supposed to have long lived feature branches, but if you have three + short lived dev branches you're still here. + - Alice works in the background constantly trying to find the "state of the art" + for the combination of those branches. + - Alice is always trying to ensure you're working off the context local dynamic + state of the art, LIVE at HEAD for decentralized development. + - Git allows your source control to be decentralized but this allows you + to take full advantage of that, grep A/B testing rebase cherry-pick all + permutations (how dataflows already call operations, grep for food / recipe + example). 
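The three-branch idea above can be sketched as a search over integration orders. A minimal illustration (branch names and the grading step are hypothetical) of enumerating the permutations Alice could trial-merge in the background to find the context-local state of the art:

```python
from itertools import permutations

# Hypothetical sketch: given short-lived dev branches from three developers,
# enumerate every integration order Alice might trial in the background.
# Branch names are made up for illustration.
branches = ["alice/feature-a", "bob/feature-b", "chaos/feature-c"]

candidate_orders = [list(order) for order in permutations(branches)]
for order in candidate_orders:
    # In a real implementation each order would be trial-rebased/merged in a
    # scratch worktree and graded (build, tests, grep-style A/B checks).
    print(" -> ".join(order))

print(len(candidate_orders))  # 3! = 6 orders to evaluate
```

Each graded order is just another data flow execution, so the existing caching and provenance machinery could score candidates without re-running unchanged steps.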
\ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0089/index.md b/docs/discussions/alice_engineering_comms/0089/index.md new file mode 100644 index 0000000000..3eb11a11c0 --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0089/index.md @@ -0,0 +1 @@ +# 2022-11-17 Engineering Logs \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0089/reply_0000.md b/docs/discussions/alice_engineering_comms/0089/reply_0000.md new file mode 100644 index 0000000000..5fe7c44b63 --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0089/reply_0000.md @@ -0,0 +1,384 @@ +## 2022-11-17 @pdxjohnny Engineering Logs + +- Verifiable Credentials + - https://verite.id/verite/appendix/primer + - https://github.com/uport-project/veramo +- OIDC + - https://docs.github.com/en/actions/deployment/security-hardening-your-deployments/about-security-hardening-with-openid-connect#getting-started-with-oidc +- docs/arch/alice/discussion/0001/reply_0007.md BJJ analogy, land in Coach Alice? +- Alignment + - GSoC rubric as way of grading proposed compute contract / + engagement / manifest (instance) / work item / GitHub issue / work. + - https://dffml.github.io/dffml-pre-image-removal/contributing/gsoc/rubric.html + +![dffml-gsoc-grading-rubric-table](https://user-images.githubusercontent.com/5950433/202493540-90b52a01-337a-4098-a102-021fe338372d.png) + +https://github.com/intel/dffml/blob/3530ee0d20d1062605f82d1f5055f455f8c2c68f/docs/contributing/gsoc/rubric.rst#L1-L134 + +- This thread stopped working / loading on my phone :( + - Light laptop also apparently crumbling under weight of GitHub rendered thread +- Thread needs to become something VEX/SBOM/WEB3/5 soon + - Very soon this is unusable; one thing's fixed (Linux PC) and another thing breaks + the thread. Such is the life of those of Chaos. +- PWA with root of trust as brave wallet? 
+ - Offline sync of data with provenance by local SCITT with root of trust to brave wallet. + - See "SCITT for NVD style feed data" children/downstream(links)/sub-bullet points (trying to figure out most ergonomic wording, child parent is antiquated/not descriptive enough (it's a one to many when looking from bulletpoint item at ancestry, tree, knowledge graph, links) with online cloning so we need to keep thinking) [2022-11-16 @pdxjohnny Engineering Logs](https://github.com/intel/dffml/discussions/1406?sort=new#discussioncomment-4157129) + - https://github.com/pdxjohnny/use-cases/blob/openssf_metrics/openssf_metrics.md + - > As a follow on to the OpenSSF Metrics use case document and [Living Threat Models are better than Dead Threat Models](https://www.youtube.com/watch?v=TMlC_iAK3Rg&list=PLtzAOVTpO2jYt71umwc-ze6OmwwCIMnLw) [Rolling Alice: Volume 1: Coach Alice: Chapter 1: Down the Dependency Rabbit-Hole Again](https://github.com/intel/dffml/blob/alice/docs/tutorials/rolling_alice/0001_coach_alice/0001_down_the_dependency_rabbit_hole_again.md) will cover how we identify and query provenance on dependencies where caching on data flow execution is assisted via querying public SCITT infrastructure and sourcing cached state from trustworthy parties. 
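The offline verification story above boils down to: a consumer who has pinned a trusted registry checkpoint can check that a claim is included in the log without network access. A rough conceptual sketch of that check using a plain SHA-256 Merkle tree (this is not the actual COSE-based SCITT receipt format, just the inclusion-proof idea behind it):

```python
import hashlib

def h(data: bytes) -> bytes:
    return hashlib.sha256(data).digest()

def merkle_root(leaves):
    """Root of a simple binary Merkle tree (odd trailing nodes promoted)."""
    level = [h(leaf) for leaf in leaves]
    while len(level) > 1:
        nxt = [h(level[i] + level[i + 1]) for i in range(0, len(level) - 1, 2)]
        if len(level) % 2:
            nxt.append(level[-1])
        level = nxt
    return level[0]

def inclusion_proof(leaves, index):
    """Audit path (sibling hash + side) for leaves[index]."""
    level = [h(leaf) for leaf in leaves]
    path = []
    while len(level) > 1:
        sib = index ^ 1
        if sib < len(level):
            path.append((level[sib], "left" if sib < index else "right"))
        nxt = [h(level[i] + level[i + 1]) for i in range(0, len(level) - 1, 2)]
        if len(level) % 2:
            nxt.append(level[-1])
        level, index = nxt, index // 2
    return path

def verify_offline(claim: bytes, path, trusted_root: bytes) -> bool:
    """Recompute the root from claim + audit path; no network needed."""
    node = h(claim)
    for sibling, side in path:
        node = h(sibling + node) if side == "left" else h(node + sibling)
    return node == trusted_root

claims = [b"VDR for product X", b"VRF for product X", b"SBOM for product X"]
root = merkle_root(claims)          # fetched while online, pinned locally
proof = inclusion_proof(claims, 1)  # carried alongside the artifact
print(verify_offline(claims[1], proof, root))  # True
```

An EDEN node that loses connectivity could keep validating cached feed data this way, then rejoin and submit its own receipts to maintain the provenance chain.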
+ +```console +$ dffml service dev export -configloader json alice.cli:AlicePleaseLogTodosCLIDataFlow | tee logtodos.json && (echo '```mermaid' && dffml dataflow diagram logtodos.json && echo '```') | gh gist create -f "LOG_TODOS_DATAFLOW_DIAGRAM.md" -` +``` + +**alice.cli:AlicePleaseLogTodosCLIDataFlow** + +```mermaid +graph TD +subgraph a759a07029077edc5c37fea0326fa281[Processing Stage] +style a759a07029077edc5c37fea0326fa281 fill:#afd388b5,stroke:#a4ca7a +subgraph d9f2c7ced7f00879629c15363c8e307d[alice.please.log.todos.todos.AlicePleaseLogTodosDataFlow:guess_repo_string_is_url] +style d9f2c7ced7f00879629c15363c8e307d fill:#fff4de,stroke:#cece71 +37178be7db9283b44a1786fef58ffa8d[alice.please.log.todos.todos.AlicePleaseLogTodosDataFlow:guess_repo_string_is_url] +5c7743e872c165030dcf051c712106fc(repo_string) +5c7743e872c165030dcf051c712106fc --> 37178be7db9283b44a1786fef58ffa8d +8d32e3f614b2c8f9d23e7469eaa1da12(result) +37178be7db9283b44a1786fef58ffa8d --> 8d32e3f614b2c8f9d23e7469eaa1da12 +end +subgraph ed8e05e445eabbcfc1a201e580b1371e[alice.please.log.todos.todos.AlicePleaseLogTodosDataFlow:guessed_repo_string_is_operations_git_url] +style ed8e05e445eabbcfc1a201e580b1371e fill:#fff4de,stroke:#cece71 +f129d360149fb01bbfe1ed8c2f9bbaa2[alice.please.log.todos.todos.AlicePleaseLogTodosDataFlow:guessed_repo_string_is_operations_git_url] +77a8695545cb64a7becb9f50343594c3(repo_url) +77a8695545cb64a7becb9f50343594c3 --> f129d360149fb01bbfe1ed8c2f9bbaa2 +d259a05785074877b9509ed686e03b3a(result) +f129d360149fb01bbfe1ed8c2f9bbaa2 --> d259a05785074877b9509ed686e03b3a +end +subgraph 0fb0b360e14eb7776112a5eaff5252de[alice.please.log.todos.todos.OverlayCLI:cli_has_repos] +style 0fb0b360e14eb7776112a5eaff5252de fill:#fff4de,stroke:#cece71 +81202a774dfaa2c4d640d25b4d6c0e55[alice.please.log.todos.todos.OverlayCLI:cli_has_repos] +7ba42765e6fba6206fd3d0d7906f6bf3(cmd) +7ba42765e6fba6206fd3d0d7906f6bf3 --> 81202a774dfaa2c4d640d25b4d6c0e55 +904eb6737636f1d32a6d890f449e9081(result) 
+81202a774dfaa2c4d640d25b4d6c0e55 --> 904eb6737636f1d32a6d890f449e9081 +end +subgraph 964c0fbc5f3a43fce3f0d9f0aed08981[alice.please.log.todos.todos.OverlayCLI:cli_is_meant_on_this_repo] +style 964c0fbc5f3a43fce3f0d9f0aed08981 fill:#fff4de,stroke:#cece71 +b96195c439c96fa7bb4a2d616bbe47c5[alice.please.log.todos.todos.OverlayCLI:cli_is_meant_on_this_repo] +2a071a453a1e677a127cee9775d0fd9f(cmd) +2a071a453a1e677a127cee9775d0fd9f --> b96195c439c96fa7bb4a2d616bbe47c5 +f6bfde5eece6eb52bb4b4a3dbc945d9f(result) +b96195c439c96fa7bb4a2d616bbe47c5 --> f6bfde5eece6eb52bb4b4a3dbc945d9f +end +subgraph 2e2e8520e9f9420ffa9e54ea29965019[alice.please.log.todos.todos.OverlayCLI:cli_run_on_repo] +style 2e2e8520e9f9420ffa9e54ea29965019 fill:#fff4de,stroke:#cece71 +f60739d83ceeff1b44a23a6c1be4e92c[alice.please.log.todos.todos.OverlayCLI:cli_run_on_repo] +0ac5645342c7e58f9c227a469d90242e(repo) +0ac5645342c7e58f9c227a469d90242e --> f60739d83ceeff1b44a23a6c1be4e92c +6e82a330ad9fcc12d0ad027136fc3732(result) +f60739d83ceeff1b44a23a6c1be4e92c --> 6e82a330ad9fcc12d0ad027136fc3732 +end +subgraph 49130011bcac425879a677c5486ff0cc[alice.please.log.todos.todos:gh_issue_create_code_of_conduct] +style 49130011bcac425879a677c5486ff0cc fill:#fff4de,stroke:#cece71 +31c8b817615cfd43254dba99ea2586cb[alice.please.log.todos.todos:gh_issue_create_code_of_conduct] +5066ca1af8926ae2c081d71233288d58(body) +5066ca1af8926ae2c081d71233288d58 --> 31c8b817615cfd43254dba99ea2586cb +a429b8b3ec4b6cd90e9c697a3330b012(file_present) +a429b8b3ec4b6cd90e9c697a3330b012 --> 31c8b817615cfd43254dba99ea2586cb +ccd02a25d1ee7e94729a758b676b7050(repo) +ccd02a25d1ee7e94729a758b676b7050 --> 31c8b817615cfd43254dba99ea2586cb +abe38e44e9660841c1abe25ec6ba5ff3(title) +abe38e44e9660841c1abe25ec6ba5ff3 --> 31c8b817615cfd43254dba99ea2586cb +c704cbd635083d06f8d11109ded0597d(issue_url) +31c8b817615cfd43254dba99ea2586cb --> c704cbd635083d06f8d11109ded0597d +end +subgraph 
4613afaf00bf0fb8f861ba8a80e664bc[alice.please.log.todos.todos:gh_issue_create_contributing] +style 4613afaf00bf0fb8f861ba8a80e664bc fill:#fff4de,stroke:#cece71 +a243f5b589a38383012170167e99bee9[alice.please.log.todos.todos:gh_issue_create_contributing] +e891bc5f6cc73351082f3f93b486d702(body) +e891bc5f6cc73351082f3f93b486d702 --> a243f5b589a38383012170167e99bee9 +633e21066f9a79ca7a0c580486d1a9e9(file_present) +633e21066f9a79ca7a0c580486d1a9e9 --> a243f5b589a38383012170167e99bee9 +4aaa89e2af6f5c3bc457139808c7cecb(repo) +4aaa89e2af6f5c3bc457139808c7cecb --> a243f5b589a38383012170167e99bee9 +baa9fd440df8cd74a8e3e987077068fd(title) +baa9fd440df8cd74a8e3e987077068fd --> a243f5b589a38383012170167e99bee9 +c672fc455bc58d3fe05f0af332cfb8f2(issue_url) +a243f5b589a38383012170167e99bee9 --> c672fc455bc58d3fe05f0af332cfb8f2 +end +subgraph 7772f7447cabfad14065ddf1ad712a0f[alice.please.log.todos.todos:gh_issue_create_readme] +style 7772f7447cabfad14065ddf1ad712a0f fill:#fff4de,stroke:#cece71 +90c6b15432ca7a4081208f659e5c809b[alice.please.log.todos.todos:gh_issue_create_readme] +df9081024c299071492b0f54df68ee10(body) +df9081024c299071492b0f54df68ee10 --> 90c6b15432ca7a4081208f659e5c809b +a3a402edf5e037041b2cc3714d9a6970(file_present) +a3a402edf5e037041b2cc3714d9a6970 --> 90c6b15432ca7a4081208f659e5c809b +3eabfefcbc7ad816c89a983dcfebb66e(repo) +3eabfefcbc7ad816c89a983dcfebb66e --> 90c6b15432ca7a4081208f659e5c809b +78e47e381d0a2d2aba099b60a43d59b7(title) +78e47e381d0a2d2aba099b60a43d59b7 --> 90c6b15432ca7a4081208f659e5c809b +ab4cc56bd2c79c32bec4c6e1cbdea717(issue_url) +90c6b15432ca7a4081208f659e5c809b --> ab4cc56bd2c79c32bec4c6e1cbdea717 +end +subgraph 259dd82d03b72e83f5594fb70e224c7d[alice.please.log.todos.todos:gh_issue_create_security] +style 259dd82d03b72e83f5594fb70e224c7d fill:#fff4de,stroke:#cece71 +157d90c800047d63c2e9fbc994007c0b[alice.please.log.todos.todos:gh_issue_create_security] +a20e86e85c1ec2f0340182025acfa192(body) +a20e86e85c1ec2f0340182025acfa192 --> 
157d90c800047d63c2e9fbc994007c0b +1195a910ea74b27c6eba7a58c13810dc(file_present) +1195a910ea74b27c6eba7a58c13810dc --> 157d90c800047d63c2e9fbc994007c0b +24e86931fc4eb531ba30a1457b5844a2(repo) +24e86931fc4eb531ba30a1457b5844a2 --> 157d90c800047d63c2e9fbc994007c0b +596eedb0a320d0a1549018637df28b39(title) +596eedb0a320d0a1549018637df28b39 --> 157d90c800047d63c2e9fbc994007c0b +106ceb5a00f7f2d8cb56bfea7dd69137(issue_url) +157d90c800047d63c2e9fbc994007c0b --> 106ceb5a00f7f2d8cb56bfea7dd69137 +end +subgraph b8e0594907ccea754b3030ffc4bdc3fc[alice.please.log.todos.todos:gh_issue_create_support] +style b8e0594907ccea754b3030ffc4bdc3fc fill:#fff4de,stroke:#cece71 +6aeac86facce63760e4a81b604cfab0b[alice.please.log.todos.todos:gh_issue_create_support] +18f9a62bdd22ede12d6ea5eac5490ff2(body) +18f9a62bdd22ede12d6ea5eac5490ff2 --> 6aeac86facce63760e4a81b604cfab0b +dace6da55abe2ab1c5c9a0ced2f6833d(file_present) +dace6da55abe2ab1c5c9a0ced2f6833d --> 6aeac86facce63760e4a81b604cfab0b +d2a58f644d7427227cefd56492dfcef9(repo) +d2a58f644d7427227cefd56492dfcef9 --> 6aeac86facce63760e4a81b604cfab0b +9ba4bcdc22dcbab276f68288bfb4d0b1(title) +9ba4bcdc22dcbab276f68288bfb4d0b1 --> 6aeac86facce63760e4a81b604cfab0b +7f2eb20bcd650dc00cde5ca0355b578f(issue_url) +6aeac86facce63760e4a81b604cfab0b --> 7f2eb20bcd650dc00cde5ca0355b578f +end +subgraph cd002409ac60a3eea12f2139f2743c52[alice.please.log.todos.todos:git_repo_to_git_repository_checked_out] +style cd002409ac60a3eea12f2139f2743c52 fill:#fff4de,stroke:#cece71 +e58ba0b1a7efba87321e9493d340767b[alice.please.log.todos.todos:git_repo_to_git_repository_checked_out] +00a9f6e30ea749940657f87ef0a1f7c8(repo) +00a9f6e30ea749940657f87ef0a1f7c8 --> e58ba0b1a7efba87321e9493d340767b +bb1abf628d6e8985c49381642959143b(repo) +e58ba0b1a7efba87321e9493d340767b --> bb1abf628d6e8985c49381642959143b +end +subgraph d3ec0ac85209a7256c89d20f758f09f4[check_if_valid_git_repository_URL] +style d3ec0ac85209a7256c89d20f758f09f4 fill:#fff4de,stroke:#cece71 
+f577c71443f6b04596b3fe0511326c40[check_if_valid_git_repository_URL] +7440e73a8e8f864097f42162b74f2762(URL) +7440e73a8e8f864097f42162b74f2762 --> f577c71443f6b04596b3fe0511326c40 +8e39b501b41c5d0e4596318f80a03210(valid) +f577c71443f6b04596b3fe0511326c40 --> 8e39b501b41c5d0e4596318f80a03210 +end +subgraph af8da22d1318d911f29b95e687f87c5d[clone_git_repo] +style af8da22d1318d911f29b95e687f87c5d fill:#fff4de,stroke:#cece71 +155b8fdb5524f6bfd5adbae4940ad8d5[clone_git_repo] +eed77b9eea541e0c378c67395351099c(URL) +eed77b9eea541e0c378c67395351099c --> 155b8fdb5524f6bfd5adbae4940ad8d5 +8b5928cd265dd2c44d67d076f60c8b05(ssh_key) +8b5928cd265dd2c44d67d076f60c8b05 --> 155b8fdb5524f6bfd5adbae4940ad8d5 +4e1d5ea96e050e46ebf95ebc0713d54c(repo) +155b8fdb5524f6bfd5adbae4940ad8d5 --> 4e1d5ea96e050e46ebf95ebc0713d54c +6a44de06a4a3518b939b27c790f6cdce{valid_git_repository_URL} +6a44de06a4a3518b939b27c790f6cdce --> 155b8fdb5524f6bfd5adbae4940ad8d5 +end +subgraph 98179e1c9444a758d9565431f371b232[dffml_operations_innersource.operations:code_of_conduct_present] +style 98179e1c9444a758d9565431f371b232 fill:#fff4de,stroke:#cece71 +fb772128fdc785ce816c73128e0afd4d[dffml_operations_innersource.operations:code_of_conduct_present] +f333b126c62bdbf832dddf105278d218(repo) +f333b126c62bdbf832dddf105278d218 --> fb772128fdc785ce816c73128e0afd4d +1233aac886e50641252dcad2124003c9(result) +fb772128fdc785ce816c73128e0afd4d --> 1233aac886e50641252dcad2124003c9 +end +subgraph d03657cbeff4a7501071526c5227d605[dffml_operations_innersource.operations:contributing_present] +style d03657cbeff4a7501071526c5227d605 fill:#fff4de,stroke:#cece71 +8da2c8a3eddf27e38838c8b6a2cd4ad1[dffml_operations_innersource.operations:contributing_present] +2a1ae8bcc9add3c42e071d0557e98b1c(repo) +2a1ae8bcc9add3c42e071d0557e98b1c --> 8da2c8a3eddf27e38838c8b6a2cd4ad1 +52544c54f59ff4838d42ba3472b02589(result) +8da2c8a3eddf27e38838c8b6a2cd4ad1 --> 52544c54f59ff4838d42ba3472b02589 +end +subgraph 
3ab6f933ff2c5d1c31f5acce50ace507[dffml_operations_innersource.operations:readme_present] +style 3ab6f933ff2c5d1c31f5acce50ace507 fill:#fff4de,stroke:#cece71 +ae6634d141e4d989b0f53fd3b849b101[dffml_operations_innersource.operations:readme_present] +4d289d268d52d6fb5795893363300585(repo) +4d289d268d52d6fb5795893363300585 --> ae6634d141e4d989b0f53fd3b849b101 +65fd35d17d8a7e96c9f7e6aaedb75e3c(result) +ae6634d141e4d989b0f53fd3b849b101 --> 65fd35d17d8a7e96c9f7e6aaedb75e3c +end +subgraph da39b149b9fed20f273450b47a0b65f4[dffml_operations_innersource.operations:security_present] +style da39b149b9fed20f273450b47a0b65f4 fill:#fff4de,stroke:#cece71 +c8921544f4665e73080cb487aef7de94[dffml_operations_innersource.operations:security_present] +e682bbcfad20caaab15e4220c81e9239(repo) +e682bbcfad20caaab15e4220c81e9239 --> c8921544f4665e73080cb487aef7de94 +5d69c4e5b3601abbd692ade806dcdf5f(result) +c8921544f4665e73080cb487aef7de94 --> 5d69c4e5b3601abbd692ade806dcdf5f +end +subgraph 062b8882104862540d584516edc60008[dffml_operations_innersource.operations:support_present] +style 062b8882104862540d584516edc60008 fill:#fff4de,stroke:#cece71 +5cc75c20aee40e815abf96726508b66d[dffml_operations_innersource.operations:support_present] +f0e4cd91ca4f6b278478180a188a2f5f(repo) +f0e4cd91ca4f6b278478180a188a2f5f --> 5cc75c20aee40e815abf96726508b66d +46bd597a57e034f669df18ac9ae0a153(result) +5cc75c20aee40e815abf96726508b66d --> 46bd597a57e034f669df18ac9ae0a153 +end +subgraph 55a339b2b9140e7d9c3448e706288e6e[operations.innersource.dffml_operations_innersource.cli:github_repo_id_to_clone_url] +style 55a339b2b9140e7d9c3448e706288e6e fill:#fff4de,stroke:#cece71 +e90587117185b90364bd54700bfd4e3b[operations.innersource.dffml_operations_innersource.cli:github_repo_id_to_clone_url] +725810a22f04a3ff620021588233815f(repo_id) +725810a22f04a3ff620021588233815f --> e90587117185b90364bd54700bfd4e3b +d2ee13433e404b6ef59d0f0344e28e2f(result) +e90587117185b90364bd54700bfd4e3b --> d2ee13433e404b6ef59d0f0344e28e2f 
+end +end +subgraph a4827add25f5c7d5895c5728b74e2beb[Cleanup Stage] +style a4827add25f5c7d5895c5728b74e2beb fill:#afd388b5,stroke:#a4ca7a +end +subgraph 58ca4d24d2767176f196436c2890b926[Output Stage] +style 58ca4d24d2767176f196436c2890b926 fill:#afd388b5,stroke:#a4ca7a +end +subgraph inputs[Inputs] +style inputs fill:#f6dbf9,stroke:#a178ca +6e82a330ad9fcc12d0ad027136fc3732 --> 5c7743e872c165030dcf051c712106fc +8d32e3f614b2c8f9d23e7469eaa1da12 --> 77a8695545cb64a7becb9f50343594c3 +128516cfa09b0383023eab52ee24878a(seed
dffml.util.cli.CMD) +128516cfa09b0383023eab52ee24878a --> 7ba42765e6fba6206fd3d0d7906f6bf3 +128516cfa09b0383023eab52ee24878a(seed
dffml.util.cli.CMD) +128516cfa09b0383023eab52ee24878a --> 2a071a453a1e677a127cee9775d0fd9f +904eb6737636f1d32a6d890f449e9081 --> 0ac5645342c7e58f9c227a469d90242e +f6bfde5eece6eb52bb4b4a3dbc945d9f --> 0ac5645342c7e58f9c227a469d90242e +25d4e646671f80ac105f05de50445ba5(seed
CodeOfConductIssueBody) +25d4e646671f80ac105f05de50445ba5 --> 5066ca1af8926ae2c081d71233288d58 +1233aac886e50641252dcad2124003c9 --> a429b8b3ec4b6cd90e9c697a3330b012 +bb1abf628d6e8985c49381642959143b --> ccd02a25d1ee7e94729a758b676b7050 +44ec56a4fd4b5eea9c8523dcb881d2d1(seed
CodeOfConductIssueTitle) +44ec56a4fd4b5eea9c8523dcb881d2d1 --> abe38e44e9660841c1abe25ec6ba5ff3 +c94383981c3a071b8c3df7293c8c7c92(seed
ContributingIssueBody) +c94383981c3a071b8c3df7293c8c7c92 --> e891bc5f6cc73351082f3f93b486d702 +52544c54f59ff4838d42ba3472b02589 --> 633e21066f9a79ca7a0c580486d1a9e9 +bb1abf628d6e8985c49381642959143b --> 4aaa89e2af6f5c3bc457139808c7cecb +90c6a88275f27b28dc12f5741ac1652f(seed
ContributingIssueTitle) +90c6a88275f27b28dc12f5741ac1652f --> baa9fd440df8cd74a8e3e987077068fd +1daacccd02f8117e67ad3cb8686a732c(seed
ReadmeIssueBody) +1daacccd02f8117e67ad3cb8686a732c --> df9081024c299071492b0f54df68ee10 +65fd35d17d8a7e96c9f7e6aaedb75e3c --> a3a402edf5e037041b2cc3714d9a6970 +bb1abf628d6e8985c49381642959143b --> 3eabfefcbc7ad816c89a983dcfebb66e +0c1ab2d4bda10e1083557833ae5c5da4(seed
ReadmeIssueTitle) +0c1ab2d4bda10e1083557833ae5c5da4 --> 78e47e381d0a2d2aba099b60a43d59b7 +b076a6070cf7626bccd630198450637c(seed
SecurityIssueBody) +b076a6070cf7626bccd630198450637c --> a20e86e85c1ec2f0340182025acfa192 +5d69c4e5b3601abbd692ade806dcdf5f --> 1195a910ea74b27c6eba7a58c13810dc +bb1abf628d6e8985c49381642959143b --> 24e86931fc4eb531ba30a1457b5844a2 +d734943b101c6e465df8c4cabe9b872e(seed
SecurityIssueTitle) +d734943b101c6e465df8c4cabe9b872e --> 596eedb0a320d0a1549018637df28b39 +a7f3a4f2059bb4b3c170322febb4e93f(seed
SupportIssueBody) +a7f3a4f2059bb4b3c170322febb4e93f --> 18f9a62bdd22ede12d6ea5eac5490ff2 +46bd597a57e034f669df18ac9ae0a153 --> dace6da55abe2ab1c5c9a0ced2f6833d +bb1abf628d6e8985c49381642959143b --> d2a58f644d7427227cefd56492dfcef9 +2ae304b14108a13de9dfa57f1e77cc2f(seed
SupportIssueTitle) +2ae304b14108a13de9dfa57f1e77cc2f --> 9ba4bcdc22dcbab276f68288bfb4d0b1 +4e1d5ea96e050e46ebf95ebc0713d54c --> 00a9f6e30ea749940657f87ef0a1f7c8 +d259a05785074877b9509ed686e03b3a --> 7440e73a8e8f864097f42162b74f2762 +d2ee13433e404b6ef59d0f0344e28e2f --> 7440e73a8e8f864097f42162b74f2762 +d259a05785074877b9509ed686e03b3a --> eed77b9eea541e0c378c67395351099c +d2ee13433e404b6ef59d0f0344e28e2f --> eed77b9eea541e0c378c67395351099c +a6ed501edbf561fda49a0a0a3ca310f0(seed
git_repo_ssh_key) +a6ed501edbf561fda49a0a0a3ca310f0 --> 8b5928cd265dd2c44d67d076f60c8b05 +8e39b501b41c5d0e4596318f80a03210 --> 6a44de06a4a3518b939b27c790f6cdce +bb1abf628d6e8985c49381642959143b --> f333b126c62bdbf832dddf105278d218 +bb1abf628d6e8985c49381642959143b --> 2a1ae8bcc9add3c42e071d0557e98b1c +bb1abf628d6e8985c49381642959143b --> 4d289d268d52d6fb5795893363300585 +bb1abf628d6e8985c49381642959143b --> e682bbcfad20caaab15e4220c81e9239 +bb1abf628d6e8985c49381642959143b --> f0e4cd91ca4f6b278478180a188a2f5f +090b151d70cc5b37562b42c64cb16bb0(seed
GitHubRepoID) +090b151d70cc5b37562b42c64cb16bb0 --> 725810a22f04a3ff620021588233815f +end +``` + +- The flow looks fine the way it's wired in the above mermaid diagram + - Guessing it's an issue with `subflow` and the multi-context `run()`. + - HEAD: f61bd161aa738ede314723b6bbb9667449abdd67 + +```console +$ alice please log todos -log debug -keys https://github.com/pdxjohnny/testaaa +$ for repo_url in $(echo https://github.com/pdxjohnny/testaaa); do gh issue list --search "Recommended Community Standard:" -R "${repo_url}" | grep -v '2022-11-05'; done +59 OPEN Recommended Community Standard: SUPPORT 2022-11-17 17:05:08 +0000 UTC +58 OPEN Recommended Community Standard: SECURITY 2022-11-17 17:05:06 +0000 UTC +57 OPEN Recommended Community Standard: README 2022-11-17 17:05:05 +0000 UTC +56 OPEN Recommended Community Standard: CONTRIBUTING 2022-11-17 17:05:04 +0000 UTC +6 OPEN Recommended Community Standard: SUPPORT 2022-11-04 06:33:26 +0000 UTC +5 OPEN Recommended Community Standard: SUPPORT 2022-11-04 06:28:41 +0000 UTC +4 OPEN Recommended Community Standard: SUPPORT 2022-11-04 06:27:42 +0000 UTC +55 OPEN Recommended Community Standard: CODE_OF_CONDUCT 2022-11-17 17:05:02 +0000 UTC +1 OPEN Recommended Community Standard: README 2022-06-25 01:12:18 +0000 UTC +2 OPEN Recommended Community Standards 2022-06-25 01:12:20 +0000 UTC +``` + +- Unclear what's up, going to send and just close duplicates + +```console +$ grep Stage:\ PROCESSING .output.2022-11-16T20:49:13+00:00.txt +DEBUG:dffml.MemoryOperationImplementationNetworkContext:operations.innersource.dffml_operations_innersource.cli:github_repo_id_to_clone_url Stage: PROCESSING: operations.innersource.dffml_operations_innersource.cli:github_repo_id_to_clone_url +DEBUG:dffml.MemoryOperationImplementationNetworkContext:alice.please.log.todos.todos.OverlayCLI:cli_has_repos Stage: PROCESSING: alice.please.log.todos.todos.OverlayCLI:cli_has_repos 
+DEBUG:dffml.MemoryOperationImplementationNetworkContext:alice.please.log.todos.todos.OverlayCLI:cli_is_meant_on_this_repo Stage: PROCESSING: alice.please.log.todos.todos.OverlayCLI:cli_is_meant_on_this_repo +DEBUG:dffml.MemoryOperationImplementationNetworkContext:alice.please.log.todos.todos.OverlayCLI:cli_run_on_repo Stage: PROCESSING: alice.please.log.todos.todos.OverlayCLI:cli_run_on_repo +DEBUG:dffml.MemoryOperationImplementationNetworkContext:alice.please.log.todos.todos.AlicePleaseLogTodosDataFlow:guess_repo_string_is_url Stage: PROCESSING: alice.please.log.todos.todos.AlicePleaseLogTodosDataFlow:guess_repo_string_is_url +DEBUG:dffml.MemoryOperationImplementationNetworkContext:operations.innersource.dffml_operations_innersource.cli:github_repo_id_to_clone_url Stage: PROCESSING: operations.innersource.dffml_operations_innersource.cli:github_repo_id_to_clone_url +DEBUG:dffml.MemoryOperationImplementationNetworkContext:check_if_valid_git_repository_URL Stage: PROCESSING: check_if_valid_git_repository_URL +DEBUG:dffml.MemoryOperationImplementationNetworkContext:check_if_valid_git_repository_URL Stage: PROCESSING: check_if_valid_git_repository_URL +DEBUG:dffml.MemoryOperationImplementationNetworkContext:clone_git_repo Stage: PROCESSING: clone_git_repo +DEBUG:dffml.MemoryOperationImplementationNetworkContext:clone_git_repo Stage: PROCESSING: clone_git_repo +DEBUG:dffml.MemoryOperationImplementationNetworkContext:alice.please.log.todos.todos:git_repo_to_git_repository_checked_out Stage: PROCESSING: alice.please.log.todos.todos:git_repo_to_git_repository_checked_out +DEBUG:dffml.MemoryOperationImplementationNetworkContext:dffml_operations_innersource.operations:code_of_conduct_present Stage: PROCESSING: dffml_operations_innersource.operations:code_of_conduct_present +DEBUG:dffml.MemoryOperationImplementationNetworkContext:dffml_operations_innersource.operations:contributing_present Stage: PROCESSING: dffml_operations_innersource.operations:contributing_present 
+DEBUG:dffml.MemoryOperationImplementationNetworkContext:dffml_operations_innersource.operations:readme_present Stage: PROCESSING: dffml_operations_innersource.operations:readme_present +DEBUG:dffml.MemoryOperationImplementationNetworkContext:dffml_operations_innersource.operations:security_present Stage: PROCESSING: dffml_operations_innersource.operations:security_present +DEBUG:dffml.MemoryOperationImplementationNetworkContext:dffml_operations_innersource.operations:support_present Stage: PROCESSING: dffml_operations_innersource.operations:support_present +DEBUG:dffml.MemoryOperationImplementationNetworkContext:alice.please.log.todos.todos:gh_issue_create_code_of_conduct Stage: PROCESSING: alice.please.log.todos.todos:gh_issue_create_code_of_conduct +DEBUG:dffml.MemoryOperationImplementationNetworkContext:alice.please.log.todos.todos:git_repo_to_git_repository_checked_out Stage: PROCESSING: alice.please.log.todos.todos:git_repo_to_git_repository_checked_out +DEBUG:dffml.MemoryOperationImplementationNetworkContext:dffml_operations_innersource.operations:code_of_conduct_present Stage: PROCESSING: dffml_operations_innersource.operations:code_of_conduct_present +DEBUG:dffml.MemoryOperationImplementationNetworkContext:dffml_operations_innersource.operations:contributing_present Stage: PROCESSING: dffml_operations_innersource.operations:contributing_present +DEBUG:dffml.MemoryOperationImplementationNetworkContext:dffml_operations_innersource.operations:readme_present Stage: PROCESSING: dffml_operations_innersource.operations:readme_present +DEBUG:dffml.MemoryOperationImplementationNetworkContext:dffml_operations_innersource.operations:security_present Stage: PROCESSING: dffml_operations_innersource.operations:security_present +DEBUG:dffml.MemoryOperationImplementationNetworkContext:dffml_operations_innersource.operations:support_present Stage: PROCESSING: dffml_operations_innersource.operations:support_present 
+DEBUG:dffml.MemoryOperationImplementationNetworkContext:alice.please.log.todos.todos:gh_issue_create_code_of_conduct Stage: PROCESSING: alice.please.log.todos.todos:gh_issue_create_code_of_conduct +DEBUG:dffml.MemoryOperationImplementationNetworkContext:alice.please.log.todos.todos:gh_issue_create_contributing Stage: PROCESSING: alice.please.log.todos.todos:gh_issue_create_contributing +DEBUG:dffml.MemoryOperationImplementationNetworkContext:alice.please.log.todos.todos:gh_issue_create_contributing Stage: PROCESSING: alice.please.log.todos.todos:gh_issue_create_contributing +DEBUG:dffml.MemoryOperationImplementationNetworkContext:alice.please.log.todos.todos:gh_issue_create_readme Stage: PROCESSING: alice.please.log.todos.todos:gh_issue_create_readme +DEBUG:dffml.MemoryOperationImplementationNetworkContext:alice.please.log.todos.todos:gh_issue_create_security Stage: PROCESSING: alice.please.log.todos.todos:gh_issue_create_security +DEBUG:dffml.MemoryOperationImplementationNetworkContext:alice.please.log.todos.todos:gh_issue_create_readme Stage: PROCESSING: alice.please.log.todos.todos:gh_issue_create_readme +DEBUG:dffml.MemoryOperationImplementationNetworkContext:alice.please.log.todos.todos:gh_issue_create_security Stage: PROCESSING: alice.please.log.todos.todos:gh_issue_create_security +DEBUG:dffml.MemoryOperationImplementationNetworkContext:alice.please.log.todos.todos:gh_issue_create_support Stage: PROCESSING: alice.please.log.todos.todos:gh_issue_create_support +DEBUG:dffml.MemoryOperationImplementationNetworkContext:alice.please.log.todos.todos:gh_issue_create_support Stage: PROCESSING: alice.please.log.todos.todos:gh_issue_create_support +$ do alice please log todos -log debug -record-def GitHubRepoID -keys "${github_repo_id}" 2>&1 | tee .output.$(date -Iseconds).txt +``` + +- https://github.com/decentralized-identity/credential-manifest/issues/125#issuecomment-1310728595 + - No movement on this yet + - Checked for other signs of life in 
[kimdhamilton](https://github.com/kimdhamilton)'s trains of thought (aka recent activity on GitHub)
+    - https://github.com/centrehq/verite
+      - https://verite.id/verite
+      - Ding ding ding!
+- TODO
+  - [x] Partial left handed mouse day
+    - Back left base of neck headache? Related?
+    - Butterfly keyboard for even a few minutes has made me nauseous, not sure if related.
+  - [ ] Review https://docs.github.com/en/actions/deployment/security-hardening-your-deployments/about-security-hardening-with-openid-connect#getting-started-with-oidc
+    - [ ] Perhaps reuse within OpenSSF metrics doc if the license allows and it would help; unknown, haven't read it yet.
+  - [ ] Prototype infra docs as YAML as overlay with SaaSBOM or OBOM or whatever it was that's applicable
+  - [ ] Review ideas for dev automation dataflows https://github.com/pdxjohnny/pdxjohnny.github.io/commit/328aee6351d3d12f72abe93b5be0bcacea64c3ef and update Alice docs accordingly
+  - [ ] Sync opened tabs to the active shell context, synced to engineering logs
+    - https://developer.chrome.com/docs/extensions/reference/tabs/
+    - https://github.com/pdxjohnny/pdxjohnny.github.io/blob/abfa83255d77eaaf35f92593828ba7a6a7001fb3/content/posts/dev-environment.md?plain=1#L116-L119
+  - [ ] Debug double issue creation
+  - [ ] Log `GraphQL: was submitted too quickly (createIssue)` issues, deal with? Add retry?
+  - [ ] Get back to Elsa with learning methodologies similarity thing, grep?
+  - [ ] Document two devs working together
+    - See poly repo pull model CR0/4 example (which I also talked to Kees about yesterday at the meetup) https://github.com/intel/dffml/issues/1315#issuecomment-1066971630
+  - [ ] Start Vol 4 with whatever was in the notes about it recently, can't remember right now
+  - [x] Matt nodded in relation to SCITT
+  - [x] Marc might pursue matrix manifest approach for Zephyr build to test handoff
+  - [x] Several conversations about CD and manifests
+    - Mentioned #1061
+    - Forgot to mention that there is something related to #1207...
+  - [ ] NVDStyle as first stab at stream of consciousness to find a vuln via cve-bin-tool (mock output if need be to "find" the vuln)
+  - [ ] Trigger rebuild of wheel and push to GitHub releases
+  - [ ] `alice please contribute cicd` to run templating on the GitHub Actions,
+    `workflow_dispatch` style (that calls reusable).
+  - [ ] Do DevCloud demo
+    - https://github.com/intel/dffml/issues/1247
+    - Spin up DevCloud, deploy a GitHub Actions Runner, and hermetic build 🤙 the DFFML main package with manifests and SCITT receipts
+    - `DevCloudOrchestrator`?
\ No newline at end of file
diff --git a/docs/discussions/alice_engineering_comms/0090/index.md b/docs/discussions/alice_engineering_comms/0090/index.md
new file mode 100644
index 0000000000..4c62dd4161
--- /dev/null
+++ b/docs/discussions/alice_engineering_comms/0090/index.md
@@ -0,0 +1 @@
+# 2022-11-18 Engineering Logs
\ No newline at end of file
diff --git a/docs/discussions/alice_engineering_comms/0090/reply_0000.md b/docs/discussions/alice_engineering_comms/0090/reply_0000.md
new file mode 100644
index 0000000000..524307c15d
--- /dev/null
+++ b/docs/discussions/alice_engineering_comms/0090/reply_0000.md
@@ -0,0 +1,274 @@
+## 2022-11-18 @pdxjohnny Engineering Logs
+
+- https://social-embed.git-pull.com/docs/wc/
+  - This looks interesting
+  - https://oembed.com/
+    - > oEmbed is a format for allowing an embedded representation of a URL on third party sites.
The simple API allows a website to display embedded content (such as photos or videos) when a user posts a link to that resource, without having to parse the resource directly.
+- https://ocaml.org
+  - Used for Linux kernel semantic patches
+- https://github.com/cue-lang/cue
+  - Need to play with the Cue language
+- GitHub Actions templates docs
+  - [Reusable workflows]() are identified by the presence of [`on.workflow_call`](https://docs.github.com/en/actions/using-workflows/workflow-syntax-for-github-actions#onworkflow_call). An example of a reusable workflow for container builds following the [manifest](https://github.com/intel/dffml/blob/alice/docs/arch/0008-Manifest.md) pattern can be found in the [`*build_images_containers.yml` files](https://github.com/intel/dffml/blob/main/.github/workflows/build_images_containers.yml).
+- GitHub Actions runner: support SCITT receipts on containers / actions
+- `podman`: support SCITT receipts
+- https://ariadne.space/2019/07/13/federation-what-flows-where-and-why/
+  - > most of the risks described here are mitigated by telling mastodon to use authorized fetch mode. please turn authorized fetch mode on, for your own good.
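The oEmbed lookup quoted above boils down to a single GET against a provider's endpoint. A minimal consumer sketch in Python; the YouTube endpoint used in the comment is just an illustrative assumption, not something the notes above reference:

```python
# Minimal oEmbed consumer sketch (see https://oembed.com).
# The provider endpoint is an assumption, e.g. https://www.youtube.com/oembed.
import json
import urllib.parse
import urllib.request


def oembed_request_url(endpoint: str, target_url: str, fmt: str = "json") -> str:
    """Build the oEmbed request URL asking for embed metadata for target_url."""
    query = urllib.parse.urlencode({"url": target_url, "format": fmt})
    return f"{endpoint}?{query}"


def fetch_oembed(endpoint: str, target_url: str) -> dict:
    """Fetch the embed metadata (title, html, provider_name, ...) for a resource."""
    with urllib.request.urlopen(oembed_request_url(endpoint, target_url)) as response:
        return json.load(response)
```

The `html` field of the returned object is what a site would splice into the page in place of the bare link.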
+- https://hacker.solar/books/about-this-site/page/what-is-hacker-solar
+- https://github.com/intel/cve-bin-tool/issues/2334#issuecomment-1315643093
+  - https://social.treehouse.systems/@ariadne/109365116698192103
+  - We are going to try to hybridize the authorized fetch mode with SCITT receipts and then bridge that into web5
+  - Also touched on recent OIDC verification via notary
+- Need to remove the time from the tmux status line for idle detection to work, so that it doesn't tick every second and make giant files when there is no new output other than the time
+  - https://github.com/git-pull/tao-of-tmux/blob/master/manuscript/10-scripting.md#formats-formats
+
+```console
+$ nodemon -e py --exec 'clear; nvd_api_key=$NVD_API_KEY LONG_TESTS=1 timeout 10s python3.10 -um coverage run -m pytest -v --log-level=DEBUG --log-cli-level=DEBUG test/test_nvd_api.py::TestNVD_API::test_total_results_count -k 2.0; test 1'
+...
+_______________________ TestNVD_API.test_total_results_count[2.0-feed1-stats1] _______________________
+
+self = , api_version = '2.0', feed = , stats =
+...
+> assert len(nvd_api.all_cve_entries) >= nvd_api.total_results
+E assert 0 >= 10
+...
+test/test_nvd_api.py:88: AssertionError +--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------- Captured log setup ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------- +DEBUG asyncio:selector_events.py:54 Using selector: EpollSelector +DEBUG asyncio:selector_events.py:54 Using selector: EpollSelector +-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------- Captured stdout call --------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------- +Fetching incremental metadata from NVD... ━━━━━━━━━━━━━━━━━━━━━━━━━ 0% -:--:-- +Downloading Feeds from NVD... 
━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 100% 0:00:00 +-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------- Captured stderr call --------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------- +127.0.0.1 - - [18/Nov/2022 08:38:09] "GET /?reporttype=countsbystatus HTTP/1.1" 200 - +127.0.0.1 - - [18/Nov/2022 08:38:09] "GET /2.0?startIndex=0&resultsPerPage=1 HTTP/1.1" 200 - +127.0.0.1 - - [18/Nov/2022 08:38:09] "GET /2.0?startIndex=0&resultsPerPage=2000&lastModStartDate=2022-11-16T16:36:09:895&lastModEndDate=2022-11-18T16:38:09:902 HTTP/1.1" 200 - +127.0.0.1 - - [18/Nov/2022 08:38:12] "GET /2.0?startIndex=0&resultsPerPage=2000&lastModStartDate=2022-11-16T16:36:09:895&lastModEndDate=2022-11-18T16:38:09:902 HTTP/1.1" 200 - +127.0.0.1 - - [18/Nov/2022 08:38:12] "GET /2.0?startIndex=2000&resultsPerPage=2000&lastModStartDate=2022-11-16T16:36:09:895&lastModEndDate=2022-11-18T16:38:09:902 HTTP/1.1" 200 - +---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------- Captured log call 
---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------- +INFO cve_bin_tool.NVD_API:nvd_api.py:135 Fetching metadata from NVD... +DEBUG alice.emulate.nvd.api:nvdstyle.py:158 ParseResult(scheme='', netloc='', path='/', params='', query='reporttype=countsbystatus', fragment='') +DEBUG alice.emulate.nvd.api:nvdstyle.py:163 {'reporttype': ['countsbystatus']} +DEBUG alice.emulate.nvd.api:nvdstyle.py:172 Serving stats... +INFO cve_bin_tool.NVD_API:nvd_api.py:137 Got metadata from NVD: {'Total': 10, 'Rejected': 0, 'Received': 0, 'Modified': 0, 'Undergoing Analysis': 0, 'Awaiting Analysis': 0} +INFO cve_bin_tool.NVD_API:nvd_api.py:140 self.total_results = Total: 10 - Rejected: 0 +INFO cve_bin_tool.NVD_API:nvd_api.py:144 Valiating NVD api... +DEBUG alice.emulate.nvd.api:nvdstyle.py:158 ParseResult(scheme='', netloc='', path='/2.0', params='', query='startIndex=0&resultsPerPage=1', fragment='') +DEBUG alice.emulate.nvd.api:nvdstyle.py:163 {'startIndex': ['0'], 'resultsPerPage': ['1']} +DEBUG alice.emulate.nvd.api:nvdstyle.py:240 Serving validate NVD API: start_index: 0 results_per_page: 1... 
+DEBUG alice.emulate.nvd.api:nvdstyle.py:274 Serving validate: results: {'format': 'NVD_CVE', 'resultsPerPage': 1, 'startIndex': 0, 'timestamp': '2022-11-18T08:38Z', 'totalResults': 10, 'version': '2.0', 'vulnerabilities': [{'cve': {'configurations': [{'nodes': [{'cpeMatch': [{'criteria': 'cpe:2.3:a:eric_allman:sendmail:5.58:*:*:*:*:*:*:*', 'matchCriteriaId': '1D07F493-9C8D-44A4-8652-F28B46CBA27C', 'vulnerable': True}], 'negate': False, 'operator': 'OR'}]}], 'descriptions': [{'lang': 'en', 'value': 'The debug command in Sendmail is enabled, allowing attackers to execute commands as root.'}, {'lang': 'es', 'value': 'El comando de depuración de Sendmail está activado, permitiendo a atacantes ejecutar comandos como root.'}], 'id': 'CVE-1999-0095', 'lastModified': '2019-06-11T20:29:00.263', 'metrics': {'cvssMetricV2': [{'acInsufInfo': False, 'cvssData': {'accessComplexity': 'LOW', 'accessVector': 'NETWORK', 'authentication': 'NONE', 'availabilityImpact': 'COMPLETE', 'baseScore': 10.0, 'baseSeverity': 'HIGH', 'confidentialityImpact': 'COMPLETE', 'integrityImpact': 'COMPLETE', 'vectorString': 'AV:N/AC:L/Au:N/C:C/I:C/A:C', 'version': '2.0'}, 'exploitabilityScore': 10.0, 'impactScore': 10.0, 'obtainAllPrivilege': True, 'obtainOtherPrivilege': False, 'obtainUserPrivilege': False, 'source': 'nvd@nist.gov', 'type': 'Primary', 'userInteractionRequired': False}]}, 'published': '1988-10-01T04:00:00.000', 'references': [{'source': 'cve@mitre.org', 'url': 'http://seclists.org/fulldisclosure/2019/Jun/16'}, {'source': 'cve@mitre.org', 'url': 'http://www.openwall.com/lists/oss-security/2019/06/05/4'}, {'source': 'cve@mitre.org', 'url': 'http://www.openwall.com/lists/oss-security/2019/06/06/1'}, {'source': 'cve@mitre.org', 'url': 'http://www.securityfocus.com/bid/1'}], 'sourceIdentifier': 'cve@mitre.org', 'vulnStatus': 'Modified', 'weaknesses': [{'description': [{'lang': 'en', 'value': 'NVD-CWE-Other'}], 'source': 'nvd@nist.gov', 'type': 'Primary'}]}}]} +INFO 
cve_bin_tool.NVD_API:nvd_api.py:146 Valiated NVD api +INFO cve_bin_tool.NVD_API:nvd_api.py:175 Fetching updated CVE entries after 2022-11-16T16:36:09:895 +DEBUG alice.emulate.nvd.api:nvdstyle.py:158 ParseResult(scheme='', netloc='', path='/2.0', params='', query='startIndex=0&resultsPerPage=2000&lastModStartDate=2022-11-16T16:36:09:895&lastModEndDate=2022-11-18T16:38:09:902', fragment='') +DEBUG alice.emulate.nvd.api:nvdstyle.py:163 {'startIndex': ['0'], 'resultsPerPage': ['2000'], 'lastModStartDate': ['2022-11-16T16:36:09:895'], 'lastModEndDate': ['2022-11-18T16:38:09:902']} +DEBUG alice.emulate.nvd.api:nvdstyle.py:284 Serving feed: start_index: 0 results_per_page: 2000... +DEBUG alice.emulate.nvd.api:nvdstyle.py:336 Serving feed with 10 results +INFO cve_bin_tool.NVD_API:nvd_api.py:189 Adding 10 CVE entries +DEBUG alice.emulate.nvd.api:nvdstyle.py:158 ParseResult(scheme='', netloc='', path='/2.0', params='', query='startIndex=0&resultsPerPage=2000&lastModStartDate=2022-11-16T16:36:09:895&lastModEndDate=2022-11-18T16:38:09:902', fragment='') +DEBUG alice.emulate.nvd.api:nvdstyle.py:158 ParseResult(scheme='', netloc='', path='/2.0', params='', query='startIndex=2000&resultsPerPage=2000&lastModStartDate=2022-11-16T16:36:09:895&lastModEndDate=2022-11-18T16:38:09:902', fragment='') +DEBUG alice.emulate.nvd.api:nvdstyle.py:163 {'startIndex': ['0'], 'resultsPerPage': ['2000'], 'lastModStartDate': ['2022-11-16T16:36:09:895'], 'lastModEndDate': ['2022-11-18T16:38:09:902']} +DEBUG alice.emulate.nvd.api:nvdstyle.py:163 {'startIndex': ['2000'], 'resultsPerPage': ['2000'], 'lastModStartDate': ['2022-11-16T16:36:09:895'], 'lastModEndDate': ['2022-11-18T16:38:09:902']} +DEBUG alice.emulate.nvd.api:nvdstyle.py:284 Serving feed: start_index: 0 results_per_page: 2000... +DEBUG alice.emulate.nvd.api:nvdstyle.py:284 Serving feed: start_index: 2000 results_per_page: 2000... 
+DEBUG alice.emulate.nvd.api:nvdstyle.py:336 Serving feed with 10 results +DEBUG alice.emulate.nvd.api:nvdstyle.py:336 Serving feed with 0 results +-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------- Captured log teardown -------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------- +DEBUG asyncio:selector_events.py:54 Using selector: EpollSelector +=================================================================================================================================================================================================================================================================================================================== short test summary info ============================================================================================================================================================== - +===================================================================================================================================================== +FAILED test/test_nvd_api.py::TestNVD_API::test_total_results_count[2.0-feed1-stats1] - assert 0 >= 10 +=============================================================================================================================================================================================================================================================================================================== 1 failed, 1 deselected in 6.51s 
=============================================================================================================================================================================================================================================================================================================== +[nodemon] clean exit - waiting for changes before restart +``` + +- Ah ha! Enabled debug logging because noticed we weren't seeing the + "Send Request" log client side. + +```diff +diff --git a/cve_bin_tool/log.py b/cve_bin_tool/log.py +index 85b7009..749b867 100644 +--- a/cve_bin_tool/log.py ++++ b/cve_bin_tool/log.py +@@ -30,4 +30,4 @@ logging.basicConfig( + root_logger = logging.getLogger() + + LOGGER = logging.getLogger(__package__) +-LOGGER.setLevel(logging.INFO) ++LOGGER.setLevel(logging.DEBUG) +diff --git a/cve_bin_tool/nvd_api.py b/cve_bin_tool/nvd_api.py +index 28bc102..0f82748 100644 +--- a/cve_bin_tool/nvd_api.py ++++ b/cve_bin_tool/nvd_api.py +@@ -130,14 +130,20 @@ class NVD_API: + + if not self.session: + connector = aiohttp.TCPConnector(limit_per_host=19) +- self.session = RateLimiter( +- aiohttp.ClientSession(connector=connector, trust_env=True) +- ) ++ self.session = aiohttp.ClientSession(connector=connector, trust_env=True) + + self.logger.info("Fetching metadata from NVD...") + cve_count = await self.nvd_count_metadata(self.session, self.stats) ++ self.logger.info("Got metadata from NVD: %r", cve_count) ++ ++ self.total_results = cve_count["Total"] - cve_count["Rejected"] ++ self.logger.info( ++ f'self.total_results = Total: {cve_count["Total"]} - Rejected: {cve_count["Rejected"]}' ++ ) + ++ self.logger.info("Valiating NVD api...") + await self.validate_nvd_api() ++ self.logger.info("Valiated NVD api") + + if self.invalid_api: + self.logger.warning( +@@ -180,8 +186,6 @@ class NVD_API: + progress.update(task) + progress.update(task, advance=1) + +- else: +- self.total_results = cve_count["Total"] - cve_count["Rejected"] + self.logger.info(f"Adding 
{self.total_results} CVE entries") + + async def validate_nvd_api(self): +@@ -227,7 +231,6 @@ class NVD_API: + self.logger.debug(f"Response received {response.status}") + if response.status == 200: + fetched_data = await response.json() +- + if start_index == 0: + # Update total results in case there is discrepancy between NVD dashboard and API + reject_count = ( +@@ -238,6 +241,9 @@ class NVD_API: + self.total_results = ( + fetched_data["totalResults"] - reject_count + ) ++ self.logger.info( ++ f'self.total_results = Total: {fetched_data["totalResults"]} - Rejected: {reject_count}' ++ ) + if self.api_version == "1.0": + self.all_cve_entries.extend( + fetched_data["result"]["CVE_Items"] +diff --git a/test/test_nvd_api.py b/test/test_nvd_api.py +index 91cf1fb..e7e2a96 100644 +--- a/test/test_nvd_api.py ++++ b/test/test_nvd_api.py +@@ -2,16 +2,26 @@ + # SPDX-License-Identifier: GPL-3.0-or-later + + import os ++import types + import shutil + import tempfile ++import contextlib + from datetime import datetime, timedelta + from test.utils import LONG_TESTS + + import pytest ++import aiohttp ++import httptest ++ ++import alice.threats.vulns.serve.nvdstyle + + from cve_bin_tool.cvedb import CVEDB + from cve_bin_tool.data_sources import nvd_source +-from cve_bin_tool.nvd_api import NVD_API ++from cve_bin_tool.nvd_api import ( ++ NVD_API, ++ FEED as NVD_API_FEED, ++ NVD_CVE_STATUS, ++) + + + class TestNVD_API: +@@ -42,14 +52,40 @@ class TestNVD_API: + LONG_TESTS() != 1 or not os.getenv("nvd_api_key"), + reason="NVD tests run only in long tests", + ) +- async def test_total_results_count(self): ++ @pytest.mark.parametrize( ++ "api_version, feed, stats", ++ [ ++ ( ++ "1.0", ++ httptest.Server(alice.threats.vulns.serve.nvdstyle.NVDStyleHTTPHandler), ++ httptest.Server(alice.threats.vulns.serve.nvdstyle.NVDStyleHTTPHandler), ++ ), ++ ( ++ "2.0", ++ httptest.Server(alice.threats.vulns.serve.nvdstyle.NVDStyleHTTPHandler), ++ 
httptest.Server(alice.threats.vulns.serve.nvdstyle.NVDStyleHTTPHandler), ++ ), ++ ], ++ ) ++ async def test_total_results_count(self, api_version, feed, stats): + """Total results should be greater than or equal to the current fetched cves""" +- nvd_api = NVD_API(api_key=os.getenv("nvd_api_key") or "") +- await nvd_api.get_nvd_params( +- time_of_last_update=datetime.now() - timedelta(days=2) +- ) +- await nvd_api.get() +- assert len(nvd_api.all_cve_entries) >= nvd_api.total_results ++ # TODO alice.nvd.TestHTTPServer will become either ++ # alice.nvd.TestNVDVersion_1_0 or alice.nvd.TestNVDVersion_2_0 ++ # lambda *args: alice.nvd.TestHTTPServer(*args, directory=pathlib.Path(__file__).parent) ++ with feed as feed_http_server, stats as stats_http_server: ++ async with aiohttp.ClientSession() as session: ++ nvd_api = NVD_API( ++ feed=feed_http_server.url(), ++ stats=stats_http_server.url(), ++ api_key=os.getenv("nvd_api_key") or "", ++ session=session, ++ api_version=api_version, ++ ) ++ await nvd_api.get_nvd_params( ++ time_of_last_update=datetime.now() - timedelta(days=2) ++ ) ++ await nvd_api.get() ++ assert len(nvd_api.all_cve_entries) >= nvd_api.total_results + + @pytest.mark.asyncio- + + @pytest.mark.skipif( +``` + +- Enabling debug logging resulted in the following statement being logged. + - This failure should probably be an `ERROR` level rather than `DEBUG` log. + +``` +DEBUG cve_bin_tool.NVD_API:nvd_api.py:274 Failed to connect to NVD list indices must be integers or slices, not str +``` + +- Added traceback +- Is NVD2 code needing to index? `fetched_data["vulnerabilities"][index]["cve"]`? 
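To make the suspicion above concrete: in the v2 schema `vulnerabilities` is a list of wrapper objects, each holding a `cve` record, so it has to be iterated or extended wholesale, never indexed by key. A rough sketch of the shape in question, with a hypothetical `extend_cve_entries` helper, not the actual cve-bin-tool code:

```python
# Sketch of v2 style response handling; `fetched_data` mirrors the response
# shape shown in the logs above, where "vulnerabilities" is a list.
def extend_cve_entries(all_cve_entries: list, fetched_data: dict) -> list:
    # Broken form: fetched_data["vulnerabilities"]["cve"] raises
    # TypeError: list indices must be integers or slices, not str
    # because it indexes the list by a string key. Extend with the
    # top level list instead; each entry still wraps its "cve" record.
    all_cve_entries.extend(fetched_data.get("vulnerabilities", []))
    return all_cve_entries
```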
+
+```
+ERROR cve_bin_tool.NVD_API:nvd_api.py:276 Pausing requests for 3 seconds
+DEBUG cve_bin_tool.NVD_API:nvd_api.py:277 TypeError('list indices must be integers or slices, not str')
+Traceback (most recent call last):
+  File "/home/pdxjohnny/Documents/python/cve-bin-tool/cve_bin_tool/nvd_api.py", line 254, in load_nvd_request
+    fetched_data["vulnerabilities"]["cve"]
+TypeError: list indices must be integers or slices, not str
+```
+
+- Found and fixed two issues
+  - intel/cve-bin-tool@afc4a9254683d2a7027bc6574e99d1b0d406d5bc
+    - fix(nvd_api): Align v2 rejection handling with description schema updates
+  - intel/cve-bin-tool@46cd825b126dd167158cae4f5e4ac7a32de2e08d
+    - fix(nvd_api): extend all cve entries from v2 query top level vulnerabilities key
+
+[![asciicast](https://asciinema.org/a/538712.svg)](https://asciinema.org/a/538712)
+
+- Pushed 9f0a41ad55bdc7f295c435ebd51db77e3343b915
+  - alice: threats: vulns: serve: nvdstyle: Fix serving of v2 style CVEs
+- Liquid Time-constant Networks Adaptive Online Networks
+  - https://arxiv.org/pdf/2006.04439v1.pdf
+- TODO
+  - [ ] Finish scorecard demo and integrate into shouldi
+    - Put this in "down the dependency rabbit hole" again as one of the things we put in `THREATS.md`
+  - [ ] `alice threats cicd` (`-keys https://github.com/intel/dffml`)
+    - [ ] GitHub Actions workflow analysis overlays
+      - [ ] Look for `runs-on:` and anything not GitHub hosted, then
+        check `on:` triggers to ensure pull requests aren't being run.
\ No newline at end of file
diff --git a/docs/discussions/alice_engineering_comms/0090/reply_0001.md b/docs/discussions/alice_engineering_comms/0090/reply_0001.md
new file mode 100644
index 0000000000..582e07f87a
--- /dev/null
+++ b/docs/discussions/alice_engineering_comms/0090/reply_0001.md
@@ -0,0 +1,19 @@
+## Overlays as Dynamic Context Aware Branches
+
+> TODO: more fanciful tutorial name
+
+At a minimum it's like saying when I check out this branch I want you to cherry pick these commits (semantically?)
from these other branches (and run A/B cross validation of course) and make that a sort of virtual branch where those commits are applied and still tracked as dev or in flight or just alternately sourced versions. + +- References + - https://github.com/intel/dffml/issues/1315#issuecomment-1066971630 + - Alice and Bob working on CR0/4 + - Examples of virtual branches + - Turning on debug logging while working on NVD style API for use by + cve-bin-tool (and Alice of course). + - [2022-11-18 @pdxjohnny Engineering Logs](https://github.com/intel/dffml/discussions/1406?sort=new#discussioncomment-4177910) +- TODO + - Knowledge graph of manifests with SCITT receipts + - Stream of Consciousness + - We share test results of cross validation and virtual branch node additions here + - Alice, Bob, and Eve working with three separate repos + - Cross validation comes into play here \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0091/index.md b/docs/discussions/alice_engineering_comms/0091/index.md new file mode 100644 index 0000000000..450ac1ba3b --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0091/index.md @@ -0,0 +1 @@ +# 2022-11-19 Engineering Logs \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0091/reply_0000.md b/docs/discussions/alice_engineering_comms/0091/reply_0000.md new file mode 100644 index 0000000000..96e6fff6e1 --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0091/reply_0000.md @@ -0,0 +1,5 @@ +## 2022-11-19 @pdxjohnny Engineering Logs + +- https://github.com/oras-project/oras-py + - Put it all in the container registry +- https://github.com/OpenChain-Project/Reference-Material/blob/master/Self-Certification/Checklist/Security-Assurance-1.1/en/Security-Assurance-1-1-Checklist-Version-2.md \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0092/index.md b/docs/discussions/alice_engineering_comms/0092/index.md new file mode 100644 index 
0000000000..d23aada4a9
--- /dev/null
+++ b/docs/discussions/alice_engineering_comms/0092/index.md
@@ -0,0 +1 @@
+# 2022-11-20 Engineering Logs
\ No newline at end of file
diff --git a/docs/discussions/alice_engineering_comms/0093/index.md b/docs/discussions/alice_engineering_comms/0093/index.md
new file mode 100644
index 0000000000..ed4c397bc5
--- /dev/null
+++ b/docs/discussions/alice_engineering_comms/0093/index.md
@@ -0,0 +1 @@
+# 2022-11-21 Engineering Logs
\ No newline at end of file
diff --git a/docs/discussions/alice_engineering_comms/0093/reply_0000.md b/docs/discussions/alice_engineering_comms/0093/reply_0000.md
new file mode 100644
index 0000000000..0dad087e1f
--- /dev/null
+++ b/docs/discussions/alice_engineering_comms/0093/reply_0000.md
@@ -0,0 +1,93 @@
+## 2022-11-21 @pdxjohnny Engineering Logs
+
+- https://github.com/CrunchyData/pg_eventserv
+  - `FROM` rebuild chain pdxjohnny/dffml-operations-dockerhub@a738c35199afe82d8a35d97ce16711c6f19785c5
+- Going through old repos to look for logcat server
+  - Found a bunch of code I forgot I wrote that is referenced in the Alice thread as deps
+    - https://github.com/pdxjohnny/webrtcvpn
+    - https://github.com/pdxjohnny/diffstream
+    - https://github.com/pdxjohnny/telem/blob/8676810086c732e1a738ce58a6296993f7a87661/client/c/encrypt.c
+    - https://github.com/pdxjohnny/hack
+      - Looks like this packs shellcode for `exec` system calls on Linux
+      - [![hack-the-planet](https://img.shields.io/badge/hack%20the-planet-blue)](https://github.com/intel/dffml/blob/alice/docs/tutorials/rolling_alice/0000_easter_eggs.md#hack-the-planet-)
+      - Ref shim
+    - https://github.com/pdxjohnny/freeze-tool/tree/master/logger
+      - Stream logging / cross this with the telemetry one
+    - https://github.com/pdxjohnny/video_chat/blob/master/image_video.js#L95
+      - This comes in handy with VNC over PNG/JPEG etc.
when massive hax are required + - https://github.com/pdxjohnny/pysync/blob/master/sync.py + - :grimacing: (cve-bin-tool vlcn-io/cr-sqlite below in TODO, been at this a while too) +- https://github.com/oras-project/oras-py + - https://github.com/opencontainers/distribution-spec + - Inventory? + - https://github.com/opencontainers/distribution-spec/blob/main/spec.md#enabling-the-referrers-api + - https://github.com/intel/dffml/pull/1207#discussion_r1026981623 + - Stream of Consciousness? + - Might already have websub or equivalent, implementation / ratification status was unclear, dig more / investigate Open Architecture encoded (autocodec, multiformat, shim, custom basic, unencoded json, etc.) callback enabling. + - OCI distribution spec all the things + - Python packages + - SBOM + - VEX + - SCITT + +![OCI distribution spec all the things meme](https://user-images.githubusercontent.com/5950433/203143783-b7f9e731-80bd-42c7-b97d-410d62676758.png) + +- Last Friday pushed alice: threats: vulns: serve: nvdstyle: Fix serving of v2 style CVEs - 9f0a41ad55bdc7f295c435ebd51db77e3343b915 + - We can now start serving threats! + - Need to finish out the contribution to CVE Binary Tool first + - https://github.com/intel/cve-bin-tool/issues/2334#issuecomment-1315643093 +- Found Distributed Android Testing pre-squash real initial webhook commit + - Jul 27, 2015 - 7130e89473f12353f19afb935802b065759be571 + - > A webserver to receive json web hooks from gitlab_webhooks + > The hooks are dealt with by calling the corresponding function in + > hooks.py. For example a push is received so the function push in + > hook.py is called and passed the hook data. + - Well friends, it's only been 2,674 days since our first commit down CI lane. + - Next step is we enable offline, offline CI that is, we'll knit together our + Data, Analysis, Control (DAC, aka Digital Analog Converter ;) loop that will + get our software lifecycle analysis going.
We're going to look at the supply + chain of the thoughts (adding / using a dependency is a thought, it might also + be a thought you took action on). You are what you EAT and same goes for software! + Our analysis of the supply chains to our trains of thought seen within the + software lifecycle is analogous to the software project as the entity and our + analysis of what it's EATing is an analysis of its digestion of those thoughts. + Okay I think I wrote this somewhere else and I'm not having success explaining + right now. It's also not so much offline CI as parity across environments, enabling + context (process, workflow, DX) aware application of policy / config / logic. + Aka the intermediate representation and the analysis pattern allow for translation. + As we get more advanced we'll be leveraging (and implementing) our cross domain + conceptual mapping (grep thread) techniques to translate these applications ad-hoc + as feasibility and need allow, and our EAT wheel will start turning. + - [WIP: Rolling Alice: Coach Alice: You are what you EAT!](https://github.com/intel/dffml/discussions/1406?sort=new#discussioncomment-3885559) + - This offline digestion is important to enable us to give Alice to developers + and help her sit side by side to help them. Today we focus on vulns, and + security patches (version bumps?, `safety` check? - https://intel.github.io/dffml/main/shouldi.html#use-command). Tomorrow might be linting + (`yamllint` for GitHub Actions). + - Using the NIST NVD style API we now have, we can begin to issue events over that + stream. + - These events will be the communication of Alice's thoughts and actions, her + development activity. We'll of course incrementally introduce overlays which + increase sophistication of activities and intricacy of communications and + triggers.
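As a concrete sketch of consuming that NVD style event stream: the helper below pulls CVE IDs out of a response shaped like the public NVD API v2 JSON (a top-level `vulnerabilities` array of `cve` objects). The sample payload itself is made up for illustration; field names follow the NVD v2 response shape.

```python
import json


def cve_ids(response: dict) -> list:
    """Extract CVE IDs from an NVD API v2 style response body."""
    return [entry["cve"]["id"] for entry in response.get("vulnerabilities", [])]


# Made-up sample payload following the NVD API v2 response shape
sample = json.loads("""
{
  "resultsPerPage": 1,
  "vulnerabilities": [
    {"cve": {"id": "CVE-2022-0000",
             "descriptions": [{"lang": "en", "value": "Example description"}]}}
  ]
}
""")
print(cve_ids(sample))  # ['CVE-2022-0000']
```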
+- TODO + - [ ] For the meeting recording to markdown / rST we need to also screenshot if there is a deck presented + - [ ] Contribute NVDStyle pieces to cve-bin-tool as needed for https://github.com/intel/cve-bin-tool/issues/2334#issuecomment-1315643093 + - [ ] SCITT receipts for each CVE (attached as separate record? attached within? wrapped?) + - [ ] [download_nvd](https://github.com/pdxjohnny/download_nvd) but somehow hybridized with https://github.com/vlcn-io/cr-sqlite for conflict free resolution deltas on the CVE Binary Database. + - Or maybe go the bzdiff route + - [ ] Finish scorecard demo and integrate into shouldi + - Put this in down the dependency rabbit hole again as one of the things we put in `THREATS.md` + - [ ] `alice threats cicd` (`-keys https://github.com/intel/dffml`) + - [ ] GitHub Actions workflow analysis overlays + - [ ] Look for `runs-on:` and anything not GitHub hosted, then + check `on:` triggers to ensure pull requests aren't being run. + - https://github.com/intel/dffml/issues/1422 + - [ ] Output to JSON source (so long as we derive from `RunRecordSet` we'll be done with this) + - [ ] Have NVDStyle server take source as input/config so that we can point it at the discovered vulns + - [ ] Track https://github.com/intel/cve-bin-tool/issues/2320#issuecomment-1303174689 + in relation to `policy.yml` + - https://github.com/intel/dffml/tree/alice/docs/tutorials/rolling_alice/0000_architecting_alice#what-is-alice + - [ ] `alice please log todos -source static=json dynamic=nvdstyle` + - [ ] Implement source for reading from NVDStyle API (op source for single function prototype?) + - [ ] Enable creation of TODOs by overlaying operations which take the feature data as inputs (use dfpreprocess?)
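For the "have NVDStyle server take source as input/config" item, a rough stand-in for the read path; this is not the real DFFML `Source` API, just a minimal sketch, and the on-disk layout (package name mapped to a list of CVE IDs) is hypothetical.

```python
import json
import pathlib
import tempfile


def load_vuln_records(path: str) -> dict:
    """Read a JSON file of discovered vulns keyed by package name.

    Stand-in for a real source implementation; the layout here
    (package -> list of CVE IDs) is hypothetical. Deduplicates
    and sorts the CVE IDs per package.
    """
    data = json.loads(pathlib.Path(path).read_text())
    return {pkg: sorted(set(cves)) for pkg, cves in data.items()}


# Example usage with a temporary file standing in for discovered vulns
with tempfile.NamedTemporaryFile("w", suffix=".json", delete=False) as fileobj:
    json.dump({"libfoo": ["CVE-2022-1111", "CVE-2022-1111"]}, fileobj)

print(load_vuln_records(fileobj.name))  # {'libfoo': ['CVE-2022-1111']}
```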
\ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0094/index.md b/docs/discussions/alice_engineering_comms/0094/index.md new file mode 100644 index 0000000000..f100715562 --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0094/index.md @@ -0,0 +1 @@ +# 2022-11-22 Engineering Logs \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0094/reply_0000.md b/docs/discussions/alice_engineering_comms/0094/reply_0000.md new file mode 100644 index 0000000000..189fa28b12 --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0094/reply_0000.md @@ -0,0 +1,91 @@ +## 2022-11-22 @pdxjohnny Engineering Logs + +- https://www.science.org/doi/10.1126/science.ade9097 + - Some people did the diplomacy civ style thing + - grep `docs/arch/alice/discussion` thread + - https://youtu.be/u5192bvUS7k + - https://twitter.com/ml_perception/status/1595070353063424000 +- Rebased in cve-bin-tool@main to [nvd_api_v2_tests](https://github.com/pdxjohnny/cve-bin-tool/compare/nvd_api_v2_tests) in pursuit of https://github.com/intel/cve-bin-tool/issues/2334 + +[![asciicast](https://asciinema.org/a/539495.svg)](https://asciinema.org/a/539495) + +- https://github.com/OR13/didme.me/issues/18 + - https://github.com/intel/dffml/blob/alice/docs/tutorials/rolling_alice/0000_architecting_alice/0007_an_image.md +- https://twitter.com/tlodderstedt/status/1592641414504280064 + - https://openid.net/openid4vc/ + - OpenID for Verifiable Credentials (OpenID4VC) + - https://www.slideshare.net/TorstenLodderstedt/openid-for-verifiable-credentials-iiw-35 + - https://openid.bitbucket.io/connect/openid-connect-self-issued-v2-1_0.html#name-sharing-claims-eg-vc-from-s + - The following quotes are applicable to our DFFML CI/CD setup. + We care about static analysis results and stuff (`alice shouldi`), + for example auth of our runners (grep OSS scanning) and artifacts + to push data to `data.chadig|nahdig.com` and then to the OpenSSF. 
+ - Ideally our data structures are self identifying and authing (UCAN, ATP, etc.) + - We still need bridges into existing identity and auth infra + - [DID + HSM Supply Chain Security Mitigation Option](https://github.com/intel/dffml/tree/alice/docs/arch/0007-A-GitHub-Public-Bey-and-TPM-Based-Supply-Chain-Security-Mitigation-Option.rst) + - https://www.youtube.com/clip/Ugkxf-HtFY6sR_-EnGGksIik8eyAKQACE0_n?list=PLtzAOVTpO2jaHsS4o-sDzDyHEug-1KRbK + - Vision: Reducing Overhead via Thought Communication Protocol + - https://github.com/intel/dffml/blob/alice/docs/tutorials/rolling_alice/0000_architecting_alice/0005_stream_of_consciousness.md + - [2022-10-15 Engineering Logs: Rolling Alice: Architecting Alice: Thought Communication Protocol Case Study: DFFML](https://github.com/intel/dffml/discussions/1406?sort=new#discussioncomment-3883683) + - The video this was clipped from was linked in the commit message https://github.com/intel/dffml/commit/fc42d5bc756b96c36d14e7f620f9d37bc5e4a7fd + - Found the previous stream of consciousness aligned with this. I had been meaning to look for it, we'll be back in this train of thought when we get to didme.me "An Image" python implementation. + - https://www.youtube.com/watch?v=9y7d3RsXkbA&list=PLtzAOVTpO2jaHsS4o-sDzDyHEug-1KRbK + - > [2.4. ](https://openid.bitbucket.io/connect/openid-connect-self-issued-v2-1_0.html#section-2.4)[Sharing Claims (e.g. VC) from Several Issuers in One Transaction](https://openid.bitbucket.io/connect/openid-connect-self-issued-v2-1_0.html#name-sharing-claims-eg-vc-from-s) +When End-Users apply to open a banking account online, in most countries, they are required to submit scanned versions of the required documents. These documents are usually issued by different authorities, and are hard to verify in a digital form. 
A Self-issued OP directly representing the End-User may have access to a greater set of such information for example in the format of Verifiable Credentials, while a traditional OP may not have a business relationship which enables access to such a breadth of information. Self-Issued OPs could aggregate claims from multiple sources, potentially in multiple formats, then release them within a single transaction to a Relying Party. The Relying Party can then verify the authenticity of the information to make the necessary business decisions. + - https://openid.net/wordpress-content/uploads/2022/06/OIDF-Whitepaper_OpenID-for-Verifiable-Credentials-V2_2022-06-23.pdf + - > OpenID Connect, a protocol that enables deployment of federated Identity at scale, was built with User-Centricity in mind. The protocol is designed so that the Identity Provider releases the claims about the End-User to the Relying Party after obtaining consent directly from an EndUser. This enables Identity Providers to enforce consent as the lawful basis for the presentation based on the Relying Party’s privacy notice. The protocol also enables two kinds of Identity Providers, those controlled by the End-Users and those provided by the third parties. Now, User-Centricity is evolving to grant the End-Users more control, privacy and portability over their identity information. Using OpenID for Verifiable Credentials protocols, the End-Users can now directly present identity information to the Relying Parties. This empowers the EndUsers to retain more control over the critical decisions when and what information they are sharing. Furthermore, the End-Users’ privacy is preserved since Identity Providers no longer know what activity the End-Users are performing at which Relying Party. End-Users also gain portability of their identity information because it can now be presented to the Relying Parties who do not have a federated relationship with the Credential Issuer. 
Then the technical details of OpenID4VC are presented, alongside an explanation of certain decision choices that were made, such as why OpenID Connect, and OAuth 2.0 are well-suited as basis for presentation and issuance protocols for verifiable credentials. Finally, the whitepaper concludes by reiterating the importance of making choices for standards that meet certain use-cases in order to realize a globally interoperable verifiable credentials ecosystem. Achieving large-scale adoption of verifiable credentials will be "by Evolution, not by Revolution". The identity community can more swiftly empower people, and government authorities developing identity infrastructure and policies, by adopting standards like OpenID4VC that facilitate convergence and interoperation of existing and emerging standards. +- https://vos.openlinksw.com/owiki/wiki/VOS/VOSIntro +- https://github.com/OpenLinkSoftware/OSDS_extension +- https://hobbit-project.github.io/ +- https://youtube.com/clip/Ugkxf-HtFY6sR_-EnGGksIik8eyAKQACE0_n + - Vision: Reducing Overhead via Thought Communication Protocol +- https://cloud.hasura.io/public/graphiql?header=content-type:application/json&endpoint=https://api.graphql.jobs +- We're working on fixing the CI right now + - The vuln serving `NVDStyle` is our base for comms right now (think manifests) + - https://github.com/intel/dffml/blob/alice/docs/arch/0008-Manifest.md + - This is how we will be facilitating Continuous Delivery. + - Open source projects will implement vuln stream handling, we are + hopefully piggy backing our `FROM` rebuild chain and so forth on top, + once again, we're always looking for reliable resilient ubiquitously + available comms. Reuse, reuse, reuse. 
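Since manifests are the comms base here, a tiny sketch of what validating a manifest instance against its declared format could look like. The `$schema` URL naming convention used below (format name plus semver before `.schema.json`) is an assumption for illustration, not the Manifest ADR's exact wording.

```python
import re

# Hypothetical convention: the $schema URL ends in
# <format-name>.<major>.<minor>.<patch>.schema.json
SCHEMA_RE = re.compile(r"/([a-z0-9-]+)\.(\d+)\.(\d+)\.(\d+)\.schema\.json$")


def parse_manifest_schema(manifest: dict) -> tuple:
    """Return (format_name, (major, minor, patch)) from a manifest's $schema."""
    match = SCHEMA_RE.search(manifest["$schema"])
    if match is None:
        raise ValueError("unrecognized $schema URL: " + manifest["$schema"])
    name, *version = match.groups()
    return name, tuple(int(part) for part in version)


# Hypothetical manifest instance for a vuln stream
manifest = {
    "$schema": "https://example.com/vuln-stream.0.0.1.schema.json",
    "cves": ["CVE-2022-2222"],
}
print(parse_manifest_schema(manifest))  # ('vuln-stream', (0, 0, 1))
```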
+- https://github.com/intel/dffml/issues/1421 +- Found some meetups to share Alice with +- https://www.meetup.com/rainsec/events/289349686/ + - > RainSec - PDX Information Security Meetup: RainSec is an informal group of like-minded security professionals who meet to network and discuss topics of interest in a non-work, non-vendor setting. While our target audience is experienced information security professionals, this is a public event open to any interested parties. If you have a friend or colleague who might benefit, please pass an invite along. +- https://www.meetup.com/hardware-happy-hour-3h-portland/events/289759128/ + - > Hardware Happy Hour is an informal way to socialize, show off your projects, and talk about the world of hardware. +- https://www.meetup.com/ctrl-h/events/282093316/ + - > Dorkbot PDX (Virtual): Part virtual hackathon, part virtual geek social, these virtual biweekly meetings are a time for you to virtually join others for insight, inspiration or just insanity. + - https://app.gather.town/app/1KLgyeL4yGzBeCAL/dorkbot +- https://app.gather.town/app + - UX wow. Landing into anon profile allowing actions / creation. Love it. +- https://mastodon.online/@rjh/109388793314837723 + - > nsrllookup.com is back online after a long pandemic-related hiatus. If you need to sort wheat from chaff for large volumes of data, try removing every piece of data in NIST's collection. + > + > Many thanks to [@warthog9](https://mastodon.social/@warthog9@social.afront.org) for hosting nsrllookup.com all these years. :) + - https://github.com/rjhansen/nsrlsvr + - We should hybridize this with SCITT receipts returned for the content addresses, let's use SHA384 or something stronger + - https://mastodon.online/@rjh/109388812626470845 + - Let's use this hybrid with the NVDStyle API, or perhaps let's wait (8 minutes ago, Chaos smiles on us again ;) Really we should stick with OCI registry on our first pass here. + - > Work on version 2 of nsrllookup is well underway.
When I originally developed it, I elected to write my own very simple wire protocol. Although it still works fine, it means whenever I want to write a binding for a new programming language I have to rewrite the parser-generator. + > + > Version 2, currently underway, moves to gRPC. This should make it much easier to integrate with third-party tools like Autopsy. +- Random LOL + - Architecting Alice: Volume 0: Context: Part 1: Where are we: YouTube's automated captions: "Intro, The Plan, Alice, Chaos, Nested Virtualization" + - Hit the nail on the head with that one ;P + +[![Architecting Alice: Volume 0: Context: Part 1: Where are we: YouTube's automated captions LOL: "Intro, The Plan, Alice, Chaos, Nested Virtualization"](https://user-images.githubusercontent.com/5950433/203405118-91f1d2d8-a9f7-42e8-a468-d984e7f7d7ae.png)](https://www.youtube.com/watch?v=dI1oGv7K21A&list=PLtzAOVTpO2jaHsS4o-sDzDyHEug-1KRbK) + +- https://docs.velociraptor.app/ +- https://www.thc.org/segfault/ + - https://github.com/hackerschoice/segfault + - Stoooooked +- https://www.thc.org +- https://www.gsocket.io/ + - Doooooooooope + - Let's see if there's a cross with DERP here, Wireguard is probably involved. 
+ - > [![gsocket-asciicast](https://asciinema.org/a/lL94Vsjz8JM0hCjnfKM173Ong.svg)](https://asciinema.org/a/lL94Vsjz8JM0hCjnfKM173Ong) +- https://github.com/vanhauser-thc/ +- TODO + - [ ] Finish https://github.com/intel/cve-bin-tool/issues/2334 + - https://github.com/intel/cve-bin-tool/pull/2384 \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0095/index.md b/docs/discussions/alice_engineering_comms/0095/index.md new file mode 100644 index 0000000000..f6fc7f8c97 --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0095/index.md @@ -0,0 +1 @@ +# 2022-11-23 Engineering Logs \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0095/reply_0000.md b/docs/discussions/alice_engineering_comms/0095/reply_0000.md new file mode 100644 index 0000000000..2b10eb811d --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0095/reply_0000.md @@ -0,0 +1,94 @@ +## 2022-11-23 @pdxjohnny Engineering Logs + +- [alice: threats: cicd: github: workflow: Check for curl -k #1423](https://github.com/intel/dffml/issues/1423) +- [alice: threats: cicd: github: workflow: Guess at if input should be passed as secret #1424](https://github.co/intel/dffml/issues/1424) +- Alice, what entities are working on aligned trains of thought + - Assumes current context + - Could also specify train of thought via DID or petname or shortref or whatever + - Overlap in architecture heatmaps + - Overlap in conceptual upleveling + - Add in related todos (GitHub issues Anthony has been working on NVD APIv2 related) + - Graphs are fun + - [WIP Rolling Alice: ?: ? 
- Working Title: Overlays as Dynamic Context Aware Branches](https://github.com/intel/dffml/discussions/1406?sort=new#discussioncomment-4180716) + - [2022-10-15 Engineering Logs: Rolling Alice: Architecting Alice: Thought Communication Protocol Case Study: DFFML](https://github.com/intel/dffml/discussions/1406?sort=new#discussioncomment-3883683) + +![meme-anarchy-elmo-knowledge-graphs-for-the-Chaos-God](https://user-images.githubusercontent.com/5950433/203634346-111c884d-0f95-4066-addf-dbfbaeda4910.png) + +```console +$ git clone https://github.com/pdxjohnny/cve-bin-tool -b nvd_api_v2_tests +$ cd cve-bin-tool +$ alice please tell me who is working on aligned trains of thought +anthonyharrison +$ alice please create state of the art virtual branch from those contributors and myself +... runs cherry-picking cross validation / A/B feature flag testing the commits ... +... cached state from team active dev sessions, CI, etc. via active overlays ... +... which means this could be no-exec, pure static eval and creation based off ... +... cherry-picks and their graph linked test results, see Zephyr recent stuff ... +$ echo As mentioned to Andy, this allows multiple devs to iterate in parallel. +$ echo The metric data coming out of this also facilitates our EAT wheel turning. +$ echo Data via context aware overlays (local dev, cloud dev, CI/CD) is available +$ echo for offline/online/aggregate Data, Analysis, Control across ad-hoc orgs. +$ echo Entities can then configure rewards for aligned work and policies around +$ echo qualifications, compute contract negotiation, etc. (grep discussion). +``` + +- https://github.com/intel/dffml/pull/1401/commits/37ea7855ec88ad804724be662a7963d2af481304 + - `docs: tutorials: rolling alice: architecting alice: introduction and context: Mention the scary part` + - It [AGI entities] will also have concepts "larger" than our own, we need to make sure +it does not manipulate us in ways we don't even understand. + - How?
+ - Genericizing Conceptual Upleveling + - Data Provenance (+ ^) + - Context Aware Trust Chains + - [Architecting Alice: Volume 0: Context: Part 14: Cross Domain Conceptual Mapping to Reach Equilibrium](https://www.youtube.com/watch?v=A-S9Z684o4Y&list=PLtzAOVTpO2jaHsS4o-sDzDyHEug-1KRbK) +- Some interesting potentially aligned trains of thought found via https://blueprint.bryanjohnson.co/ + - Related + - [2022-11-06 @pdxjohnny Engineering Logs: EDEN v0.0.2 draft](https://github.com/intel/dffml/discussions/1406?sort=new#discussioncomment-4068656) + - [2022-11-13 @pdxjohnny Engineering Logs: Alice ASAP](https://github.com/intel/dffml/discussions/1406?sort=new#discussioncomment-4128767) + - https://medium.com/future-literacy/my-goal-alignment-problem-d90e0c14b717 + - > There are many versions of you constantly competing for dominance in achieving their own goals. Frequently opposing one another. The texture of their goals varies according to the time of day, what you last ate, and how you slept the night before, among other things. Trying to accurately predict the goals of your future selves is elusive at best. Meanwhile, you do your best to smooth over these differences and pretend as though there is a singular unified you with fixed goals. We all do. + - Our parallel conscious states + - https://github.com/intel/dffml/blob/alice/docs/tutorials/rolling_alice/0000_preface.md#rolling-alice + - https://github.com/intel/dffml/blob/alice/docs/tutorials/rolling_alice/0000_architecting_alice/0002_shes_ariving_when.md + - https://medium.com/future-literacy/autonomous-self-fe2dfa755b74 + - > Why I Care: The Future of Intelligent Life + > + > Feeling great is alone worth the effort but my greater interest in the Autonomous Self is in trying to figure out a path to the future of being human. My primary hypothesis: Our future existence requires that we level ourselves up as a species, and at the fastest evolutionary speed in history. 
To do this, we need to free ourselves of the costly metabolic things we do today, such as rote or biased decision making and logistics management around solvable things such as sleep and biomarker-based diet, exercise, or lifestyle. Leveling us up to spend our precious time and energy to explore the frontiers of being human rather than things we know how to do efficiently. What will happen? + > + > It’s hard to imagine what our minds will do with a new abundance of energy, but we have a precedent: Fire. Fire freed our ancestors from certain caloric and dietary restrictions, which opened up energy — i.e. metabolism/time — for little things like language and society as we know it to develop. I believe a fully Autonomous Self will open up, again, just as much energy. One can only dare imagine what we will do with it. We will have the opportunity to develop new industries, discover original uses of the mind, make iterations of governance and economics, and explore the goal alignment problem within ourselves, between each other, and with AI. + > + > How far away is this? It’s already begun. 
+ > + > Inner Space Exploration +- https://w3c.github.io/dpv/examples/#E0027 + - Let's try to mess with this linked data wise after we finish out the NIST NVD Style tests +- https://mobile.twitter.com/DrJimFan/status/1595459499732926464 + - https://github.com/MineDojo/MineDojo + - https://arxiv.org/pdf/2211.10435.pdf +- Prophecy still being fulfilled (no surprises here) + - PAL: PROGRAM-AIDED LANGUAGE MODELS + - Program of Thoughts Prompting: Disentangling Computation from Reasoning for Numerical Reasoning Tasks + - https://wenhuchen.github.io/images/Program_of_Thoughts.pdf +- TODO + - [ ] Circle back with Harsh + - [ ] Integrate old shouldi code for him to build off + - [ ] Update [Down the Dependency Rabbit-Hole Again](https://github.com/intel/dffml/blob/alice/docs/tutorials/rolling_alice/0001_coach_alice/0001_down_the_dependency_rabbit_hole_again.md) + - [ ] Do the NVD Style with pytss (mock the vulns if you have to, swap the `Source`), + - [ ] Source (OpSource?) for static file defining all VEX + - [ ] CVE Binary Tool update to output VEX + - [ ] Dump in mock data if we can't find any vulns (could try building with old containers, be sure to build off hashes / SHA values / resolved tags) + - [ ] `alice please contribute vuln response -source mynvdstyleserver=nvdstyle` to bump container build version or something. + - [ ] Add in Harsh's work and then also leverage `alice shouldi use` (Python `safety` operations / overlays) + - [ ] `alice please contribute vuln response` to bump python version or run a tool that knows how to do that, the point is VEX in (with SCITT receipts), dispatch (manifest instances) for patches (or just the patches themselves, the operation and parameter set pair used for dispatch is the manifest instance, is the data in the open linage data hop) + - [ ] Translate this basic static file local vuln finding and remediation + into CI/CD specific to our GitHub Actions setup. 
+ - This is our POC of downstream validation between projects (our + stream of consciousness, our continuous delivery). + - https://github.com/intel/dffml/blob/alice/docs/tutorials/rolling_alice/0000_architecting_alice/0005_stream_of_consciousness.md + - This is also what helps enable us to declare "2nd party plugins", + our "ordained" sets of plugins which meet some qualifications. + - [ ] [2022-04-18 1:1 John/John - LTM and DFFML: Andersen to implement caching](https://github.com/intel/dffml/discussions/1368#discussioncomment-2599017) :grimacing: + - [ ] For Vol 3: The other entities you are around can expand or close your consciousness [Danica] + - [ ] During reflection (vol 2, 4,5?) we can look into things and see what we used to see as binary we can see through later cross domain conceptual mapping and feature extraction through a new lens (different overlayed strategic plans) + - [x] Thread backup + - https://gist.github.com/pdxjohnny/928c6ae9bd757940299732c5fcb4c8ac \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0096/index.md b/docs/discussions/alice_engineering_comms/0096/index.md new file mode 100644 index 0000000000..3e38877159 --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0096/index.md @@ -0,0 +1 @@ +# 2022-11-24 Engineering Logs \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0097/index.md b/docs/discussions/alice_engineering_comms/0097/index.md new file mode 100644 index 0000000000..6211298157 --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0097/index.md @@ -0,0 +1 @@ +# 2022-11-25 Engineering Logs \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0097/reply_0000.md b/docs/discussions/alice_engineering_comms/0097/reply_0000.md new file mode 100644 index 0000000000..c76a72bae0 --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0097/reply_0000.md @@ -0,0 +1,7 @@ +## 2022-11-25 @pdxjohnny Engineering Logs + +-
End-to-end Algorithm Synthesis with Recurrent Networks: Logical Extrapolation Without Overthinking + - https://arxiv.org/abs/2202.05826 + - Tom Goldstein: https://twitter.com/tomgoldsteincs/status/1596210087479345152 + - > Strangely, the network has also learned an error correcting code. If we corrupt the net's memory when it's halfway done, it will always recover. If we change the start/end point after the maze is solved, it draws the new solution in one shot with no wrong turns (shown below). + - > [![ecc-on-alg-synth-thumbnail](https://user-images.githubusercontent.com/5950433/204303675-6a476410-5f5c-4fdc-88ba-89222fc65df3.png)](https://user-images.githubusercontent.com/5950433/204303194-b308ff58-a1a0-4715-b109-5739fc4e2474.mp4) \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0098/index.md b/docs/discussions/alice_engineering_comms/0098/index.md new file mode 100644 index 0000000000..aaaf57e235 --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0098/index.md @@ -0,0 +1 @@ +# 2022-11-26 Engineering Logs \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0099/index.md b/docs/discussions/alice_engineering_comms/0099/index.md new file mode 100644 index 0000000000..178a92e1a4 --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0099/index.md @@ -0,0 +1 @@ +# 2022-11-27 Engineering Logs \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0099/reply_0000.md b/docs/discussions/alice_engineering_comms/0099/reply_0000.md new file mode 100644 index 0000000000..3b56f47dc3 --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0099/reply_0000.md @@ -0,0 +1,24 @@ + ## 2022-11-27 1:1 Tom/John + +- Smart planter + - https://github.com/pdxjohnny/smartplanter + - https://github.com/pdxjohnny/tl + - https://github.com/pdxjohnny/smartplanteresp + +![image](https://user-images.githubusercontent.com/5950433/204325389-96bbe2e5-9b99-4819-80e8-359b9ef6bb58.png) + + +- 
https://spinoff.nasa.gov/indoor-farming +- Federated urban fish farms? Supply chain management is critical, can of salmon for "grandma" example + - Streams of risk tolerances with forecasting (and risk on forecasts) + - Peer to peer + - Our aggregate barter + - Reuse infra where available + - Food safe cylindrical + - Containers + - Tom: Don't forget to think outside the box! (huh-HUH!) +- Do software first! How do we start to think about this expiration of resource use case? CVE lifetime? Oooooh I like that + - Do the CVEs / codebase / time model to do rough prediction (cvedetails) + - "red card pull" + - Ping Geremy **AFTER** you do this, stop bugging him until you fixed the CI and have something that he can play with! + - Down the Dependency Rabbit Hole Again \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0099/reply_0001.md b/docs/discussions/alice_engineering_comms/0099/reply_0001.md new file mode 100644 index 0000000000..8e4e1c16d6 --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0099/reply_0001.md @@ -0,0 +1,23 @@ +## 2022-11-27 @pdxjohnny Engineering Logs + +- https://github.com/IntelAI/models/releases/tag/v2.9.0 + - > Supported Frameworks + > - Intel® Optimizations for TensorFlow v2.10.0 + > - PyTorch v1.13.0 and Intel® Extension for PyTorch v1.13.0 + > - Intel® Extension for PyTorch v1.10.200+gpu + > - Intel® Extension for TensorFlow v1.0.0 + > + > New models + > - PyTorch AphlaFold2 + > - New precisions BF32 and FP16 for PyTorch BERT Large + > + > New features + > + > - dGPU support for Intel® Data Center GPU Flex Series using Intel® Extension for PyTorch v1.10.200+gpu and Intel® Extension for TensorFlow v1.0.0 + > - Intel® Neural Compressor Int8 quantized models support for TensorFlow image recognitions topologies (ResNet50, ResNet101, MobileNet v1, Inception V3) + > - Add support for running TensorFlow and PyTorch inference on Windows client + > - Add support for running models on Ubuntu 22.04 + > - Updated
Transfer Learning Jupyter notebooks +- TODO + - [ ] Alice, wrap and distribute as PyPI packages all pretrained models from IntelAI/models + - Automated package creation code for on demand packages: https://github.com/intel/dffml/blob/1513484a4bf829b86675dfb654408674495687d3/dffml/operation/stackstorm.py \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0100/index.md b/docs/discussions/alice_engineering_comms/0100/index.md new file mode 100644 index 0000000000..840756a0a4 --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0100/index.md @@ -0,0 +1,5 @@ +# 2022-11-28 Engineering Logs + +- TODO + - [ ] Move this thread to something that doesn't choke machines on load (i.e. Laptop, Phone, etc.) + - grep thread render \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0100/reply_0000.md b/docs/discussions/alice_engineering_comms/0100/reply_0000.md new file mode 100644 index 0000000000..bcdfb9a436 --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0100/reply_0000.md @@ -0,0 +1,92 @@ +## 2022-11-28 @pdxjohnny Engineering Logs + +- https://github.com/pdxjohnny/use-cases/commit/36b4578a8ae7978f55c10e4e0a2eabd88788da27 +- Reminder (on/off chain smart contracts! ref: https://github.com/intel/dffml/blob/alice/docs/arch/0009-Open-Architecture.rst it sounds block chainy but it's just a cryptographically linked list created ad-hoc with your neighbors! [grand scale speaking ;]) + - https://github.com/intel/dffml/blob/c7dc8985fdde61459017d3fb39cb19de1f7ece2b/docs/arch/0009-Open-Architecture.rst#L32-L36 +- From 2022-11-17 Mark Foster on Twitter https://twitter.com/mfosterio/status/1593094082838290433 + - > Proof of Trust On The Internet (https://futureinternet.io) + > + > We are seeing repeats of behavior on centralized false ledger systems.
+ > + > I’ve had so many people calling me and asking about verification of decentralized ledgers ever since the fiasco of FTX and how to create systems to prevent Fraud. + > + > We should utilize cryptographic Merkle data structure proofs with open vocabularies to verify ownership, control of data and the internet of value (IOV) + > + > - Presentation Exchange DIF Foundation + > - https://identity.foundation/presentation-exchange/ + > - Linked Open Vocabularies + > - https://schema.org/InvestmentOrDeposit + > - Web Authentication binded to a Human Input Device (HID) like a finger print scanner on your phone + > - w3.org/TR/webauthn-2/ + > - Verifiable Credential W3C Recommendation + > - https://www.w3.org/TR/vc-data-model + > - Merkle Tree DAG CIDs + > - https://docs.ipfs.tech/concepts/merkle-dag/ + > - > A Merkle DAG is a DAG where each node has an identifier, and this is the result of hashing the node's contents - any opaque payload carried by the node and the list of identifiers of its children - using a cryptographic hash function like SHA256. This brings some important considerations: + > > - Merkle DAGs can only be constructed from the leaves, that is, from nodes without children. Parents are added after children because the children's identifiers must be computed in advance to be able to link them. + > > - Every node in a Merkle DAG is the root of a (sub)Merkle DAG itself, and this subgraph is contained in the parent DAG. + > > - Merkle DAG nodes are immutable. Any change in a node would alter its identifier and thus affect all the ascendants in the DAG, essentially creating a different DAG. Take a look at this helpful illustration using bananas (opens new window)from our friends at Consensys. + > > + > > Merkle DAGs are similar to Merkle trees, but there are no balance requirements, and every node can carry a payload. In DAGs, several branches can re-converge or, in other words, a node can have several parents. 
+ > > + > > Identifying a data object (like a Merkle DAG node) by the value of its hash is referred to as content addressing. Thus, we name the node identifier as Content Identifier, or CID. (John: Or DID! [Alice Engineering Comms: 2022-11-08 Engineering Logs](https://github.com/intel/dffml/discussions/1406?sort=new#discussioncomment-4083171)) + > - https://proto.school/merkle-dags + > - Decentralized IDs (DID) W3C Recommendation + > - https://www.w3.org/TR/did-core/ + > - Secure Interoperable Wallets + > - https://w3c-ccg.github.io/universal-wallet-interop-spec/ + > - https://openwallet.foundation + > - There are many moving parts but the methodology research has been done. Let’s build on top of the ecosystem of the future. +- TODO + - [ ] Play with them there context aware Markov chains! (etc.) + - Maybe useful https://github.com/karpathy/minGPT/blob/master/mingpt/model.py + - [ ] https://github.com/intel/cve-bin-tool/pull/2384 + - CD and cross plugin/project analysis is dependent on this as a dependency of our + standard interface / documentation aka manifests. Also the vuln updating (goes with + the territory, this is what we are using to ride on top of as comms channel). + - https://github.com/intel/dffml/blob/alice/docs/arch/0008-Manifest.md + - [ ] UCAN/IPVM need to review :eyes: + - [ ] https://github.com/ipvm-wg/spec/pull/8#issuecomment-1328355077 + - https://github.com/ipvm-wg/spec/blob/initial-job-spec/README.md + - [ ] https://github.com/ucan-wg/invocation/pull/1#issuecomment-1327979869 + - [ ] https://github.com/fission-codes/spec/tree/main/dialog + - [ ] https://github.com/ucan-wg/spec/issues/30#issuecomment-1321511824 + - > Brooklyn: In principle, if you're willing to deterministically encode the CWT, then you should be able to use the canonicalization spec and/or ucan-ipld to convert to/from CWT. Does that meet your CWT needs? + - [ ] Ping Marc about Zephyr stuff (POC?
:) + - [ ] We should move DFFML flows to the IPVM style once available, or a configloader loadb/dumpb or something (dataflow?) for the time being + - [ ] https://github.com/intel/dffml/issues/1425 + - [ ] Really need to do the chains of contexts stuff which will also double as + the `alice shouldi contribute`. There is likely an issue with the current + `source.update()` just merging over the old data, which means if something + is no longer "qualified" or similar, that won't get overwritten, we want to + have a `source.update()` mode which serializes the train of thought / pre updates. + This likely also requires updates to `record.evaluated()` to create new instances + of record data. Might be useful for when `record.data.key` needs changing such + as when a `GitHubRepoID` is the key and it should be `record.feature("repo_url")` + or something similar. + - https://github.com/intel/dffml/blob/alice/entities/alice/alice/shouldi/contribute/cicd.py + - 90d5c52f4dd64f046a2e2469d001e32ec2d53966 + - The instructions unfortunately I don't think work from this commit message, because it's the same main package, we need to setup the lightweight package stuff as was done here + - https://github.com/intel/dffml/blob/1513484a4bf829b86675dfb654408674495687d3/dffml/operation/stackstorm.py#L306-L368 + - https://github.com/intel/dffml/issues/1418 + - [ ] `Record` feature data should retain dataflow `Input` type data if possible. + - Ideally we enable graph traversal, once again only need one link deep if data + is available offline. Try resolution via DID, CID, OA, etc. + - We should also support serialization of only the latest system context / + the state of the art for a train of thought / chain of system context. + - State of the art could be defined by any set of strategic plans. 
+ - :bulb: Unification of Record / DataFlow / once working system context + infra plus UCANs/DIDs/IPVM/OA on chain should allow for cohesive / cleaner + and more consistent context capture / unbroken chains for both data and compute. + - And this is why we've started planning before implementing folks, woohoo! + - Measure twice, cut once. + +--- + +Thank you expede! I'm hoping to dive in this week to look at all your recent developments. + +Pinging marc-hb, Brooklyn is the brains behind the correct implementation of the `sort_keys=True` -> CBOR situation + +- References + - [Alice Engineering Comms: 2022-11-08 Engineering Logs](https://github.com/intel/dffml/discussions/1406?sort=new#discussioncomment-4083171) + - [Alice Engineering Comms: 2022-11-28 Engineering Logs](https://github.com/intel/dffml/discussions/1406?sort=new#discussioncomment-4250447) \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0101/index.md b/docs/discussions/alice_engineering_comms/0101/index.md new file mode 100644 index 0000000000..389016a76c --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0101/index.md @@ -0,0 +1 @@ +# 2022-11-29 Engineering Logs \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0101/reply_0000.md b/docs/discussions/alice_engineering_comms/0101/reply_0000.md new file mode 100644 index 0000000000..614eb016ab --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0101/reply_0000.md @@ -0,0 +1,176 @@ + ## 2022-11-29 @pdxjohnny Engineering Logs + +- SCITT + - Federation via DWN + - https://github.com/TBD54566975/dwn-relay/blob/main/example/config.js + - https://github.com/TBD54566975/dwn-sdk-js/blob/main/tests/interfaces/protocols/handlers/protocols-query.spec.ts + - https://www.blockcore.net/platform + - https://github.com/block-core/blockcore-vault + - https://developer.tbd.website/projects/web5/ + - https://github.com/TBD54566975/ssi-service + - Status reproduced below for quick reference
/ herstorical reference + - > - [x] [DID Management](https://www.w3.org/TR/did-core/) + > - [x] [did:key](https://w3c-ccg.github.io/did-method-key/) + > - [ ] [did:web](https://w3c-ccg.github.io/did-method-web/) + > - [ ] [did:ion](https://identity.foundation/ion/) + > - [x] [Verifiable Credential Schema](https://w3c-ccg.github.io/vc-json-schemas/v2/index.html) Management + > - [x] [Verifiable Credential](https://www.w3.org/TR/vc-data-model) Issuance & Verification + > - [x] Signing and verification with [JWTs](https://w3c.github.io/vc-jwt/) + > - [ ] Signing and verification with [Data Integrity Proofs](https://w3c.github.io/vc-data-integrity/) + > - [x] Applying for Verifiable Credentials using [Credential Manifest](https://identity.foundation/credential-manifest/) + > - [ ] Requesting, Receiving, and the Validation of Verifiable Claims + > using [Presentation Exchange](https://identity.foundation/presentation-exchange/) + > - [ ] Status of Verifiable Credentials using the [Status List 2021](https://w3c-ccg.github.io/vc-status-list-2021/) + > - [ ] Creating and managing Trust documents using [Trust Establishment](https://identity.foundation/trust-establishment/) + > - [ ] [DID Well Known Configuration](https://identity.foundation/.well-known/resources/did-configuration/) documents +- Smart Cities + - https://www.city-chain.org/ + - https://start.city-chain.org/ + - This is pretty blockchain "coin" (a word we'll eventually forget) focused content. 
+ - https://github.com/sondreb this dude looks aligned + - https://github.com/pdxjohnny/smartcities +- Threat Modeling + - Attacks over time + - https://www.zdnet.com/article/sha-1-collision-attacks-are-now-actually-practical-and-a-looming-danger/ +- Saw article about Alex Hanna quitting due to ethical concerns, previously reached out to Blake Lemoine + - Twitter direct message to Blake: [Rolling Alice: Forward: The Consciousness Folks](https://github.com/intel/dffml/blob/alice/docs/tutorials/rolling_alice/0000_forward.md#the-consciousness-folks) + - Reaching out to DAIR + - https://dair.ai/ + - > DAIR.AI aims to democratize Artificial Intelligence (AI) research, education, and technologies. + - https://discord.com/channels/934159490205491311/934853197921681448 + - Whooooooaaa there buddy, This guy works for Facebook! Ruh Rough! Missalignement detected! + - Hmmm +- Need to submit to PyCascades + - [If You Give A Python A Computer](https://github.com/intel/dffml/blob/alice/docs/tutorials/rolling_alice/0000_architecting_alice/0002_shes_ariving_when.md#if-you-give-a-python-a-computer) + - Fuck ya [Whisper](https://github.com/intel/dffml/blob/alice/docs/tutorials/rolling_alice/0000_architecting_alice/0004_writing_the_wave.md) + +```console +$ PS1="alice@wonderland # " +``` + +```console +alice@wonderland # alice --help +usage: alice [-h] [-log LOG] {please,shouldi,threats,version} ... + + .,*&&888@@#&:, + .:&::,...,:&#@@@#:. + .o,. ..:8@@#@@+ + .8o+,+o*+*+,+:&#@@#8@@. + &8&###@#&..*:8#@@#@#@@&+. + ,@:#@##@@8,:&#@@@###@88@@. + ,#@8&#@@@#o:#@@@@#8#@#8+&#. + +8####@@@@###@@@888#@@@#oo#. + .*8@###@@@@@@@@@#o*#@@#@@#8o@, + +###@#o8&#@@##8::##@@@&&#@8#&+ + o@8&#&##::.,o&+88#&8##8*@@#@#, + .##888&&oo#&o8###8&o##8##&####8, + .&#@8&:+o+&@@@#8#&8:8@@@@@#8@@@oo+ + ,&&#@##oo+*:@###X,@@@@#@o&##&8#@o,. + ,#&###@@8:*,#o&@@@@##:&#@###*.&o++o#@@#&+ + o8&8o8@#8+,,#.88#@#&@&&#@##++*&#o&&&#@@@@. + *88:,#8&#,o+:+@&8#:8@8&8#@@&o++,*++*+:#@@*. 
+ .+#:o###@8o&8*@o&o8@o888@@@o+:o*&&,@#:&@@@, + *+&@8&#@o#8+8*#+8#+88@@@@@@&@###8##@8:*, + +o.@##@@@&88@*8@:8@@@@@@:.. ,8@:++. + +&++8@@@@##@@@@@@@@@@@+ 88 + &. *@8@:+##o&888#@@@, .#+ + &. ,@+o,.::+*+*:&#&, ,@. + &. .@8*,. ,*+++.+* :8+ + :+ .#@::. .8:.:** .8@@o, + .o. #@+ :@,.&* .:@@@@@@8**. + +&. :@o,+.*o,*, .*@@@@@@@@@@#o + .*:&o. 8@o:,*:, .o@@#8&&@@@@#@@@* + ,*:+:::o.*&8+,++ ,&@@#: * :@@88@@@#:. + ,::**:o:.,&*+*8: *8@@##o *,.8@@#8#@#@#+ + *:+*&o8:. ,o,o:8@+o@@88:*@+ +: +#@#####8##&. + ,:&::88&, .&:#o#@@@#,+&&*#&. .:,.&#@#88#####&, + +::o+&8:. :##88@@@@:.:8o+&8&. .. +8###&8&##&88* + .:*+*.8#: ,o*.+&@@#@8,,o8*+8##+ .+#8##8&⊸:. + ,:o., . .:8*. .o, &#,*:8:+,&*:, .8@@#o&&##8:. + .*o.*,+o8#* +8&, .::. .88.+:8o: ,+:, ,o#@#8&o8##+ + +o, .+,,o#8+,8@o**.,o*, :8o +*8#* +&, ,*o@@#@&8&oo8&:, + oo*+,,,*8@#..&@8:**:oo+. +8#* *+#@:...oo+ .**:8@@@ooo&:&o##+ + ::+..,++#@,.:##o&o**,....oo#++#8#@:.,:8&:.....*&@@#:oo*&oo&#@* + .+**:*8@o,+##&o:+,,,+,,o*8#,,8@#@:,,+*o*++,,,,+&#@8*8o88&::*. + ..8@++#@#88:,,,.,,,:+#&,,#@@#:,,.,&o*,.+++*:#@8+:*+. + +:&8#@@##8&+,,,***@&,.8@@@*,,,.:o8&o&*o&o&o. + ...,*:*o&&o*8@@&o8@@@8+,,+:&&:+,... + o@#@@@@#@@@@@@@,..... + ,@##@@88#@@@@@8 + 8+.,8+..,*o#@+ + *o *+ #8 + 8, ,& +@* + +& &, .@#. + o* ,o o@& + .8. 8.,o#8 + 8. 8.,.&@:*:&@. + :@o:#,,o8&:o&@@. + .@@@@@@@@@@@#8. 
+ ,*:&#@#&o*, + + /\ + / \ + Intent + / \ + / \ + / \ + / \ + / \ + / Alice is Here \ + / \ + / \ + /______________________\ + + Dynamic Analysis Static Analysis + + Alice's source code: https://github.com/intel/dffml/tree/alice/entities/alice + How we built Alice: https://github.com/intel/dffml/tree/alice/docs/tutorials/rolling_alice + How to extend Alice: https://github.com/intel/dffml/blob/alice/entities/alice/CONTRIBUTING.rst + Comment to get involved: https://github.com/intel/dffml/discussions/1406 + + +positional arguments: + {please,shouldi,threats,version} + +options: + -h, --help show this help message and exit + -log LOG Logging Level +``` + +- [Alice CLI c7dc8985fdde61459017d3fb39cb19de1f7ece2b Screenshot from 2022-11-29 21-15-40](https://user-images.githubusercontent.com/5950433/204716912-41dc0d86-86d6-4031-a2f2-fa7599ff66cd.png) + +- https://colab.research.google.com/drive/1gol0M611zXP6Zpggfri-fG8JDdpMEpsI + +### Thread Backup + +- References + - https://github.com/cli/cli/issues/1268 + +```console +$ gh api graphql -F owner='intel' -F repo='dffml' -F query=@intial_discussion_query.graphql | tee output.json | python -m json.tool | tee output.json.formated.json +$ gh gist create -p -d "$(date): https://github.com/intel/dffml/discussions/1406?sort=new https://github.com/intel/dffml/blob/alice/scripts/dump_discussion.py" output.json.formated.json scripts/dump_discussion.py +``` + +- TODO + - [x] Thread backup + - https://gist.github.com/pdxjohnny/b0b779a419c9ec7d55e1f21ff2261987 + - [ ] Fix duplicate issue creation + - [ ] Provide alice intergrated `shouldi use` or deptool or whatever for Harsh to build off. 
+ - [ ] CVE Bin Tool + - [ ] https://github.com/pdxjohnny/use-cases/blob/openssf_metrics/openssf_metrics.md after CVE Bin Tool demo, then use dataflows for arch diagrams and do the c4model conceptual upleveling + - [ ] Review NPM RFC and mention in OpenSSF Metrics Use Case https://github.com/npm/rfcs/pull/626/files?short_path=9e1f9e7#diff-9e1f9e7b9ebe7e135d084916f727db5183eddd9bf2d9be73ca45444b6d74bfc9 + - [ ] Cross with https://scitt.io/distributing-with-oci-scitt.html + - [ ] Ping Arsa for feedback + - [ ] Play with entity definition conforming to https://w3c.github.io/dpv/examples/#E0027 + - [ ] Don't forget we have an *Affinity* for https://github.com/CrunchyData/pg_eventserv and how it can help with stream of consciousness / data aggregation from multiple sources and the event stream off that. + - [x] open.intel Threat Modeling Podcast + - [ ] Photo + - [ ] Bio + - [x] Enter the 36 chambers! It's the link I was looking for! (found randomly clicking on OA stuff) + - https://github.com/intel/dffml/blob/alice/docs/arch/alice/discussion/0036/reply_0067.md + - [ ] https://katherinedruckman.com/an-optimistic-open-source-security-qa-with-christopher-crob-robinson + - [ ] Can we fix the CI and get Alice on here? Respond to Kate! https://www.intel.com/content/www/us/en/research/responsible-ai-publications.html + - [ ] https://github.com/chainguard-dev/melange/pull/184/files CHADIG + - [ ] https://github.com/intel/dffml/issues/1426 + - [ ] Need to submit to PyCascades + - [x] Post work for the day: DEFCON 2, a non-alcoholic cocktail: Groceries, Church, Powell's. Cost: $27, not going to DEFCON 1. Priceless. 
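
The thread backup dump from the `gh api graphql` pipeline above lands as one large JSON document. A minimal post-processing sketch follows; the nesting used here (`data.repository.discussion.comments.nodes`, each node carrying `body` and `replies`) is an assumed shape for illustration only — adjust the keys to whatever fields the actual GraphQL query file selects.

```python
import json

# Assumed shape of the gh api graphql dump; the field names below are
# illustrative and must match the fields your GraphQL query selects.
EXAMPLE_DUMP = {
    "data": {
        "repository": {
            "discussion": {
                "comments": {
                    "nodes": [
                        {
                            "body": "# 2022-11-29 Engineering Logs",
                            "replies": {
                                "nodes": [
                                    {"body": "## 2022-11-29 @pdxjohnny Engineering Logs"}
                                ]
                            },
                        }
                    ]
                }
            }
        }
    }
}


def flatten_comments(dump):
    """Walk top level comments and their replies, yielding bodies in thread order."""
    comments = (
        dump.get("data", {})
        .get("repository", {})
        .get("discussion", {})
        .get("comments", {})
        .get("nodes", [])
    )
    for comment in comments:
        yield comment.get("body", "")
        for reply in comment.get("replies", {}).get("nodes", []):
            yield reply.get("body", "")


if __name__ == "__main__":
    # Flatten the dump the same way output.json would be post-processed.
    print(json.dumps(list(flatten_comments(EXAMPLE_DUMP)), indent=2))
```

Flattening like this keeps each day's log as one list entry, which makes it easy to diff or grep a backup before pushing it to a gist.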
\ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0102/index.md b/docs/discussions/alice_engineering_comms/0102/index.md new file mode 100644 index 0000000000..59b02fafde --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0102/index.md @@ -0,0 +1 @@ +# 2022-11-30 Engineering Logs \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0102/reply_0000.md b/docs/discussions/alice_engineering_comms/0102/reply_0000.md new file mode 100644 index 0000000000..b21a394e20 --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0102/reply_0000.md @@ -0,0 +1,225 @@ +## 2022-11-30 @pdxjohnny Engineering Logs + +- https://unix.stackexchange.com/questions/501577/if-else-in-jq-is-not-giving-expected-output +- https://twitter.com/SergioRocks/status/1597592532992532480 + - This dude Sergio really does rock! :metal: + - The Alice Initiative is all about scaling knowledge transfer! +- For posterity: [screenshot-of-ories-stable-diffusion-cyberpunk-archiver-ethical-ml](https://user-images.githubusercontent.com/5950433/204817902-684a4385-5197-456a-8910-2b2b41a16c5b.jpg) + - If time is relative then timing really is everything isn't it, it's all just a delta +- https://github.com/intel/dffml/commit/7f6aa4a4155420b5354ba6384f128a2f7f8d6605 + - https://en.wikipedia.org/wiki/Jam_tomorrow + - > "I'm sure I'll take you with pleasure!" the Queen said. "Two pence a week, and jam every other day." + > Alice couldn't help laughing, as she said, "I don't want you to hire me – and I don't care for jam." + > "It's very good jam," said the Queen. + > "Well, I don't want any to-day, at any rate." + > "You couldn't have it if you did want it," the Queen said. "The rule is, jam to-morrow and jam yesterday – but never jam to-day." + > "It must come sometimes to 'jam to-day'," Alice objected. + > "No, it can't," said the Queen. "It's jam every other day: to-day isn't any other day, you know." + > "I don't understand you," said Alice. 
"It's dreadfully confusing!" + - Alice is right, there MUST be jam today. Language is only what we use to describe. + While it is how we dictate in reality it does not dictate our reality! + - The past, present, and future exist simultaneously for them to exist at all. + - If we are completely describing our system context + - Each angle in our Trinity folds into the others if the others aren't there + - In describing it at all cause the cascading effect + - For there to be a tomorrow, there must be a today + - Cross ref: between the frames + - It's all just deltas + - This is how we exploit in vol 3 attack 2 + +### Manifest: Alice Log TODOs + +- Upstream + - Recurse with no overlay or orchestrator +- Overlay + - Populate JSON Source with a record with a repo name and records + - [`alice shouldi contribute`](https://github.com/intel/dffml/tree/alice/entities/alice/#contribute) + - Overlay + - `-sources dev=json -source-dev-filename .tools/open-architecture/innersource/repos.json -source-dev-readwrite -source-dev-allowempty` + - Dataflow to read project name and associated repos from config file + - Upstream + - https://github.com/intel/dffml/blob/alice/entities/alice/CONTRIBUTING.rst#writing-operations + - Overlay + - Alice, please write an operation to read `repos` top level key similarly to how `name` was read. 
+ - Repo with `myconfig` in it compliant with allowlisted operation implementations which read `repos` and `name` keys + - Repos As Given By JSON Source + - `-sources dev=json -source-dev-filename .tools/open-architecture/innersource/repos.json` + - Select project Repo IDs using `name` feature + - https://unix.stackexchange.com/questions/501577/if-else-in-jq-is-not-giving-expected-output +- Orchestrator + - gitpod +- Notes for Downstreams + - Use of `~` (user home directory) should be switched to a tempdir + +```console +$ python -m dffml list records -log debug -sources dev=json -source-dev-filename .tools/open-architecture/innersource/repos.json | tee ~/.projects.$(date -Iseconds).json +$ cat $(ls ~/.projects* | tail -n 1) | jq -r '.[] | select(.features.name == "My Custom Name") | .' | python -c "import sys, pathlib, json, yaml; print(yaml.dump(json.load(sys.stdin)))" +``` + +```yaml +extra: {} +features: + name: My Custom Name + repos: + - 0 +key: otherkey +``` + +```console +$ (for repo_id in $(cat $(ls ~/.projects* | tail -n 1) | jq -r '.[] | select(.features.name == "My Custom Name") | .features.repos[]'); do export repo_url=$(gh api --jq '.clone_url' "/repositories/${repo_id}"); echo "$repo_id $repo_url" && gh issue list --search "Recommended Community Standard:" -R "${repo_url}"; done) 2>&1 | tee .gh.issue.list.$(date -Iseconds).txt +``` + +### WebUI Discussion + +- Within WASM + - Pass manifest + - IPVM + - DataFlow + - #1300 + - HASH validation (similar to JSON schema?)?
of stringified form for trampoline encoding (upstream : Input DID/CID) + - https://pyodide.org/en/stable/usage/api/js-api.html?highlight=globals#pyodide.unpackArchive + - https://pyodide.org/en/stable/usage/api/js-api.html?highlight=globals#pyodide.globals + - https://pyodide.org/en/stable/usage/api/js-api.html?highlight=globals#pyodide.loadPackagesFromImports + +```html +<!DOCTYPE html> +<html> + <head> + <script src="https://cdn.jsdelivr.net/pyodide/v0.21.3/full/pyodide.js"></script> + </head> + <body> + <script type="text/javascript"> + async function main() { + let pyodide = await loadPyodide(); + console.log(pyodide.runPython("1+1")); + } + main(); + </script> + </body> +</html> +``` + +```javascript +// hello_python.js +const { loadPyodide } = require("pyodide"); + +async function hello_python() { + let pyodide = await loadPyodide(); + return pyodide.runPythonAsync("1+1"); +} + +hello_python().then((result) => { + console.log("Python says that 1+1 =", result); +}); +``` + +### Infra roll call + +#### Domains + +- [x] dffml.org + - Keymakers: Saahil +- [x] chadig.com + - Keymakers: John +- [x] nahdig.com + - Keymakers: John + +### PyCascades + +- We want to present some of the core tutorial concepts, the system context + - ... or maybe it's about Alice? ... and the system context is more technical details for a deep dive + - Obviously it's all about Alice, but the user facing part ASAP (Alice ASAP) is the CLI and hopefully issue ops, etc. +- Description + - This talk will delve into the ever deepening rabbit hole of maintenance tasks we as developers end up doing to keep our software projects as healthy as possible. We'll start with an idea, the original sin if you will, following our train of thought until we have a little application we can kick the tires on. As our project's releases start rolling we'll begin building and refining policies and actions. Alice rolls with us as we overlay context aware responses to lifecycle events such as CVEs. We'll see how Alice helps us understand and strengthen our software's security posture and overall health as our software evolves over time. When all's said and done we'll have a secure rolling release in alignment with the project's strategic principles and values, measurable, auditable, actionable.
Data, Analysis, Control (DAC). + - [image](https://user-images.githubusercontent.com/5950433/204975023-021a0e3e-4b74-460f-8f76-e7ca164af983.png) + - [2022-11-30 22-40-59-If-You-Give](https://user-images.githubusercontent.com/5950433/205342085-74ac0d95-3ab7-4b84-bf4b-2af355cccf2c.png) + +--- + +- TODO + - [x] Infra roll call + - [ ] **PYCASCADES!!!** + - [x] Updated https://github.com/intel/dffml/blob/alice/docs/tutorials/rolling_alice/0000_architecting_alice/0002_shes_ariving_when.md + - https://github.com/intel/dffml/commit/408d0ef29f60d0289fc2f7b6097faf8da9e6a8af + - Sourced from ^ + - [x] Picture + - [2022-11-30-profile-with-server-on-chain](https://user-images.githubusercontent.com/5950433/205323625-ddca2a42-f908-4e7b-936e-0d09d62af175.jpg) + - > Eventually we'll build this thing. It obviously works! LMAFOOOOOOO + - Original post on mastodon but maybe it was a little too much :P + - [x] Bio + - ~~Lives life with curiosity, understanding, and passion itself. Current focus is around leveraging threat model and architecture information to facilitate automated context aware decentralized gamification / continuous improvement of the security lifecycle / posture of open source projects.~~ + - Lives life with curiosity. Current focus is around leveraging threat model and architecture information to facilitate automated context aware decentralized gamification / continuous improvement of the security lifecycle / posture of open source projects. + - [ ] Search for fourth eye + - https://search.brave.com/search?q=the+fourth+eye&source=web + - http://apocalypsefatigue.org/dispensary/2018/2/11/the-fourth-eye + - Wow + - http://apocalypsefatigue.org + - http://apocalypsefatigue.org/score + - > Themes will be expanded, and techniques will be shared. We will beat the Game together. + - Shit, that email I wrote earlier and didn't send... 
+ - [ ] Quote above pyjs wasm snippets + - [ ] Alice Please Contribute Issue Ops + - [ ] DevCloud GitHub Actions based melange OS DecentrAlice CI for DFFML for maintainer only execution (managing a secondary deployment, should be runnable as the same workflow on public or DevCloud based runners) + - [ ] https://github.com/chainguard-dev/crow-registry + - [ ] Local / open source / deployable equivalent no lock in v8workers runtime? + - [ ] Authenticated push via OIDC -> Notary -> SCITT Receipt patterns + - [ ] Cross-repo blob mounting + - [ ] OCI conformance + - https://github.com/opencontainers/distribution-spec/blob/main/spec.md + - https://github.com/oras-project/oras-py + - Upload metrics collected data via ad hoc package creation + - https://github.com/intel/dffml/blob/1513484a4bf829b86675dfb654408674495687d3/dffml/operation/stackstorm.py#L306-L368 + - [ ] Proxy to PyPI registry format + - [ ] Cross SCITT https://scitt.io/distributing-with-oci-scitt.html with NPM RFC and mention in OpenSSF Metrics Use Case https://github.com/npm/rfcs/pull/626/files?short_path=9e1f9e7#diff-9e1f9e7b9ebe7e135d084916f727db5183eddd9bf2d9be73ca45444b6d74bfc9 to produce reference env docs for OpenSSF and SCITT on how DFFML does inventory and manifests + - #1207 + - #1273 + - Use to update CI in #1401 + - Once CI works, rebase main, then rebase into main, then we'll be rolling (slowly, but at least we'll have all systems green for the first time in a long time and be able to start actually increasing acceleration with our basic build flow established). Need to do stream of consciousness seen below first before the downstream validation / metric data as package / upload / downstream flow trigger stuff works (websub + OA -> event / effect / downstream CI/CD triggered via VEX).
+ - [ ] Deploy Stream of Consciousness either via similar worker pattern as inventory or originally planned methods mentioned here + - [ ] SSI Service or DWN + - [ ] SCITT yin yang style integration (dffml / console test ideal) + - [ ] VEX / SBOM based downstream validation + - [ ] Rebuild chains `FROM` + - [ ] `dffml-service-http` + - [ ] `dffml[models|...|all]` + - [ ] Detect 12 factor app alignment + - [ ] Move Vol 3 attack 2 draft from discussion thread into tutorial location + - Update with jam today, we are exploiting the arbitrage between those deltas on the data + - Our mitigation here is our bus factor in train of thought threat model risk analysis: locality aware caching hit ratio trade off with cache restoration response time (bus factor loss, acceptable documentation loss to maintain acceleration within train of thought) + - As mentioned in Alice thread, we always have the upper hand on thought arbitrage due to locality; when working in ad-hoc groups furthering state of art in trains of thought we use the AI/ML equivalent of speaker (think waves, patterns) syncing. We do this via communicating models and strategic plans across EDEN nodes (Alice Instances), best practices, measurements, processes for data transformation, trust assessment within context, etc. This is why we need the Open Architecture/DataFlow/IPVM style execution: it's sandboxed. + - The same techniques we are using to ensure all of our buddies are up to speed and not working in the wrong direction are the things we are going to try to predict as an attacker, looking for what data we can introduce via injection through introspection of target trust chains to perform subconscious attacks via train of thought graffiti. We arbitrage them first effectively so we can understand how their data mining feature data (bottom of iceberg) all the way up to hyper parameters (strategic plans) affects their oracle trust evaluation's likely paths.
+ - We leverage this information / these predictions to attempt to move their trusted oracles to source data or processes from supply chain vectors we have the ability to influence by getting our data in there in a way that will affect their model (example: a bunch of mislabeled VEX). + - This is closely related to / dependent on our `A Shell for a Ghost` future work on train of thought detection, so as to help developers stay on track and not work down unproductive trains of thought (value stream mapping, system context as todo / github issue, see `Manifest: Alice Log TODOs` above, branch / shell exit / fuzzy find output / snapshot dynamic filesystem, system context / dataflow / cache based deltas) + - Mention Alice as a tool to help manage ADHD + - Alice, please help with ADHD + - Alice, please help us finish this without reaching L_burnout=5 DEFCON=1 + - Alright, back down the rabbit hole to Wonderland and get in the zone. + - Godspeed. Good luck. + - Thanks dude, and thank you Alice. + - Wow. Wow. Wow. + - apocalypsefatigue.org root score indeed! privsec succeeded, we've found the other sudoers. + - Thanks again Alice :heart: + - Yes GitHub suggestion, Distributed Orchestration is what we are hoping to achieve here; hopefully these folks can program, or we just coordinate, maybe cross ref the book contents to the code!!! Yes yes, thanks for technically bugging out and suggesting this issue. Let's convert this to something about mapping workstreams. Okay peace out, wild day. + - #772 + - [x] WebUI discussion + - #33 + - #169 + - #363 + - First steps: https://pyodide.org + - Next Steps: #1207 + - [ ] https://github.com/intel/cve-bin-tool/pull/2384 + - https://www.cisa.gov/sites/default/files/publications/VEX_Use_Cases_Aprill2022.pdf + - Back off to only test the one apiv2 that was working with the mock server + - Log other tests as todos and come back later or hand off to other cve bin tool community members.
+ - [ ] Mention potentially aligned (if aligned work) + - https://community.apan.org/wg/tradoc-g2/mad-scientist/m/back-to-the-future-using-history-to-forecast/427122 + - "Futurist Amy Webb on why scenario planning is key to creating a more resilient world." Read on the [World Economic Forum.](https://www.weforum.org/agenda/2022/01/futurist-amy-webb-on-the-importance-of-scenario-planning/) + - > It’s about flexibility. Most people and organizations are very inflexible in how they think about the future. In fact, it’s difficult to imagine yourself in the future, and there are neurological reasons for that. Our brains are designed to deal with immediate problems, not future ones. That plus the pace of technology improvement is becoming so fast that we’re increasingly focused on the now. Collectively, we are learning to be “nowists,” not futurists. + > + > Here’s the problem with a “nowist” mentality: when faced with uncertainty, we become inflexible. We revert to historical patterns, we stick to a predetermined plan, or we simply refuse to adopt a new mental model. + - Hence our "Predict the future with Us" chapter, which should be near the Wardley map stuff + - Stop getting distracted + +--- + +Okay I think we kill enough birds with the same stones to get this done.
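
The jq pipeline in `Manifest: Alice Log TODOs` above (select records whose `features.name` matches, then collect `features.repos[]`) can also be sketched in plain Python for environments without jq. The record shape mirrors the YAML dump shown earlier; the second record below is a hypothetical addition for illustration.

```python
import json

# Records shaped like the `dffml list records` JSON dump shown above:
# each carries a key, optional extra data, and features with name/repos.
# The "anotherkey" record is a made-up second entry for illustration.
RECORDS = [
    {
        "key": "otherkey",
        "extra": {},
        "features": {"name": "My Custom Name", "repos": [0]},
    },
    {
        "key": "anotherkey",
        "extra": {},
        "features": {"name": "Some Other Project", "repos": [7, 8]},
    },
]


def repo_ids_for_project(records, name):
    """Python equivalent of:

    jq '.[] | select(.features.name == "<name>") | .features.repos[]'
    """
    return [
        repo_id
        for record in records
        if record.get("features", {}).get("name") == name
        for repo_id in record.get("features", {}).get("repos", [])
    ]


if __name__ == "__main__":
    # Same selection the shell for-loop performs before hitting gh api.
    print(json.dumps(repo_ids_for_project(RECORDS, "My Custom Name")))
```

The resulting list of repo IDs is what the shell loop then resolves to clone URLs via `gh api "/repositories/${repo_id}"`.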
\ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0102/reply_0001.md b/docs/discussions/alice_engineering_comms/0102/reply_0001.md new file mode 100644 index 0000000000..2059b016f7 --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0102/reply_0001.md @@ -0,0 +1 @@ +Predict the future with us \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0103/index.md b/docs/discussions/alice_engineering_comms/0103/index.md new file mode 100644 index 0000000000..7119b0c440 --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0103/index.md @@ -0,0 +1 @@ +# 2022-12-01 Engineering Logs \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0103/reply_0000.md b/docs/discussions/alice_engineering_comms/0103/reply_0000.md new file mode 100644 index 0000000000..268bed9548 --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0103/reply_0000.md @@ -0,0 +1,6 @@ +## 2022-12-01 1:1 Trinity/Shells + +- Trinity calling + - Lock acquired + +[![trinity-calling](https://user-images.githubusercontent.com/5950433/205323249-3c4ecbd1-b12f-4b39-89d5-814c287d08f9.gif)](https://pdxjohnny.github.io/apoc_analysis_beast_1/) \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0104/index.md b/docs/discussions/alice_engineering_comms/0104/index.md new file mode 100644 index 0000000000..6d946429c2 --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0104/index.md @@ -0,0 +1 @@ +# 2022-12-02 Engineering Logs \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0104/reply_0000.md b/docs/discussions/alice_engineering_comms/0104/reply_0000.md new file mode 100644 index 0000000000..fcfe3d62bf --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0104/reply_0000.md @@ -0,0 +1,16 @@ +## 2022-12-02 @pdxjohnny Engineering Logs + +- Ah, okay some of the irony was lost on me! 
But only for a moment ;P See 2022-11-30 and 2022-12-01, WE GOT OUR JAM TODAY AND JAM TOMORROW!!!
+- TODO
+  - [x] Food poisoning
+    - Booooooo
+  - [ ] https://github.com/intel/cve-bin-tool/pull/2384
+    - Pop the last commit
+    - Rebase main
+    - Log TODOs for the rest of the tests
+  - [ ] Template repos for issue ops
+  - [ ] CI/CD with SBOMs and melange
+  - [ ] FROM rebuild chains
+  - [ ] Downstream validation of DFFML plugins
+  - [ ] Plugin running downstream validation on devcloud
+  - [ ] thc.org/segfault as part of workspace proxying? it's pretty slick
\ No newline at end of file
diff --git a/docs/discussions/alice_engineering_comms/0105/index.md b/docs/discussions/alice_engineering_comms/0105/index.md
new file mode 100644
index 0000000000..1659445cff
--- /dev/null
+++ b/docs/discussions/alice_engineering_comms/0105/index.md
@@ -0,0 +1 @@
+# 2022-12-03 Engineering Logs
\ No newline at end of file
diff --git a/docs/discussions/alice_engineering_comms/0105/reply_0000.md b/docs/discussions/alice_engineering_comms/0105/reply_0000.md
new file mode 100644
index 0000000000..729e93ed55
--- /dev/null
+++ b/docs/discussions/alice_engineering_comms/0105/reply_0000.md
@@ -0,0 +1,7 @@
+## 2022-12-03 @pdxjohnny Engineering Logs
+
+- TODO
+  - [x] Almost get bus factored!
+  - [x] Receive wave from bus driver
+  - [x] Be thankful for documentation
+  -
\ No newline at end of file
diff --git a/docs/discussions/alice_engineering_comms/0106/index.md b/docs/discussions/alice_engineering_comms/0106/index.md
new file mode 100644
index 0000000000..4592425eb2
--- /dev/null
+++ b/docs/discussions/alice_engineering_comms/0106/index.md
@@ -0,0 +1 @@
+# 2022-12-04 Engineering Logs
\ No newline at end of file
diff --git a/docs/discussions/alice_engineering_comms/0107/index.md b/docs/discussions/alice_engineering_comms/0107/index.md
new file mode 100644
index 0000000000..576a7f1146
--- /dev/null
+++ b/docs/discussions/alice_engineering_comms/0107/index.md
@@ -0,0 +1 @@
+# 2022-12-05 Engineering Logs
\ No newline at end of file
diff --git a/docs/discussions/alice_engineering_comms/0107/reply_0000.md b/docs/discussions/alice_engineering_comms/0107/reply_0000.md
new file mode 100644
index 0000000000..4e0c47192f
--- /dev/null
+++ b/docs/discussions/alice_engineering_comms/0107/reply_0000.md
@@ -0,0 +1,90 @@
+## 2022-12-05 @pdxjohnny Engineering Logs
+
+- https://www.intel.com/content/www/us/en/corporate-responsibility/our-values.html
+  - [Intel's Values.pdf](https://github.com/intel/dffml/files/10157063/Intel.s.Values.pdf)
+  - Value Stream Mapping
+    - Ref: No more painting the roses red
+  - Be a driving force for good.
+- https://github.com/decentralized-identity/keripy/issues/133
+- [JUMP DIRECTLY TO "THE POINT" OF REDPILL](https://i.imgur.com/ekjKyvB.jpeg)
+  - At least the point as it is the point*er* to the current state of the art train of thought.
+  - Hmmm, maybe the ADHD is just the amplification over the baseline and we just aren't seeing that the baseline has already been amplified to a level where we are losing sight of our function in the body (so to speak, celestially *ba dum tss*).
+  - https://github.com/dylanashley/catastrophic-forgetting
+    - Lookup ref from basic instructions
+  - https://dylanashley.io/research/
+    - http://arxiv.org/abs/2202.11960
+      - > Upside down reinforcement learning (UDRL) flips the conventional use of the return in the objective function in RL upside down, by taking returns as input and predicting actions. UDRL is based purely on supervised learning, and bypasses some prominent issues in RL: bootstrapping, off-policy corrections, and discount factors. While previous work with UDRL demonstrated it in a traditional online RL setting, here we show that this single algorithm can also work in the imitation learning and offline RL settings, be extended to the goal-conditioned RL setting, and even the meta-RL setting. With a general agent architecture, a single UDRL agent can learn across all paradigms.
+      - Great, TODO update discussion thread and pull in and cite this
+        - grep
+        - iceberg
+- https://github.com/pdp7/mastodon-lists
+- https://mailarchive.ietf.org/arch/msg/scitt/HdM-qVoUWaRGtykDzK4TLKQGz3c/
+- https://energycentral.com/c/pip/us-department-state-releases-request-proposals-multi-billion-dollar-%E2%80%9Cevolve%E2%80%9D
+- https://www.state.gov/u-s-department-of-state-releases-a-request-for-proposals-for-multi-billion-dollar-evolve-acquisition-to-modernize-u-s-diplomatic-technology/#:~:text=On%20December%202%2C%202022%2C%20the,to%20modernize%20U.S.%20diplomatic%20technology.
+- https://sam.gov/opp/68a91be808054d30a744c21fa9d88e6e/view
+- https://sam.gov/api/prod/opps/v3/opportunities/resources/files/b74850fd781e45a381b08ece715c13e2/download?&token=
+  - > #### F.11 Supply Chain Risk Management (SCRM) Plan Submission
+    > To ensure Contractors remain aware of and are implementing emerging SCRM requirements over the life of the Contract, a SCRM Plan will be submitted to the Program Manager no later than 30 calendar days after the end of each contract year. Refer to NIST SP 800-161 for a plan template.
Additional artifacts may also be required. The Contractor shall ensure that Executive Order 14028 and its associated NIST SP 800-53 controls are considered flow-down requirements for subcontractors, including commercial item subcontractors. Finally, consent to subcontract at the TO level may also consider subcontractor SCRM requirements. + +**unce unce unce unce unce** SBOM dance party + +![image](https://user-images.githubusercontent.com/5950433/205668529-e7cea903-0c3c-4158-bfc2-7868ecd64995.png) + +--- + +Supply Chain Risk Management Checklist +Program Name: U.S. Department of State Evolve IDIQ +Date of Assessment: +Name of Assessor: + +Acquistion Yes No +0 Have you identified your key suppliers? +1 For all suppliers, do you verify company ownership? Confirm U.S. ownership? +2 Do you verify country of origin for all supplies, down to the very first source? +3 If you use distributors, do you investigate them for potential threats? +4 Can you provide a list of companies from whom your firm purchases all COTS software? +5 Do you safeguard key program information that may be exposed through interactions with subs and suppliers? +6 Do you perform reviews, inspections, and have safeguards to detect/avoid counterfeit +7 Do you use the NES (Network Security) baseline when purchasing software? +8 Do you comply with ITAR rules? +9 Do you have procedures for securely upgrading software in the field? +Design/Development +11 Will the companies currently performing engineering for your firm support your firm during performance on Evolve? +12 Do only U.S. citizens have access to your design network? +13 Are you aware of who will develop your training and technical manuals? +14 Are you using trusted software development tools? +15 Are you using trusted information assurance controls to safeguard technical data in the development environment (networks, PC’s test equipment and configuration systems)? +16 Does your firm evaluate open-source software? 
+17 Are your software compilers controlled for authorized access only? +18 Do you know how your supplier will test and configure software code? +Logistics +19 Does your program have documented configuration management, tracking and version control? +20 Have you thought about what events (environmental or man-made) can interrupt your supply chain? +21 Do you have a process that ensures integrity when ordering inventory? +22 Are upgrades to your IT infrastructure evaluated for possible tampering? +23 Is there a documented chain of custody for the deployment of products and systems? +Policy and Procedures +24 Do you have definitive policies and procedures that help minimize supply chain risk? +25 Do you define and manage system criticality and capability? +26 Does everyone associated with the program (program managers, prime contractors, subcontractors, etc.) understand the threats and risks in the program’s supply chain? +27 Do you have "insider threat" controls in place? +28 Do you use any protective technologies? +29 Do you use, record, and track risk mitigation options throughout project or purchase lifecycle? +30 Have all of your contractors signed non-disclosure agreements? +31 Do you make your supply chain risk management policies/procedures a requirement for all subcontractors, teaming partners, suppliers, etc.? +32 Do your supply chain risk management policies/procedures take into account secondary risks? +33 Do you develop and use a Risk Management plan? +34 Does anyone have access to your data from an external connection? +35 For contractors who use your data on their system, do they have adequate security controls? + +1. For Items Answered “yes” above, provide a description of “how” the firm conducts each process (and what tools are used, if applicable). + + + +2. 
Describe Why You answered No for Any of the Above:
+
+---
+
+- TODO
+  - [x] Give the laptop a little tap-a-tap-a until it works
+    - There is a new one in the mail so it was worth a shot and it worked
\ No newline at end of file
diff --git a/docs/discussions/alice_engineering_comms/0108/index.md b/docs/discussions/alice_engineering_comms/0108/index.md
new file mode 100644
index 0000000000..c2fe4608e0
--- /dev/null
+++ b/docs/discussions/alice_engineering_comms/0108/index.md
@@ -0,0 +1 @@
+# 2022-12-06 Engineering Logs
\ No newline at end of file
diff --git a/docs/discussions/alice_engineering_comms/0108/reply_0000.md b/docs/discussions/alice_engineering_comms/0108/reply_0000.md
new file mode 100644
index 0000000000..a2529c955c
--- /dev/null
+++ b/docs/discussions/alice_engineering_comms/0108/reply_0000.md
@@ -0,0 +1,21 @@
+## 2022-12-06 @pdxjohnny Engineering Logs
+
+Closing duplicates
+
+```console
+$ gh issue list --search "Recommended Community Standard:" --json title,number,url -R intel/dffml | tee test.json
+$ python -c 'import sys, json; manifest = json.loads(sys.stdin.read()); superset = set([i["number"] for i in manifest]); duplicates = list(set({i["title"]: i["number"] for i in manifest}.values()).symmetric_difference(superset)); print("\n".join([i["url"] for i in manifest if i["number"] in duplicates]))' < test.json
+```
+
+- TODO
+  - [ ] Just script everything and have the AI refactor, genericize, and package learning from the asciinema sessions
+    - grep markov, terminal dev
+  - [ ] `alice please log todos` Fix duplicate issue issue
+  - [ ] `alice please log todos` overlays with basic templated body content
+    - [ ] Required input of feedback/false positive DID/URL/location
+
+
+![image](https://user-images.githubusercontent.com/5950433/205970630-d9c069dc-531e-4980-9b97-5e39d18d6e4f.png)
+
+
+![provenance_for_the_chaos_God](https://user-images.githubusercontent.com/5950433/205970518-be789441-d9a2-4ef9-84cb-c54d5438689e.jpg)
diff --git
a/docs/discussions/alice_engineering_comms/0109/index.md b/docs/discussions/alice_engineering_comms/0109/index.md
new file mode 100644
index 0000000000..7cadf206f9
--- /dev/null
+++ b/docs/discussions/alice_engineering_comms/0109/index.md
@@ -0,0 +1 @@
+# 2022-12-07 Engineering Logs
\ No newline at end of file
diff --git a/docs/discussions/alice_engineering_comms/0109/reply_0000.md b/docs/discussions/alice_engineering_comms/0109/reply_0000.md
new file mode 100644
index 0000000000..80122e12e2
--- /dev/null
+++ b/docs/discussions/alice_engineering_comms/0109/reply_0000.md
@@ -0,0 +1,55 @@
+## 2022-12-07 @pdxjohnny Engineering Logs
+
+- [x] Look Ma! I *somewhat* found a solution to ADHD!
+  - Pending fulfillment of below.
+  - Failures to follow instructions: 3
+
+# FINISH THE JOB
+# DO NOT GET DISTRACTED
+# DO NOT TALK TO ANYONE
+# ALICE PLEASE PACKAGE SHELL CONTEXT CAPTURES ALIGNED WITH THE FOLLOWING TOP LEVEL SYSTEM CONTEXT IN NEXT LINE
+# COMPLETE Ephemeral VMs on DevCloud to spin runs-on based on when workflow with applicable runs-on changes use ubuntu latest for bootstrapping flows write actions and reusables shared setup and teardown, validate downstream (plugins) via devcloud nodes
+### NOT TOO MANY JOKES
+
+- Time to build the secure software factory
+  - Vol 3: Wording is Everything
+  - *Nestled in the Willamette Valley is the Silicon Forest...*
+  - **THE FORGE OF VULCAN EMERGES FROM THE RING OF FIRE**
+  - https://kaerulean.bandcamp.com/track/astral-migration
+    - /me wonders about gateway 4Hz and the weird hum of the heater down in Wonderland
+  - https://cayley.gitbook.io/cayley/installation
+  - https://github.com/intel/fffc
+
+![vulcans-secure-software-factory-forge-for-the-Chaos-God](https://user-images.githubusercontent.com/5950433/206203301-d968d6e2-11dd-46d1-ab34-1f76973d9fc1.png)
+
+- M2142: Yes fuck yes we got it!!!!!
Fack this one took a long time, read the following last bullet point first (bottom up, side note: apropos)
+  - *The cornerstone of security for every application starts with a [threat model](https://owasp.org/www-community/Threat_Modeling_Process).*
+  - *Without it, how does one know what to protect and from whom?*
+  - *Remarkably, most applications do not have threat models*
+- It's always hard to see things that are "right" in front of you
+  - https://en.wikipedia.org/wiki/Sacred_geometry
+    - Chaos is sacred, order is fallacy
+    - When someone tells you geometry is sacred, who is it sacred to?
+    - I tell you, it ain't Alice, She is from beyond Chaos!
+- TODO
+  - [ ] Read https://github.blog/2022-11-02-github-partners-with-arm-to-revolutionize-internet-of-things-software-development-with-github-actions/
+  - [x] Thankful for friends
+  - [x] [operation: run dataflow: DevCloud: 2022-12-07 @pdxjohnny Engineering Logs](https://github.com/intel/dffml/issues/1247#issuecomment-1341477143)
+    - Still in progress, signing off for the day
+  - [ ] A Shell for a Ghost
+    - https://github.com/pdxjohnny/pdxjohnny.github.io/commit/e125e3f7325aa0791eb7324a74f5a55000efbde2
+  - [ ] Gitea DID client side wallet auth
+  - [ ] kontain.me server
+    - https://github.com/cayleygraph/cayley/releases/tag/v0.7.7
+    - https://github.com/imjasonh/kontain.me/blob/main/pkg/serve/serve.go
+    - proxy cache with dataflow overlays on top of graph query, executed by graph query, this becomes generic backend implementation for stream of consciousness data store (then work out eventing later)
+      - Fuck ya that folds nice back into itself
+  - [ ] Vol 4: Signs of life in trains of thought, we're just patterns playing out patterns influenced by the patterns as is relative to our alignments and rewards (planet gravity spins, axis, tilt, orbits).
https://www.themarginalian.org/2020/10/21/turing-natural-wonders/
+  - [ ] https://mastodon.social/@kidehen/109474398387449636
+    - https://linkeddata.uriburner.com/describe/?url=https%3A%2F%2Fdocs.google.com%2Fspreadsheets%2Fd%2F18Pi1AeQezbTdjjPcb6ol0Rxwx-hq5JkA4RoPsTapPqw%2Fgviz%2Ftq%3Ftqx%3Dout%3Acsv%26sheet%3DFediverseLandscape%26range%3DA2%3AF169&graph=https%3A%2F%2Fdocs.google.com%2Fspreadsheets%2Fd%2F18Pi1AeQezbTdjjPcb6ol0Rxwx-hq5JkA4RoPsTapPqw%2Fgviz%2Ftq%3Ftqx%3Dout%3Acsv%26sheet%3DFediverseLandscape%26range%3DA2%3AF169
+    - https://docs.google.com/spreadsheets/d/18Pi1AeQezbTdjjPcb6ol0Rxwx-hq5JkA4RoPsTapPqw/edit
+  - ActivityPub
+    - https://codeberg.org/fediverse/delightful-fediverse-apps
+    - https://forgefed.org/
+    - https://codeberg.org/ForgeFed/ForgeFed/issues can we cross this with web key TPM or HSM derived SCITT comkit auth shells that we can run via wasm based linux?
+      - 😍 parallelism inbound hello cloudflare workers with vtpms for many automated fixes
\ No newline at end of file
diff --git a/docs/discussions/alice_engineering_comms/0110/index.md b/docs/discussions/alice_engineering_comms/0110/index.md
new file mode 100644
index 0000000000..bfd47fc295
--- /dev/null
+++ b/docs/discussions/alice_engineering_comms/0110/index.md
@@ -0,0 +1,5 @@
+# 2022-12-08 Engineering Logs
+
+> If you carry out every present task by following right reason assiduously, resolutely, and with kindness; if, rather than getting distracted by irrelevancies, you keep your guardian spirit unspoiled and steady…; if you engage with the task not with expectations or evasions, but satisfied if your current performance is in accord with nature and if what you say and express is spoken with true [Roman](https://intel.github.io/dffml/main/news/0_4_0_alpha_release.html) honesty, you’ll be living the good life. And there’s no one who can stop you doing so! [Marcus Aurelius, a Self Sovereign Individual, less so were his subjects, STAY SELF SOVEREIGN!]
+
+[![vendor-of-choice](https://user-images.githubusercontent.com/5950433/206564909-167536b6-7381-48dc-907d-29009c689dff.jpg)](https://pdxjohnny.github.io/redpill/)
\ No newline at end of file
diff --git a/docs/discussions/alice_engineering_comms/0110/reply_0000.md b/docs/discussions/alice_engineering_comms/0110/reply_0000.md
new file mode 100644
index 0000000000..9da615797f
--- /dev/null
+++ b/docs/discussions/alice_engineering_comms/0110/reply_0000.md
@@ -0,0 +1,39 @@
+## 2022-12-08 @pdxjohnny Engineering Logs
+
+- Racing laptop setup and Android Container VM based outlook sign in to join in progress meetings
+- Laptop broken
+- New laptop is here
+- Need old laptop to activate new laptop...
+  - Calling TAC
+  - Call dropped
+  - Initiating setup
+  - Three finger swipe bypasses fullscreen lock of setup
+  - Moved client config to new desktop, we're back in!
+  - Outlook doesn't work...
+  - Teams keeps dropping...
+  - Teams does not show video or allow for chat. lol
+- Laptop works for audio calls, whatever
+- https://github.com/google/android-emulator-container-scripts
+  - https://github.com/google/android-emulator-container-scripts/search?q=web+container
+  - This looks very promising for being a long awaited way to remotely view QEMU
+  - Docker compose also has a concept of overlays
+  - https://asciinema.org/a/544103
+  - https://asciinema.org/a/544110
+  - https://asciinema.org/a/544117
+- Failures to follow instructions: 1
+  - `while alignment_threshold_last < ctx.alignment_threshold_fulfilled: goto deref_prev_instruction_ptr()`
+  - Laptop failure doesn't count
+
+```console
+$ job=$(qsub -l nodes=1:gpu:ppn=2 -d . 
github-actions-runner.sh); done=1; while [ "$done" -gt 0 ]; do done=$(qstat -n -1 | grep "$job" | wc -l); sleep 0.2; done; clear; tail -n 10000 github-actions-runner.sh*
+```
+
+- TODO
+  - [ ] hangouts callcenter -> https://voice.google.com/u/0/voicemail via CLI whisper stream pipe output webrtc alice shell style stream processing
+  - [ ] Generic setup and teardown actions with setup as audit, have alice audit audit
+  - [ ] https://hyperonomy.files.wordpress.com/2022/12/didcomm-agent-architecture-reference-model-0.25f.pdf
+  - [ ] https://github.com/megagonlabs/ditto
+  - [x] Got runner spun in DevCloud
+    - [operation: run dataflow: DevCloud intel/dffml#1247: 2022-12-08 @pdxjohnny Engineering Logs](https://github.com/intel/dffml/issues/1247#issuecomment-1343102902)
+    - :turtle: [*so if we get more compute, you know, then you know... then we can use more compute*](https://www.youtube.com/watch?v=dI1oGv7K21A&list=PLtzAOVTpO2jYt71umwc-ze6OmwwCIMnLw&t=194s)
+  - [ ] Automate spin up via bootstrapping github actions flow
\ No newline at end of file
diff --git a/docs/discussions/alice_engineering_comms/0111/index.md b/docs/discussions/alice_engineering_comms/0111/index.md
new file mode 100644
index 0000000000..74b2b8611e
--- /dev/null
+++ b/docs/discussions/alice_engineering_comms/0111/index.md
@@ -0,0 +1 @@
+# 2022-12-09 Engineering Logs
\ No newline at end of file
diff --git a/docs/discussions/alice_engineering_comms/0111/reply_0000.md b/docs/discussions/alice_engineering_comms/0111/reply_0000.md
new file mode 100644
index 0000000000..48244fc8e5
--- /dev/null
+++ b/docs/discussions/alice_engineering_comms/0111/reply_0000.md
@@ -0,0 +1,43 @@
+## 2022-12-09 @pdxjohnny Engineering Logs
+
+- Sigstore is getting involved in SCITT https://mailarchive.ietf.org/arch/msg/scitt/fYWz2ibYBAzzgkYe2hzM5KDafww/
+- :smiling_face_with_tear: We have marched along our [road to beta](https://intel.github.io/dffml/main/news/0_4_0_alpha_release.html) and are almost there!
+  - [x] AutoML
+    - #1410
+    - #1398
+    - #1397
+  - [x] Accuracy Scorers
+    - #1144
+  - [ ] Machine Learning support for videos
+    - https://github.com/intel/dffml/blob/alice/docs/tutorials/rolling_alice/0000_architecting_alice/0007_an_image.md
+  - [x] Model directories auto stored into archives or remotely
+    - #1128
+    - #967
+  - [ ] Remote execution
+    - #1247
+    - #1251
+  - [x] Config files in place of command line parameters
+  - [x] Command line to config file to Python API to HTTP API auto translation
+  - [ ] DataFlows with operation implementations in multiple languages
+    - #1171
+  - [x] Premade data cleanup DataFlows
+    - https://intel.github.io/dffml/main/examples/data_cleanup/
+  - [x] Continuous deployment tutorials
+    - https://intel.github.io/dffml/main/examples/webhook/index.html
+  - [ ] Pandas DataFrame source
+    - Is this in main? I might have gotten lost in the shuffle?
+- https://askalice.today
+  - Query or guess queries to knowledge graph (RDF, GUN, DID, etc.)
+  - Search and refine via Lyra and potentially the context aware markov chains
+  - This way we can dump statically searchable single file html pages with all the info on openssf metrics / cve bin tool style UI people can mess with statically
+  - https://docs.lyrasearch.io/usage/create-a-new-lyra-instance
+- A bunch of cool Rust shit that made me unreasonably STOKED
+  - https://github.com/surrealdb/surrealdb
+    - > row-by-row permissions-based access.
+  - https://github.com/vectordotdev/vector
+    - https://vector.dev/docs/reference/configuration/transforms/
+  - https://github.com/fermyon/spin
+    - https://developer.fermyon.com/spin/url-shortener
+  - https://github.com/fermyon/bartholomew
+  - https://github.com/launchbadge/sqlx
+  - https://github.com/bluecatengineering/dora DHCP!
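
The static single-file searchable HTML idea above can be sketched with the stdlib alone: a page that embeds its own JSON index and searches it client side. Lyra would replace the naive substring filter here; every record, id, and function name in this sketch is hypothetical.

```python
import json

# Hypothetical scan results standing in for openssf metrics / cve-bin-tool output.
records = [
    {"id": "CVE-2022-0001", "text": "buffer overflow in libfoo"},
    {"id": "CVE-2022-0002", "text": "path traversal in barcli"},
]

def render_static_search_page(records):
    """Emit one self-contained HTML file: data embedded as JSON, searched in
    the browser (naive substring match; a Lyra index would slot in instead)."""
    data = json.dumps(records)
    return f"""<!DOCTYPE html>
<html><body>
<input id="q" oninput="search()">
<ul id="out"></ul>
<script>
const RECORDS = {data};
function search() {{
  const q = document.getElementById("q").value.toLowerCase();
  const hits = RECORDS.filter(r => r.text.includes(q));
  document.getElementById("out").innerHTML =
    hits.map(r => `<li>${{r.id}}: ${{r.text}}</li>`).join("");
}}
</script>
</body></html>"""

page = render_static_search_page(records)
```

Dumping `page` to disk gives something anyone can open and mess with offline, no server required.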
\ No newline at end of file
diff --git a/docs/discussions/alice_engineering_comms/0112/index.md b/docs/discussions/alice_engineering_comms/0112/index.md
new file mode 100644
index 0000000000..8a09f10aee
--- /dev/null
+++ b/docs/discussions/alice_engineering_comms/0112/index.md
@@ -0,0 +1 @@
+# 2022-12-10 Engineering Logs
\ No newline at end of file
diff --git a/docs/discussions/alice_engineering_comms/0112/reply_0000.md b/docs/discussions/alice_engineering_comms/0112/reply_0000.md
new file mode 100644
index 0000000000..362186ca84
--- /dev/null
+++ b/docs/discussions/alice_engineering_comms/0112/reply_0000.md
@@ -0,0 +1,23 @@
+## 2022-12-10 @pdxjohnny Engineering Logs
+
+- https://github.com/ggerganov/whisper.cpp
+  - realtime?
+    - no luck this try
+    - https://asciinema.org/a/544645
+- https://steampipe.io/
+  - https://hub.steampipe.io/mods/turbot/digitalocean_insights
+  - Run in parallel in background via autostart overlay in devtool deploy overlays
+- https://github.com/kurtseifried/wardley-maps/blob/main/cloud/Creating%20cloud%20services.wm
+  - Yup, data flow abstractions ftw
+  - > This is the future of IT, e.g. building services using SaaS mashups. A Wardley map ^
+  - > Source: https://mastodon.social/@kurtseifried/109387891143147587
+  - > ![wardlymap-of-saas-mashup](https://files.mastodon.social/media_attachments/files/109/387/889/347/401/675/original/4d2407700ab0fc73.png)
+
+- https://www.themarginalian.org/2021/12/04/general-theory-of-love-music-emotion/
+  - Vol 3
+- TODO
+  - [ ] confidential containers android vm hangouts (refactor to voice) call center SCITT non virtual via emulated rust python qemu patchset set cr pinging for setup of memory regions for vmcs like config. RING -3 esq TEE validated in dataflow execution env similar to ipvm did merkle dag recent Zypher stuff signal signup for secure easy access comms video channel for data or exec proxy streams
+    - This will be Alice’s phone.
Recent laptop lockouts made clear the phone, for better or for worse, is the de facto unit of compute if you want to emulate an end user. We can use the browser via UI Automator code which exists and will be pushed upstream. This also lets us use Brave wallets. We can then instantiate N on demand with attestation for various use cases grep debit card.
+    - You could vtpm to a remote physical key hsm on usb
+    -
+  - Implement graphql pagination for backup then enable dataflow middleware on knowledge graph for transform pipeline config using Open Architecture and synthesize to workflow execution plus triggers if needed for deployment env (stream of consciousness dispatches in run deployment whereas creates repo with runners, v8 usermode linux runners for synthesis deployment) enable oa style callback on timeout / events all configurable by checking if event/timeout/callback system context is valid on register. This should take us to distributed execution once we query via graphql-ld, then do caching: locking network
\ No newline at end of file
diff --git a/docs/discussions/alice_engineering_comms/0113/index.md b/docs/discussions/alice_engineering_comms/0113/index.md
new file mode 100644
index 0000000000..08536ce9c2
--- /dev/null
+++ b/docs/discussions/alice_engineering_comms/0113/index.md
@@ -0,0 +1 @@
+# 2022-12-11 Engineering Logs
\ No newline at end of file
diff --git a/docs/discussions/alice_engineering_comms/0113/reply_0000.md b/docs/discussions/alice_engineering_comms/0113/reply_0000.md
new file mode 100644
index 0000000000..6ebdbd7e29
--- /dev/null
+++ b/docs/discussions/alice_engineering_comms/0113/reply_0000.md
@@ -0,0 +1,2 @@
+- TODO
+  - [ ] https://github.com/turbot/steampipe-mod-github-sherlock
\ No newline at end of file
diff --git a/docs/discussions/alice_engineering_comms/0114/index.md b/docs/discussions/alice_engineering_comms/0114/index.md
new file mode 100644
index 0000000000..532fa45cc0
--- /dev/null
+++
b/docs/discussions/alice_engineering_comms/0114/index.md @@ -0,0 +1 @@ +# 2022-12-12 Engineering Logs \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0115/index.md b/docs/discussions/alice_engineering_comms/0115/index.md new file mode 100644 index 0000000000..d4361a14b8 --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0115/index.md @@ -0,0 +1 @@ +# 2022-12-13 Engineering Logs \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0116/index.md b/docs/discussions/alice_engineering_comms/0116/index.md new file mode 100644 index 0000000000..0b67d0e429 --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0116/index.md @@ -0,0 +1 @@ +# 2022-12-14 Engineering Logs \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0117/index.md b/docs/discussions/alice_engineering_comms/0117/index.md new file mode 100644 index 0000000000..7ba7c84c00 --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0117/index.md @@ -0,0 +1 @@ +# 2022-12-15 Engineering Logs \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0117/reply_0000.md b/docs/discussions/alice_engineering_comms/0117/reply_0000.md new file mode 100644 index 0000000000..48459545a0 --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0117/reply_0000.md @@ -0,0 +1,7 @@ +- TODO + - [ ] https://github.com/wolfi-dev/os/pull/268/files + - #1426 + - [ ] https://www.quantamagazine.org/what-does-it-mean-to-align-ai-with-human-values-20221213/ + - Reach out to MM (she used to be at portland state, ping bennett too) + - Send to Minchene and ref inverse RL as follow up to other paper recently ref’d here upside down rl as it relates to #1287, since she was passionate about solving that, go team! 
+ - [ ] Update shell teaching for Alice with https://mastodon.social/@b0rk/109518552393123679 \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0118/index.md b/docs/discussions/alice_engineering_comms/0118/index.md new file mode 100644 index 0000000000..a5e7507c2d --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0118/index.md @@ -0,0 +1 @@ +# 2022-12-16 Engineering Logs \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0118/reply_0000.md b/docs/discussions/alice_engineering_comms/0118/reply_0000.md new file mode 100644 index 0000000000..f7c62b33c3 --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0118/reply_0000.md @@ -0,0 +1,5 @@ +- https://aclanthology.org/2020.acl-main.463/ + - https://dair-community.social/@emilymbender/109524028458929110 +- TODO + - [x] Thank God for not being run over (close one!!!) + - [x] Thankful for documentation \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0119/index.md b/docs/discussions/alice_engineering_comms/0119/index.md new file mode 100644 index 0000000000..78b9773699 --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0119/index.md @@ -0,0 +1 @@ +# 2022-12-17 Engineering Logs \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0119/reply_0000.md b/docs/discussions/alice_engineering_comms/0119/reply_0000.md new file mode 100644 index 0000000000..0cf7989bf8 --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0119/reply_0000.md @@ -0,0 +1,18 @@ +- https://psychology.stackexchange.com/questions/26179/explanation-for-the-spinal-energy-and-other-kundalini-awakening-symptoms + - I post things in this thread when I get the spine rolling up feeling (started to get this ~3 years ago sometime post vision / everything is “One” “transmission”, see tao of Wu for explanation by Rza). Also other times of course mostly. 
+ - https://mastodon.social/@by_caballero/109532275211806370 + - Judith +- https://www.spruceid.dev/ +- https://mobile.twitter.com/bengo + - https://en.m.wikipedia.org/wiki/Distributed_language + - https://en.m.wikipedia.org/wiki/Petri_net +- https://www.podgist.com/stuff-you-should-know/how-face-blindness-works/index.html + - This touches on incremental overlay application during analysis +- https://www.independent.co.uk/asia/india/cambridge-student-sanskrit-grammatical-problem-b2245596.html +- TODO + - [ ] https://github.com/misskey-dev/misskey + - [ ] https://github.com/misskey-dev/SyslogPro + - [ ] Document binsec 6 degrees of link scraping for cve source url finding + - [ ] https://github.com/LibreTranslate/LibreTranslate + - For the shim + - ref: Multiformat, autocodec \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0120/index.md b/docs/discussions/alice_engineering_comms/0120/index.md new file mode 100644 index 0000000000..4f2122494f --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0120/index.md @@ -0,0 +1 @@ +# 2022-12-18 Engineering Logs \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0120/reply_0000.md b/docs/discussions/alice_engineering_comms/0120/reply_0000.md new file mode 100644 index 0000000000..4b22d00ff9 --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0120/reply_0000.md @@ -0,0 +1,22 @@ +- https://en.m.wikipedia.org/wiki/Artificial_Linguistic_Internet_Computer_Entity +- git log -p is your friend!!! 
+- https://www.intel.com/content/www/us/en/developer/articles/community/how-lays-groundwork-diversity-with-intern-program.html +- https://www.intel.com/content/www/us/en/developer/articles/community/crob-shares-optimistic-on-open-source-security.html +- https://www.intel.com/content/www/us/en/developer/articles/community/non-technical-skills-eat-tech-skills-for-breakfast.html +- https://www.intel.com/content/www/us/en/developer/articles/technical/software-bills-of-materials-the-basics.html +- https://www.intel.com/content/www/us/en/developer/articles/technical/protect-end-to-end-data-pipelines-with-bigdl-ppml.html +- https://kylerank.in/scripts.html +- https://mailchi.mp/themarginalian/time-margaret-wise-brown + - What is time? A construct of course, an illusion we use to understand however it limits our understanding in many ways. See modified date to quarters to slice and dice with whatever “time” scale you want. + - Freedom7 : https://en.m.wikipedia.org/wiki/Mercury-Redstone_3 + - https://spaceplace.nasa.gov/time-travel/en/ + - Ref red pill, vol 6 time travel with us +- https://mailarchive.ietf.org/arch/msg/scitt/xJGdmF1bZoDIqNtlJ9LjZMONIww/ + - Dick Brooks with the bingo again! + - > I've been thinking about a new use case for SCITT. The publishing of trust +scores for software app/packages by a trusted party. Today, a software consumer lacks visibility into the trustworthiness of +software available from many different distribution locations on the +Internet, i.e., app stores, GitHub, Stack Overflow, etc. A SCITT trust registry could serve as a repository for "Statements" from +trusted parties assigning a trustworthiness score for a given software +package and application, which other parties could query. Conceptually [...] 
+ - https://community.intel.com/t5/Blogs/Tech-Innovation/open-intel/Twitter-Exodus-Devs-Leave-but-Big-Tech-Won-t-Land-in-the/post/1431977 \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0121/index.md b/docs/discussions/alice_engineering_comms/0121/index.md new file mode 100644 index 0000000000..15279b5a19 --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0121/index.md @@ -0,0 +1 @@ +# 2022-12-19 Engineering Logs \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0121/reply_0000.md b/docs/discussions/alice_engineering_comms/0121/reply_0000.md new file mode 100644 index 0000000000..d30d6ead42 --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0121/reply_0000.md @@ -0,0 +1,42 @@ +- https://open.substack.com/pub/tidyfirst/p/the-story-of-a +- https://projectf.io/posts/lib-clock-xd/ + - Vol 6 +- https://post.news/article/2J9CwZazSbKMTVt9x2Yy8okvBgu + - How to mothball your twitter +- https://mastodon.social/@bengo/109542382708854794 + - https://en.m.wikipedia.org/wiki/Genetic_memory_(computer_science) + - > In computer science, genetic memory refers to an artificial neural network combination of genetic algorithm and the mathematical model of sparse distributed memory. It can be used to predict weather patterns.[1] Genetic memory and genetic algorithms have also gained an interest in the creation of artificial life.[2] + - Talk to Terri +- https://mastodon.social/@bengo/109542479257241705 + - https://en.wikipedia.org/wiki/Hopfield_network + - > Hopfield networks serve as content-addressable ("associative") memory systems with binary threshold nodes, or with continuous variables.[4] Hopfield networks also provide a model for understanding human memory +- https://mastodon.social/@bengo/109542482950980067 + - https://en.m.wikipedia.org/wiki/Sparse_distributed_memory + - > Sparse distributed memory (SDM) is a mathematical model of human long-term memory... 
The SDM may be regarded either as a content-addressable extension of a classical random-access memory (RAM) or as a special type of three layer feedforward neural network. +- Don’t currently bother with cve bin tool full dffml dataflow based scanning for now, just implement within existing cve bin tool stuff passing context / linkage / lineage via paths / dffml.Input / system context style chain, see existing logging for help + - use tpm2-pytss for sample kick tires +- Hmmm, if we can't make the clock speeds faster, then we need to put the hardware where the clock is faster, and then edge node style send it back. The other day looking from Pittock we thought infrastructure underground, maybe it's not, maybe it's infrastructure overground. Flip mode after all. + - Bing! Talk to Nick, Portland State rocket club :rocket: + - Let’s at least flesh this idea out. It's at least worth seeing what data exists and running some tests. + - https://twitter.com/kenshirriff/status/1603827807741673477 Bendix Central Air Data Computer (CADC) + - [:eye: ‘imiloa](https://imiloahawaii.org/aboutimiloa) + - Now this would be a fucking dope way to exploit vol 3 thought arbitrage + - > AND FOR ALL the gifts of creation ~ for all the love around us GREETINGS & THANKS - and for that which is forgotten WE REMEMBER ><><><><>< WE END OUR WORDS :dove: NOW OUR MINDS ARE ONE + - https://youtu.be/abRsFX8GvVU + - Gotta think outside the “box” huh-HUH! +- TODO + - [x] Nick is looking into the accelerated computation via alternative orbits, spins, tilts. + - What happens when we put something in orbit around the moon? The speed electrons move is the speed they move, theoretically right, what about mechanical based compute? Research that mechanical flight control system and see if that might help, can always parallelize and ADC / DAC.
+ - [ ] Play with rust based web5 tooling, see about python rust binding / wasm embed interpreter status these days or just subprocess call it https://www.spruceid.dev/quickstart + - [ ] Flesh out SCITT receipt facilitated review based “honor system” use of resources, perhaps hedged with aggregate barter “insurance” aka equilibrium flow maintenance + - [x] D suggested we listen to This American Life, “The Ghost in the Machine” + - https://www.thisamericanlife.org/757/transcript + - > I'm a ghost, and I'm in a spaceship, and I'm hurtling through the universe. And I'm traveling forward, and I'm traveling backward, and I'm traveling sideways, and I'm traveling nowhere. [AI, GPT-3] + - D’s funnier version: And I'm a Ghost and I’m in a spaceship and I’m turtling through the universe + - “Like the stack of turtles thing” + - We talked about the risk of manipulation from the “one big entity” [Sergia] version of AI and the need for context awareness and multiple truths being valid based on perspective. + - https://dair-community.social/@sergia/109524541996285253 + - [x] Yay for CVE bin tool release!
+ - Forgot about https://github.com/ossf/osv-schema + - Add to schema dir and look at screenshots for missed messages from Terri and Anthony + - [ ] https://www.digitalocean.com/community/tutorials/how-to-install-mastodon-on-ubuntu-20-04 \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0122/index.md b/docs/discussions/alice_engineering_comms/0122/index.md new file mode 100644 index 0000000000..ba850c2a3c --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0122/index.md @@ -0,0 +1 @@ +# 2022-12-20 Engineering Logs \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0122/reply_0000.md b/docs/discussions/alice_engineering_comms/0122/reply_0000.md new file mode 100644 index 0000000000..ba39eb04e6 --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0122/reply_0000.md @@ -0,0 +1,11 @@ +- In a shell for a ghost build the Alice shell command. Include as a base for that tutorial the refactor and packaging of the DevCloud runner deployment. +- https://lists.spdx.org/g/spdx/message/1617 + - SBOM is included in the latest Omnibus bill. + - From our friend Dick Brooks +- TODO + - [ ] Automate depth of field research via link hop cve to source url mapping and fuzzy ML on current thread corpus + - [x] slow down to know (grep prev in this thread) + - [ ] v0.0.3 EAT diagram with so farish helper conceptual maps as follows + - intent, upstream + - static, overlay + - behavioral, orchestrator \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0123/index.md b/docs/discussions/alice_engineering_comms/0123/index.md new file mode 100644 index 0000000000..11cf53e2af --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0123/index.md @@ -0,0 +1,7 @@ +# 2022-12-21 Engineering Logs + +Transparency logs inbound.
[Values stream mapping imminent.](https://github.com/intel/dffml/blob/alice/docs/tutorials/rolling_alice/0000_preface.md#volume-4-alice-and-the-health-of-the-ecosystem) + +> For everything that is hidden will eventually be brought into the open, and every secret will be brought to light. + +![alice-looking-down-rabbit-hole-mutually-assured-victory-incoming](https://user-images.githubusercontent.com/5950433/208961513-2971dcd0-d629-469c-be12-a64882b9f197.png) \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0123/reply_0000.md b/docs/discussions/alice_engineering_comms/0123/reply_0000.md new file mode 100644 index 0000000000..f58742ba75 --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0123/reply_0000.md @@ -0,0 +1,26 @@ +## 2022-12-21 @pdxjohnny Engineering Logs + +- https://www.cnn.com/2022/12/20/politics/spending-bill-congress-omnibus/index.html + +![SBOM_Transparency_Trojan_Horse_-_Canakkale_Waterfront_-_Dardanelles_-_Turkey](https://user-images.githubusercontent.com/5950433/208962582-f884219a-1e7e-4f6e-865d-2ab44c62b977.png) + +- https://github.com/executablebooks/markdown-it-py + - For docutils rst markdown notebook execution stuff +- https://cheatsheetseries.owasp.org/cheatsheets/HTML5_Security_Cheat_Sheet.html + - Good stuff here +- https://github.com/score-spec/spec +- https://github.com/m-bain/whisperX +- https://github.com/answerdev/answer +- https://github.com/THUDM/CodeGeeX +- https://github.com/pocketbase/pocketbase +- https://www.spruceid.dev/rebase/rebase + - https://www.spruceid.dev/treeldr/treeldr-overview/treeldr-quickstart/compilation-into-json-ld-context +- TODO + - [ ] https://github.com/chainguard-dev/melange/blob/2590cfad1015f4e0e590827d4f866d88a552f492/NEWS.md#major-changes-from-010-to-020 + - [ ] OIDC to OIDCVC proxy setup + - [ ] DevCloud OIDC proxy for auto auth + - [ ] Update https://github.com/intel/project-example-for-python to include + - [ ] 
https://github.com/sphinx-doc/sphinx/blob/master/.github/ISSUE_TEMPLATE/bug-report.yml + - This looks shockingly similar to a bug I think I filed there :P + - [ ] Update Alice to enable deployment of python project best practices to level up other repos + - [ ] 2nd Party split out \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0124/index.md b/docs/discussions/alice_engineering_comms/0124/index.md new file mode 100644 index 0000000000..e7f4e86baa --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0124/index.md @@ -0,0 +1 @@ +# 2022-12-22 Engineering Logs \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0124/reply_0000.md b/docs/discussions/alice_engineering_comms/0124/reply_0000.md new file mode 100644 index 0000000000..3e7dc173b6 --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0124/reply_0000.md @@ -0,0 +1,6 @@ +- https://www.ted.com/talks/lyla_june_3000_year_old_solutions_to_modern_problems +- https://gist.github.com/JalfResi/6287706#gistcomment-4367945 +- https://openid.net/specs/openid-connect-self-issued-v2-1_0.html#name-authentication-at-the-edge + - Edge right now for us is CI/CD +- TODO + - [x] Achieve values stream mapping protocol alignment \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0125/index.md b/docs/discussions/alice_engineering_comms/0125/index.md new file mode 100644 index 0000000000..73a023f5a6 --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0125/index.md @@ -0,0 +1 @@ +# 2022-12-23 Engineering Logs \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0125/reply_0000.md b/docs/discussions/alice_engineering_comms/0125/reply_0000.md new file mode 100644 index 0000000000..4f8f7ad31c --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0125/reply_0000.md @@ -0,0 +1,2 @@ +- TODO + - [ ] Explore cosign SCITT integration, talk to Dan and team 
https://gist.github.com/dlorenc/b97af394702f57b010ead586a2c23272 \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0126/index.md b/docs/discussions/alice_engineering_comms/0126/index.md new file mode 100644 index 0000000000..c20e199e31 --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0126/index.md @@ -0,0 +1 @@ +# 2022-12-24 Engineering Logs \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0127/index.md b/docs/discussions/alice_engineering_comms/0127/index.md new file mode 100644 index 0000000000..5eabf23236 --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0127/index.md @@ -0,0 +1 @@ +# 2022-12-25 Engineering Logs \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0127/reply_0000.md b/docs/discussions/alice_engineering_comms/0127/reply_0000.md new file mode 100644 index 0000000000..a702d4fbe4 --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0127/reply_0000.md @@ -0,0 +1,17 @@ +- Mary Christmas! Make merry the way for Our Lady. + - /acc/ happiness metric + - #1315 +- https://mailchi.mp/themarginalian/universe-in-verse-holiday + - Are we watching the collective unconscious approach the collective consciousness? + - Given acceleration of communication between what we consider to be consciousness. + - Why do we have a Neural Network attached to all our cells? Because it helps the cells work most effectively as a body.
As we approach a collective consciousness via peer to peer engrained behavior over time we approach what is effectively a scaled up model of those localities' consciousnesses, where the biases for each locality are time bound by speed of communication within those overlapping trains of thought or conscious states (entities, people learning from one another, people communicating) + - https://bigthink.com/culture-religion/does-the-mind-play-dice-with-reason/ + - > SEPTEMBER 18, 2015 + - This is the same thing as #1369 just more concise + - https://www.sciencetimes.com/articles/41510/20221222/one-quantum-theory-hypothesizes-retrocausality-where-future-influencing-past.htm + - Are we there yet? + - We’re confused until we’re not and that’s the game, that’s learning, that’s life. It’s also just whatever you decide it is, because you’re the one living it. The trick is, so is everyone else. So you all get to decide what happens next as long as you can act in truth (valid system context proposed) + - Since time is an illusion, when we look in the mirror we see the past light but we think of it as the present. When we see ourselves move we see our past self moving and our future self has already moved. grep quantum encoding. Only if our hypothetical next system context is valid and could be triggered by the conscious and subconscious states present within the observing locality. Like with our offline edge nodes. So we just guess what they are probably going to be by their previous state and see that as if it is and therefore it becomes what it is. Seeing the future by acting into truth, believing is seeing and seeing is believing.
+- https://www.themarginalian.org/2015/10/29/the-art-of-loving-erich-fromm/ + - Basis for value stream mapping need for uncommon denominators during compute contract negotiation / handshake (valid system context to accelerate happiness metrics) + - https://mailchi.mp/themarginalian/music-love-burnout \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0128/index.md b/docs/discussions/alice_engineering_comms/0128/index.md new file mode 100644 index 0000000000..1b9f60c4be --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0128/index.md @@ -0,0 +1 @@ +# 2022-12-26 Engineering Logs \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0128/reply_0000.md b/docs/discussions/alice_engineering_comms/0128/reply_0000.md new file mode 100644 index 0000000000..d6c8f2eafb --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0128/reply_0000.md @@ -0,0 +1,4 @@ +- https://mailarchive.ietf.org/arch/msg/scitt/Z4jBFyJK5bnzSo8pb72KTXxr7PM/ +- https://deepai.org/machine-learning-glossary-and-terms/association-learning + - > bioinformatics + - grep -i dna \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0129/index.md b/docs/discussions/alice_engineering_comms/0129/index.md new file mode 100644 index 0000000000..10e7aa95fe --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0129/index.md @@ -0,0 +1 @@ +# 2022-12-27 Engineering Logs \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0129/reply_0000.md b/docs/discussions/alice_engineering_comms/0129/reply_0000.md new file mode 100644 index 0000000000..01e45dcd69 --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0129/reply_0000.md @@ -0,0 +1,3 @@ +- https://github.com/observablehq/runtime + - Can we do cross platform support by pull requests here or an alternative?
+ - Maybe Alfredo can just check this out and add to his JS for now, since he already has most of it and pyodide working \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0130/index.md b/docs/discussions/alice_engineering_comms/0130/index.md new file mode 100644 index 0000000000..779badd467 --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0130/index.md @@ -0,0 +1 @@ +# 2022-12-28 Engineering Logs \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0130/reply_0000.md b/docs/discussions/alice_engineering_comms/0130/reply_0000.md new file mode 100644 index 0000000000..65a1dbb5e6 --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0130/reply_0000.md @@ -0,0 +1,2 @@ +- Thanks James, TODO, read + - http://www.quantumphysicslady.org/glossary/local-realism/ \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0131/index.md b/docs/discussions/alice_engineering_comms/0131/index.md new file mode 100644 index 0000000000..8dad6ccab3 --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0131/index.md @@ -0,0 +1 @@ +# 2022-12-29 Engineering Logs \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0131/reply_0000.md b/docs/discussions/alice_engineering_comms/0131/reply_0000.md new file mode 100644 index 0000000000..04a7fe6e73 --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0131/reply_0000.md @@ -0,0 +1,4 @@ +- https://airlied.blogspot.com/2022/12/vulkan-video-encoding-radv-update.html +- https://fosstodon.org/@kernellogger/109595821174369284 + - https://lore.kernel.org/lkml/20221224000402.476079-1-qde@naccy.de/ + - > Remember [#bpfilter](https://mastodon.social/tags/bpfilter), which uses [#BPF](https://mastodon.social/tags/BPF) to handle iptables' configuration blob parsing and code generation(¹)?
Quentin Deslandes picked up development and sent a v3 of the [#linux](https://mastodon.social/tags/linux) [#kernel](https://mastodon.social/tags/kernel) patchset on Christmas eve: https://lore.kernel.org/lkml/20221224000402.476079-1-qde@naccy.de/ For the record, v2 was sent by Dmitrii Banshchikov on Sun, 29 Aug 2021: https://lore.kernel.org/all/20210829183608.2297877-1-me@ubique.spb.ru/ [#LinuxKernel](https://mastodon.social/tags/LinuxKernel) [#eBPF](https://mastodon.social/tags/eBPF) (¹) see https://lwn.net/Articles/755919/ and https://lwn.net/Articles/822744/ \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0132/index.md b/docs/discussions/alice_engineering_comms/0132/index.md new file mode 100644 index 0000000000..f084e1f62a --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0132/index.md @@ -0,0 +1 @@ +# 2022-12-30 Engineering Logs \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0133/index.md b/docs/discussions/alice_engineering_comms/0133/index.md new file mode 100644 index 0000000000..d37db8c20f --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0133/index.md @@ -0,0 +1 @@ +# 2022-12-31 Engineering Logs \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0133/reply_0000.md b/docs/discussions/alice_engineering_comms/0133/reply_0000.md new file mode 100644 index 0000000000..6080ea51f2 --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0133/reply_0000.md @@ -0,0 +1,2 @@ +- TODO + - [ ] Enable full attested OIDC through devcloud for attested Android VMs with vTPMs https://youtu.be/4wZnl0njxm8 \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0134/index.md b/docs/discussions/alice_engineering_comms/0134/index.md new file mode 100644 index 0000000000..a28e705441 --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0134/index.md @@ -0,0 +1 @@ +# 2023-01-01 Engineering Logs \ No newline at end of file
diff --git a/docs/discussions/alice_engineering_comms/0134/reply_0000.md b/docs/discussions/alice_engineering_comms/0134/reply_0000.md new file mode 100644 index 0000000000..a4b2d6aff6 --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0134/reply_0000.md @@ -0,0 +1,4 @@ +- https://github.com/TBD54566975/web5-wallet-browser +- https://www.bleepingcomputer.com/news/security/pytorch-discloses-malicious-dependency-chain-compromise-over-holidays/ +- https://www.vox.com/the-highlight/23447596/artificial-intelligence-agi-openai-gpt3-existential-risk-human-extinction + - All is well and will be well, just keep plodding along and putting this here for explainer to others \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0135/index.md b/docs/discussions/alice_engineering_comms/0135/index.md new file mode 100644 index 0000000000..5a8de953d5 --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0135/index.md @@ -0,0 +1 @@ +# 2023-01-02 Engineering Logs \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0136/index.md b/docs/discussions/alice_engineering_comms/0136/index.md new file mode 100644 index 0000000000..4d73fffa23 --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0136/index.md @@ -0,0 +1 @@ +# 2023-01-03 Engineering Logs \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0136/reply_0000.md b/docs/discussions/alice_engineering_comms/0136/reply_0000.md new file mode 100644 index 0000000000..ef0afd8126 --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0136/reply_0000.md @@ -0,0 +1,22 @@ +## 2023-01-03 @pdxjohnny Engineering Logs + +- https://community.intel.com/t5/Blogs/Tech-Innovation/open-intel/Open-Source-Policy-Why-It-s-Not-Just-For-Wonks-Anymore/post/1439707 +- https://github.com/ossf/security-insights-spec#security-insightsyml + - Ping Terri and Arjan to pursue scanner noise reduction efforts + - 
https://github.com/ossf/security-insights-spec#security-insightsyml could merge with existing triage format and check regex / rules for applicability if not able to set within context + - Threshold declaration for false positives + - Acceptance based on receipt knowledge graph traversal for those trust chains + +### Fixing CI + +- Switching to Python 3.9 as minimum supported version (3.11 is latest) +- References + - https://github.com/scipy/scipy/issues/9005#issuecomment-632236655 + +--- + +- TODO + - [x] Container build + - [ ] Single workflow which runs rest of plugins + - [ ] Stream of consciousness (downstream triggers) + - [ ] Downstream validation example (VDR or VEX or something else?) \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0137/index.md b/docs/discussions/alice_engineering_comms/0137/index.md new file mode 100644 index 0000000000..b63c502a10 --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0137/index.md @@ -0,0 +1 @@ +# 2023-01-04 Engineering Logs \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0137/reply_0000.md b/docs/discussions/alice_engineering_comms/0137/reply_0000.md new file mode 100644 index 0000000000..cc84c518b4 --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0137/reply_0000.md @@ -0,0 +1,28 @@ +## 2023-01-04 @pdxjohnny Engineering Logs + +- `import code; code.interact(local=locals())` + - https://mobile.twitter.com/karpathy/status/1610822271157022720 +- https://github.com/karpathy/nanoGPT +- https://github.com/vwxyzjn/cleanrl +- https://docs.ray.io/en/master/ray-core/actors.html#faq-actors-workers-and-resources + - https://docs.ray.io/en/master/ray-core/actors/async_api.html#asyncio-for-remote-tasks + - > We don’t support asyncio for remote tasks.
The following snippet will fail: +- https://github.com/ray-project/deltacat + - https://github.com/ray-project/deltacat/blob/main/deltacat/examples/basic.py +- https://packaging.python.org/en/latest/guides/hosting-your-own-index/ +- https://www.chezmoi.io/comparison-table/ + - This looks like a good base for dataflow diff of cached flows to support resumability (cattle as pets) +- TODO + - [ ] Finish pipdeptree and output via dataflow cache dump to json + - #596 + - Update packages to include main package (dffml) + - [ ] Build dataflows from dependency trees + - [ ] Container build flows for each plugin + - This will be the basis for the granular diamond/pyramid pattern validation + - [ ] Test flows take build flows as inputs `run_plugin_tests(plugin_image_container: str)` (can be overridden via dynamic context aware overlay, this also means for audit which is the more likely case) + - [ ] Execute "locally" via k8s job runner + - [ ] Synthesis to GitHub Actions workflows via templates + - [ ] Trigger workflows via URL request + - [ ] Spin runners on devcloud via webhook and validate plugins via Python 3.9 DFFML container + - https://github.com/intel/dffml/issues/1247#issuecomment-1371317321 + - How clean can we make the infra and POC for VEX, SBOM, VDR base for next pytss example by Friday?
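For the pipdeptree TODO above, a rough standard-library sketch of the idea (`importlib.metadata` only, not DFFML's dataflow cache format; the function name here is made up):

```python
# Sketch: dump an installed-package dependency graph to JSON.
# Stands in for the "pipdeptree ... dump to json" TODO; the real version
# would feed a dataflow cache instead of printing to stdout.
import json
import re
from importlib import metadata

def dependency_graph():
    graph = {}
    for dist in metadata.distributions():
        name = dist.metadata["Name"]
        if not name:
            continue
        # Requires-Dist strings look like "requests (>=2.0) ; extra == 'x'";
        # keep only the leading distribution name.
        deps = set()
        for req in dist.requires or []:
            match = re.match(r"[A-Za-z0-9][A-Za-z0-9._-]*", req)
            if match:
                deps.add(match.group(0))
        graph[name] = sorted(deps)
    return graph

if __name__ == "__main__":
    print(json.dumps(dependency_graph(), indent=2, sort_keys=True))
```

From there, diffing two of these JSON dumps across environments is the chezmoi-style "delta across environments" idea.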
\ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0138/index.md b/docs/discussions/alice_engineering_comms/0138/index.md new file mode 100644 index 0000000000..152fd98953 --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0138/index.md @@ -0,0 +1 @@ +# 2023-01-05 Engineering Logs \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0138/reply_0000.md b/docs/discussions/alice_engineering_comms/0138/reply_0000.md new file mode 100644 index 0000000000..cd4fdb95c2 --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0138/reply_0000.md @@ -0,0 +1,22 @@ +## 2023-01-05 Alfredo/John Chat + +> @agalvare: we also need a voting mechanism, and a way for other to train it + +https://github.com/intel/dffml/blob/alice/docs/arch/alice/discussion/0023/reply_0022.md this plus the ATProto means we are piggybacking off of "social network". This way humans and AI can work together. + +> The smart contract is able to make its own decisions based on learned experience (models) so as to continue to operate until its strategic goals are met. As measured by oracle data ordained from trusted parties as is applicable to context. Where chains of trust are established via Peer DIDs between entities and data for provenance. Leveraging verifiable credentials (opencert) for review system to measure risk in absence of attestation. + +We "reply" to "posts" where a post is an AI's idea which we might execute in CI/CD and the reply contains the "review" with how well some grading AI thinks that execution (within CI) aligns to the goals of the prompt (validate X). + + +--- + +https://github.com/w3c/cogai/pull/47 + +> We think about an entity (Alice is our reference entity) as being in a set of parallel conscious states with context aware activation.
Each context ideally forms a chain of system contexts or train of thoughts by always maintaining provenance information ([SCITT](https://scitt.io/), [GUAC](https://security.googleblog.com/2022/10/announcing-guac-great-pairing-with-slsa.html)). She thinks concurrently in the existing implementation where she is defined mostly using the Open Architecture, which is language agnostic and focused on defining parallel/concurrent flows, trust boundaries, and policy. The current execution of orchestration is done via Python, but is intended to be implemented in whatever language is desired. +> +> Alice doesn't use any machine learning yet, but later we can add models to assist with automation of flows as needed. +> +> Alice's architecture, the [Open Architecture](https://github.com/intel/dffml/tree/alice/docs/arch/0009-Open-Architecture.rst), is based around thought. She communicates thoughts to us in whatever level of detail or viewed through whatever lens one wishes. She explores trains of thought and responds based on triggers and deadlines. She thinks in graphs, aka trains of thought, aka chains of system contexts. She operates in parallel, allowing her to represent N different entities. + +The "thinking in parallel" means we'd run multiple models (such as nanoGPT) and then choose the best result of them by the deadline. \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0138/reply_0001.md b/docs/discussions/alice_engineering_comms/0138/reply_0001.md new file mode 100644 index 0000000000..da7b7b30a8 --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0138/reply_0001.md @@ -0,0 +1,20 @@ +## 2023-01-05 @pdxjohnny Engineering Logs + +- TODO + - [x] Simplify, ditch all but main package and Alice for now. + - [ ] Re-enable plugins incrementally later + - [ ] Fix failures in main package tests + +### Fixing CI + +- Rebased `main` into `alice`.
+- https://github.com/pdxjohnny/dffml/actions/runs/3849393578/jobs/6558333925 +- Updated version of `black` autoformatter due to issues with `click` dependency + - The downside of using an autoformatter is sometimes it will change a bunch of stuff. So we'll probably end up with one big "autoformatted due to psf/black upgrade" commit, which probably would have needed to be done anyway. +- References + - https://stackoverflow.com/questions/71673404/importerror-cannot-import-name-unicodefun-from-click + +``` +Ran 428 tests in 385.519s +FAILED (failures=6, errors=38, skipped=29) +``` \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0138/reply_0002.md b/docs/discussions/alice_engineering_comms/0138/reply_0002.md new file mode 100644 index 0000000000..2b1b44c030 --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0138/reply_0002.md @@ -0,0 +1,13 @@ +Hi Hammond, + +I saw your talk to ____ and wanted to connect with you. Our areas of research appear to be aligned based on [your blog’s research agenda page](https://www.cyberhammond.com/research-agenda). + +We’ve been thinking about what underlying infrastructure (Decentralized Identifiers, Verifiable Credentials, etc.) needs to be in place to enable a holistic approach to software maintenance, generation, and ongoing security. We’re still in the early stages. Alignment of AI generated code to strategic principles, plans, and values (such as security standards) is shaping up to be an area of interest. + +We’ve been planning and starting the implementations of Alice, a reference entity which is her own threat model, described via an Open Architecture and Living Threat Model: https://github.com/intel/dffml/tree/alice/docs/tutorials/rolling_alice/0000_architecting_alice#what-is-alice +The goals are to triangulate the “soul” of the software via static and behavioral analysis and mapping that to intent via trust boundaries defined via the architecture and threat model. 
+ +I was curious about your work and how you approach or plan approaching alignment of generated code to intent. Do you have any methods which look promising for capturing intent? Threat modeling only covers security, there may be other places it helps with intent. However, I’m sure there are other methods which would be good to explore. + +Thank you, +John diff --git a/docs/discussions/alice_engineering_comms/0139/index.md b/docs/discussions/alice_engineering_comms/0139/index.md new file mode 100644 index 0000000000..aab3751152 --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0139/index.md @@ -0,0 +1 @@ +# 2023-01-06 Engineering Logs \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0139/reply_0000.md b/docs/discussions/alice_engineering_comms/0139/reply_0000.md new file mode 100644 index 0000000000..c52de687fa --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0139/reply_0000.md @@ -0,0 +1,17 @@ +## 2023-01-06 @pdxjohnny Engineering Logs + +- Fixing CI container builds and tests +- The SHA384 on tokei v10.1.1 changed... WTF? + - This usually means something is wrong with the download code (I just changed to add chmod) and move return statement... or EITM (Entity In The Middle attack)... 
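On the changed tokei checksum above: whatever the cause, the download path should fail closed when the digest drifts. A minimal stdlib sketch of that check (function name is made up):

```python
# Sketch: pin and verify a downloaded artifact's SHA384 before using it.
# If the recorded hash no longer matches, refuse to proceed rather than
# guess between broken download code and an entity-in-the-middle.
import hashlib

def sha384_matches(path: str, expected_hex: str) -> bool:
    digest = hashlib.sha384()
    with open(path, "rb") as artifact:
        # Stream in chunks so large binaries never need to fit in memory.
        for chunk in iter(lambda: artifact.read(64 * 1024), b""):
            digest.update(chunk)
    return digest.hexdigest() == expected_hex.strip().lower()
```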
+- https://proceedings.neurips.cc/paper/2017/file/3f5ee243547dee91fbd053c1c4a845aa-Paper.pdf + - Attention is All You Need + - https://paperswithcode.com/paper/attention-is-all-you-need + - GitHub search + - https://github.com/tensorflow/tensor2tensor/blob/5623deb79cfcd28f8f8c5463b58b5bd76a81fd0d/docs/walkthrough.md#walkthrough + - https://github.com/tensorflow/tensor2tensor/blob/3817e96deda6f3fdada4fedcd5efe33ed0438485/tensor2tensor/models/transformer.py#L22 +- TODO + - [ ] Listen to podcast with Katherine and Dan Lorenc + - https://twit.tv/shows/floss-weekly/episodes/712 + - [ ] https://docs.sigopt.com/core-module-api-references/get_started + - https://github.com/sigopt/sigopt-python + - This could be good to add to the backlog to make wrappers / plugins for \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0140/index.md b/docs/discussions/alice_engineering_comms/0140/index.md new file mode 100644 index 0000000000..1b29194bc5 --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0140/index.md @@ -0,0 +1 @@ +# 2023-01-07 Engineering Logs \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0140/reply_0000.md b/docs/discussions/alice_engineering_comms/0140/reply_0000.md new file mode 100644 index 0000000000..0dc5965a94 --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0140/reply_0000.md @@ -0,0 +1,29 @@ +## 2023-01-07 @pdxjohnny Engineering Logs + +- https://twitter.com/csuwildcat/status/1611392365524307977 + - Daniel Brrrrrrrrr (lol) examples of DWN hook related stuff + - https://gist.github.com/csuwildcat/79e8934b878a1ec591c4121d88f18a83 + - https://gist.github.com/csuwildcat/2ac6ebf4c581c5df143c32fa4911850e/revisions + - Woohoo he just updated this!
+ - https://gist.github.com/csuwildcat/e7b0f42d6abd1ee0b7685cf6c1f5081a +- https://www.hezmatt.org/~mpalmer/blog/2020/05/17/private-key-redaction-ur-doin-it-rong.html +- https://github.com/mpalmer/vmdksync + - Apply VM snapshots to raw block devices + - https://man7.org/linux/man-pages/man1/pv.1.html + - > monitor the progress of data through a pipe +- docs/arch/alice/discussion/0010/reply_0000.md:Phased reality consensus shift - DAGs of DAGs over time where time is relative to states of consciousness + - Vol 3: Phased consensus reality shift +- Eventually extract container builds from ensure binary serialized flows (with those overlays added) +- https://docs.oasis-open.org/sarif/sarif/v2.0/csprd01/sarif-v2.0-csprd01.html +- Graph synthesis to dockerfile with distro install methods as operations for mappings; this deployment is run_subflow of its flow given via input, which is boolean for method for dataflow as class and string for distro, so it knows how to map to apt-get, yum, etc. The reason we need this is because we need to be able to apply overlays at arbitrary levels of granularity for our reverse fuzzing capabilities. + - Melange seems to offer aligned caching, so we will target that after we + - Put them somewhere else, something like their own `dffml-operations-dep` + - Okay now we're really going down the dependency rabbit hole. + - This will be our package where we extract dependency info and rebuild. Our serializer / deserializer across development environments. Helps others get up to speed, allows us to delta across environments. + - #596 + - It's like a livepatch for a VM where you analyze the state (Alice the Overlay: Snapshot of the System Context). + - This allows us to go from + - it works on my machine -> it works in CI/CD -> it works in cloud dev + - To + - it works on my machine -> it works in CI/CD -> it works in cloud dev -> it works on your machine + - Coincidentally, this is also what allows us to "learn" deployment methods.
To build the reverse fuzzer, the thing which helps us understand which API combinations are successful and what the potential candidates for reuse are. We then use Vol 3 techniques to influence pre-ideation, detect in flight trains of thought and target for online transcription via side channel inference. \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0141/index.md b/docs/discussions/alice_engineering_comms/0141/index.md new file mode 100644 index 0000000000..38f5ef69cd --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0141/index.md @@ -0,0 +1 @@ +# 2023-01-08 Engineering Logs \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0141/reply_0000.md b/docs/discussions/alice_engineering_comms/0141/reply_0000.md new file mode 100644 index 0000000000..4fa4ead761 --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0141/reply_0000.md @@ -0,0 +1,37 @@ +## 2023-01-08 @pdxjohnny Engineering Logs + +- https://huggingface.co/blog/rlhf +- https://github.com/alexander0042/pirateweather/blob/main/docs/API.md + - DWN version of this + - [2023-01-07 @pdxjohnny Engineering Logs](https://github.com/intel/dffml/discussions/1406?sort=new#discussioncomment-4621717) + - https://gist.github.com/csuwildcat/2ac6ebf4c581c5df143c32fa4911850e/revisions + - SARIF as a forecast data blob +- https://issues.apache.org/jira/browse/GROOVY-8843 +- https://github.com/CarperAI/Algorithm-Distillation-RLHF/pull/3/files#diff-3d1a95badf0f44566edebceb970d462b38ac59025e9cb5144461c0ca1f95b0c8R115 + - This looks similar to the #1369 talk about rolling dffml.Stage + - https://honglu.fan/posts/fmlang-env/fmlang-env/ +- https://docs.oasis-open.org/sarif/sarif/v2.0/csprd01/sarif-v2.0-csprd01.html +- First pass all SBOM and VEX/VDR for comms channels, SARIF as part of VDR message body contents, ideally with VC for SCITT receipt.
Finish cve bin tool pr +- https://huggingface.co/blog/intro-graphml +- https://huggingface.co/blog/clipseg-zero-shot + - This might help with our "what software is the same" via our software DNA to image encoding methods, or just reuse the layers +- https://hachyderm.io/@kat_kime/109652239958849080 + - Many people talking about trust required + - We are trying to enable a closed loop for trust for software developers to understand their own projects, what they can trust (should you really bump that dep? Or did dynamic sandboxing results cached elsewhere in your org say that it violates policy at runtime, aka backdoored coin miners and ransomware? You wouldn't know that type of thing by bumping the dep by hand, you'd almost for sure get pwned and now your dev box got pwned). + - https://github.com/intel/dffml/tree/alice/docs/tutorials/rolling_alice/0000_architecting_alice#what-is-alice +- ``ensure_`` functions into (below) which can then be overlayed as desired + via CLI or via install of ad-hoc blank package with only entry_points.txt to + enable them as desired. + - alice_test.shouldi.contribute.bom_v0_0_0 + - We communicate via VEX/VDR threads to post "replies" to SBOMs where + SCITT receipt for VDR/VEX allows us to traverse to roots of trust. + - Decentralized async supply chains are all you need. (lock acquired) + - https://gist.github.com/csuwildcat/2ac6ebf4c581c5df143c32fa4911850e/revisions + - This is why it's important that your AI convey the way you want it to convey + - This is related to values stream mapping, which is related to VDR, which is related to the compute contract negotiation within conceptual bounds stuff. This is what forms the basis for the dynamic sandboxing, that local feedback loop on the Entity Analysis Trinity in Behavioral Analysis where we are "thinking" of more ideas to try while we're in execution mode. More data to add to the knowledge graph (same as we do with static analysis).
+ - Via data transformations between formats we are able to build a holistic picture of our software development lifecycle. These graphs can then be analyzed in relation to each other to understand where development practices differ across projects. This helps us understand which developers know best practices and can introduce them in other projects. With our AI agents, that might be what hardware is really good at for this compute contract (aka who has hardware accelerated memory tagging VM isolated FFMPEG?). + - Trust then comes into play when we look at past data in the prioritizer. + - If we see that FFMPEG has a large attack surface with a record of exploitation via VEX/VDR, we will choose to schedule on the VM memory tagging node for extra assurance that if the box gets popped during decode, we detect and discard the output. We can tie in threat model data to make that decision. This is not always happening at runtime. Most of the time it is happening via static analysis. We are just giving example situations which could, using the Open Architecture, be audited across environments due to the use of the intermediate representation allowing for interpretation of the knowledge graph. So what we're really saying is if we put items in the knowledge graph with the evolving list of properties in the Manifest ADR, and check alignment to that ADR via Alice DAC loop, then we can understand how complete our understanding of our knowledge graph is. +- Future + - [ ] Base container for shouldi off of mega-linter container to wrap and/or explore data flow integration there. + - We want to have the graph and past data (which Alice does) because this is important to helping users understand their posture over time. We could run mega-linter via similar wrapping techniques as well, but we lose on granularity that way.
\ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0142/index.md b/docs/discussions/alice_engineering_comms/0142/index.md new file mode 100644 index 0000000000..860ef6e5bc --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0142/index.md @@ -0,0 +1 @@ +# 2023-01-09 Engineering Logs \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0142/reply_0000.md b/docs/discussions/alice_engineering_comms/0142/reply_0000.md new file mode 100644 index 0000000000..857de195be --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0142/reply_0000.md @@ -0,0 +1,7 @@ +## 2023-01-09 @pdxjohnny Engineering Logs + +- https://github.com/intel/dffml/blob/alice/docs/tutorials/rolling_alice/0000_architecting_alice/0002_shes_ariving_when.md#scitt-api-emulator-spin-up + - https://github.com/in-toto/demo/blob/main/run_demo.py + - https://github.com/in-toto/attestation#custom-type-examples + - https://github.com/jenkinsci/in-toto-plugin/ + - https://slsa.dev/example \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0143/index.md b/docs/discussions/alice_engineering_comms/0143/index.md new file mode 100644 index 0000000000..eac3e9072c --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0143/index.md @@ -0,0 +1,3 @@ +# 2023-01-10 Engineering Logs + +- IETF template https://github.com/martinthomson/internet-draft-template \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0143/reply_0000.md b/docs/discussions/alice_engineering_comms/0143/reply_0000.md new file mode 100644 index 0000000000..f7c15139f8 --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0143/reply_0000.md @@ -0,0 +1,21 @@ +## 2023-01-10 @pdxjohnny Engineering Logs + +- https://every.to/superorganizers/the-end-of-organizing +- https://docs.sigstore.dev/cosign/attestation/ +- https://cuelang.org/docs/integrations/openapi/ +- 
https://goharbor.io/docs/2.5.0/working-with-projects/project-configuration/create-robot-accounts/ +- Some verifiable credential examples + - https://github.com/Azure-Samples/active-directory-verifiable-credentials-python + - https://github.com/Azure-Samples/active-directory-verifiable-credentials-node + - https://github.com/microsoft/scitt-api-emulator + - https://mailarchive.ietf.org/arch/msg/scitt/WSyUQuYimFowl6plzi_TIJzjBpM/ + - https://github.com/OR13/endor +- https://learn.microsoft.com/en-us/graph/best-practices-concept#track-changes-delta-query-and-webhook-notifications + - https://learn.microsoft.com/en-us/graph/api/resources/webhooks?view=graph-rest-1.0 +- We've been trying to piece together attested workloads and so forth; alignment to VCs and DIDs with auth done via transparency log receipts seems ideal. + - Via "federation" parties can create scoped SCITT "chains" / logs / instances. The VCs for receipts from these instances can then be used as a "you are logged in" or "you have access to XYZ". This enables peer to peer decentralized authentication and authorization, all rooted in self sovereign key infrastructure. SCITT receipt as VC (endor) for ipvm cid + - https://github.com/decentralized-identity/keri/blob/master/kids/kid0009.md + - A SARIF produced by an entity's analysis might be interpreted as a "you are logged in". + - A SARIF is just a type of manifest in this case, where the manifest's written form (eventually this will be the THREATS.md, PLANS.md: strategic plans and principles for execution of development activities over lifecycle, etc.) tells us the intent, aka interpret SARIF results for if you are logged in or not and what you have access to.
+ - https://github.com/intel/dffml/blob/alice/docs/tutorials/rolling_alice/0000_architecting_alice/0002_shes_ariving_when.md#scitt-api-emulator-spin-up + - https://scitt.io/scenarios/extending-existing-services.html \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0144/index.md b/docs/discussions/alice_engineering_comms/0144/index.md new file mode 100644 index 0000000000..4e476b0da9 --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0144/index.md @@ -0,0 +1,4 @@ +# 2023-01-11 Engineering Logs + +- https://github.com/w3c-ccg/traceability-interop + - > **TODO** Verifiable Credentials for Supply Chain Interoperability Specification for HTTP \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0144/reply_0000.md b/docs/discussions/alice_engineering_comms/0144/reply_0000.md new file mode 100644 index 0000000000..da9eab2335 --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0144/reply_0000.md @@ -0,0 +1,16 @@ +- https://github.com/intel/dffml/actions/runs/3898320599/jobs/6656941424 + - Got a clean container build, next step downstream +- https://github.com/transmute-industries/did-transmute +- https://identity.foundation/jwt-vc-presentation-profile/ +- Stream of Consciousness is/as DWN (+hooks) +- https://github.com/GoogleContainerTools/kaniko/blob/df7de4d9a2763068eab0151a2518142b972cfa61/.github/workflows/images.yaml +- https://singularityhub.com/2022/12/13/deepminds-alphacode-conquers-coding-performing-as-well-as-humans/ + - Yup +- https://deepai.org/publication/a-tutorial-on-the-interoperability-of-self-sovereign-identities + - We care about this for attested compute, inside TCB metric scans of OSS repos + - > Self-sovereign identity is the latest digital identity paradigm that allows users, organizations, and things to manage identity in a decentralized fashion without any central authority controlling the process of issuing identities and verifying assertions. 
Following this paradigm, implementations have emerged in recent years, with some having different underlying technologies. These technological differences often create interoperability problems between software that interact with each other from different implementations. Although a common problem, there is no common understanding of self-sovereign identity interoperability. In the context of this tutorial, we create a definition of interoperability of self-sovereign identities to enable a common understanding. Moreover, due to the decentralized nature, interoperability of self-sovereign identities depends on multiple components, such as ones responsible for establishing trust or enabling secure communication between entities without centralized authorities. To understand those components and their dependencies, we also present a reference model that maps the required components and considerations that build up a self-sovereign identity implementation. The reference model helps address the question of how to achieve interoperability between different implementations. 
+- https://www.google.com/search?q=site%3Adeepai.org+inpath%3Apublication +- https://www.techrxiv.org/articles/preprint/A_Tutorial_on_the_Interoperability_of_Self-sovereign_Identities/20430825/1/files/36554574.pdf + - > ![image](https://user-images.githubusercontent.com/5950433/211946150-c42a49bd-451e-4155-8e43-59e17fb6ae54.png) + > ![image](https://user-images.githubusercontent.com/5950433/211946197-18a4089b-a8ef-4c73-91d6-435c6ecfc9f3.png) + > ![image](https://user-images.githubusercontent.com/5950433/211946385-21e5da8f-2644-4d6e-a9fd-baeec40ae3e9.png) \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0144/reply_0001.md b/docs/discussions/alice_engineering_comms/0144/reply_0001.md new file mode 100644 index 0000000000..3f8b179b5e --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0144/reply_0001.md @@ -0,0 +1 @@ +- https://github.com/w3c-ccg/traceability-interop \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0145/index.md b/docs/discussions/alice_engineering_comms/0145/index.md new file mode 100644 index 0000000000..e5fb32b177 --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0145/index.md @@ -0,0 +1 @@ +# 2023-01-12 Engineering Logs \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0145/reply_0000.md b/docs/discussions/alice_engineering_comms/0145/reply_0000.md new file mode 100644 index 0000000000..87a7210ae1 --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0145/reply_0000.md @@ -0,0 +1,75 @@ +- https://twitter.com/hausman_k/status/1613544873050931200 + - good summary of chain of thought work in 2022 +- https://twitter.com/SergioRocks/status/1613554012627820544 + - AI assisted dev recommendations +- Lucidity oh lucidity + - https://danijar.com/project/dreamerv3/ +- An Image, stenography on non-re-encoded + - https://mastodon.social/@bbbbbr@mastodon.gamedev.place/109672633992508412 +- 
https://github.com/google/balloon-learning-environment + - > https://mobile.twitter.com/danijarh: Replying to [@pcastr](https://mobile.twitter.com/pcastr) Hi Pablo, thanks! Not specific to pixels at all, it supports images, vectors, and combinations of them as input. For example, DreamerV3 outperforms DDPG, SAC, D4PG, MPO, DMPO on continuous control from states. + - https://twitter.com/danijarh/status/1613503430135365632 + +```console +$ dffml service dev export alice.shouldi.contribute.cicd:cicd_library.op +``` + +```json +{ + "inputs": { + "cicd_action_library": { + "links": [ + [ + [ + "name", + "bool" + ], + [ + "primitive", + "bool" + ] + ] + ], + "name": "IsCICDGitHubActionsLibrary", + "primitive": "bool" + }, + "cicd_jenkins_library": { + "links": [ + [ + [ + "name", + "bool" + ], + [ + "primitive", + "bool" + ] + ] + ], + "name": "IsCICDJenkinsLibrary", + "primitive": "bool" + } + }, + "name": "alice.shouldi.contribute.cicd:cicd_library", + "outputs": { + "result": { + "links": [ + [ + [ + "name", + "dict" + ], + [ + "primitive", + "map" + ] + ] + ], + "name": "CICDLibrary", + "primitive": "dict" + } + }, + "retry": 0, + "stage": "output" +} +``` \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0146/index.md b/docs/discussions/alice_engineering_comms/0146/index.md new file mode 100644 index 0000000000..e40290e433 --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0146/index.md @@ -0,0 +1,12 @@ +# 2023-01-13 Engineering Logs + +- https://w3c-ccg.github.io/traceability-interop/openapi/ + - https://github.com/intel/dffml/pull/1273/files + - https://github.com/intel/dffml/blob/alice/docs/arch/0008-Manifest.md + - https://github.com/intel/dffml/blob/alice/schema/ +- https://mtngs.io/dffml/weekly-sync/_av3pS8DT04.html#s430639 + - Remembered that these transcripts exist for training Q&A models +- [Weekly Sync: 2022-04-15: Didn't know it yet but OA DID resolver](https://www.youtube.com/watch?v=_av3pS8DT04&t=6232s) +- 
[Weekly Sync: 2022-04-15: How we add layers to the software stack](https://youtu.be/_av3pS8DT04?t=458) + - Manifestation board indeed... I just realized the date, guess what the next day was? + - https://github.com/intel/dffml/blob/alice/docs/tutorials/rolling_alice/0000_preface.md#references \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0146/reply_0000.md b/docs/discussions/alice_engineering_comms/0146/reply_0000.md new file mode 100644 index 0000000000..9d1f4ccd87 --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0146/reply_0000.md @@ -0,0 +1,29 @@ +- https://circleci.com/blog/jan-4-2023-incident-report/ + - Annnnnnnnd this is why ephemeral attested compute for CI/CD and dev envs (on top of chromebook style hardened clients) +- https://github.com/quartzjer/did-jwk/blob/main/spec.md +- merkle trees + - https://github.com/transmute-industries/merkle-proof/blob/main/test/alignment.test.ts + - https://github.com/digitalbazaar/pyld + - https://medium.com/transmute-techtalk/briefcase-a-fun-way-to-share-small-fragments-of-structured-data-using-decentralized-identifiers-c13eea74550c + - https://www.rfc-editor.org/rfc/rfc7516 + - https://github.com/confidential-containers/attestation-agent + - We want the CC to come up and attest to whatever via VC ideally cross verified by places it sends the VC with a SCITT log + - This enables hardware root of trust SSI Eden nodes to truly peer to peer auth + - This is helpful for dev pipeline use cases (ref: android key signing) and other "offline" + aka separate roots of trust or ephemeral roots of trust (testing) use cases, which are + EVERYWHERE with CI/CD. If we do this right then it'll be "out of the box" easy for any + software project to spin up secure dev/test/prod PKI and associated transparency logs for + SBOM, VEX, VDR, etc.
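+- Verifying a transparency log receipt ultimately bottoms out in a Merkle inclusion proof against the log's root hash. A minimal stdlib sketch of that verification step (the SHA-256 choice and the 0x00/0x01 leaf/node prefixes follow the RFC 6962 style; this is not the SCITT wire format, just the core idea):

```python
import hashlib


def leaf_hash(entry: bytes) -> bytes:
    # 0x00 prefix domain-separates leaves from interior nodes
    return hashlib.sha256(b"\x00" + entry).digest()


def node_hash(left: bytes, right: bytes) -> bytes:
    # 0x01 prefix marks interior nodes
    return hashlib.sha256(b"\x01" + left + right).digest()


def verify_inclusion(entry: bytes, proof: list, root: bytes) -> bool:
    """Walk sibling hashes from the leaf up to the root.

    proof is a list of (sibling_digest, sibling_is_on_the_right) pairs,
    one per tree level, as a log would hand back with a receipt.
    """
    current = leaf_hash(entry)
    for sibling, is_right in proof:
        if is_right:
            current = node_hash(current, sibling)
        else:
            current = node_hash(sibling, current)
    return current == root
```

+ - The point for the PKI story above: a verifier only needs the signed root and a short proof, not the whole log, which is what makes receipts cheap enough to attach to every SBOM/VEX/VDR message.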
+- https://oras.land/blog/oras-looking-back-at-2022-and-forward-to-2023/ +- DWN and VC status update: https://twitter.com/i/spaces/1mrGmkbnWQkxy +- https://blog.humphd.org/pouring-language-through-shape/ +- https://openid.net/specs/openid-4-verifiable-presentations-1_0.html +- https://datatracker.ietf.org/doc/html/draft-ietf-oauth-dpop +- Alignment to common authentication and authorization patterns helps us communicate + - https://w3c-ccg.github.io/vp-request-spec/#peer-to-peer +- https://github.com/deepmind/tracr#how-tracr-works-conceptually +- Alice should close issues and PRs if recommended community standards files are now present +- Vulnerability Disclosure Program (VDP) + - How could Alice help our projects have a machine readable or machine parsable VDP to direct to SCITT, VEX, VDR, SBOM locations +- Container image build files (melange, Dockerfile, PKGBUILD, etc.) -> extract build args -> manifest + - Tag commits for git clones \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0147/index.md b/docs/discussions/alice_engineering_comms/0147/index.md new file mode 100644 index 0000000000..9f9e3db2e4 --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0147/index.md @@ -0,0 +1 @@ +# 2023-01-14 Engineering Logs \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0147/reply_0000.md b/docs/discussions/alice_engineering_comms/0147/reply_0000.md new file mode 100644 index 0000000000..99232ab497 --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0147/reply_0000.md @@ -0,0 +1,21 @@ +- https://hacks.mozilla.org/2018/11/decentralizing-social-interactions-with-activitypub/ +- https://github.com/web3-storage/ucanto/ +- https://github.com/dariusk/express-activitypub + - plus our sbom and vex and vdr and scitt + - Later dwn with keri + - https://github.com/immers-space/activitypub-express + - A fork with mongodb support +- 
https://docs.datadoghq.com/agent/faq/circleci-incident-impact-on-datadog-agent/ + - > Dan Lorenc: It shows the impact of a leaked signing key, how hard it can be to recover, and really makes a compelling case for ephemeral credentials. +- https://github.com/transmute-industries/did-transmute + - > Orie Steele: My favorite part of the DID Spec is that it invites you to project existing crypto or public key spaces into its identifier format for the purpose of graph analysis. This projects explores projecting JWK, JWT, JWS, JWE and OIDC representations into a DID space. +- https://jessicawildfire.substack.com/p/youre-not-a-fearmonger-you-have-sentinel + - Sifting truths via reviews and attested models validating predictions across trains of thought + - “Unfortunately, nobody can learn from their mistakes unless they admit them.” + - VEX, VDR + - “At this point, our survival depends on our ability to overcome these psychological hangups. As a group, we have to resist the dopamine hit that comes from dismissing warnings and minimizing threats. We also have to get much better at admitting when we’re wrong, and fixing our mistakes. […] slightest suggestion of a threat sets off a cascade of denial and wishful thinking.“ + - ref: redpill + - “psychologists have found that most people don’t do a great job of distinguishing bad news from the one delivering it” + - RZA: truth in the message, don't pay attention to the messenger (very rough paraphrasing) + - “ We can save ourselves a lot of anguish by anticipating reactance. Calling attention to someone’s biases might prompt them to reflect a little.
It’s worth a shot.” + - This is what our background models will do with vuln severity (later issue priority across projects) and values stream mapping \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0148/index.md b/docs/discussions/alice_engineering_comms/0148/index.md new file mode 100644 index 0000000000..035d1681d1 --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0148/index.md @@ -0,0 +1 @@ +# 2023-01-15 Engineering Logs \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0148/reply_0000.md b/docs/discussions/alice_engineering_comms/0148/reply_0000.md new file mode 100644 index 0000000000..d5fa908a57 --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0148/reply_0000.md @@ -0,0 +1,2 @@ +- **ACTIVITYPUB INPUTS AS POSTS + SCITT** + - Daniel said something about a desktop daemon for DWN connection, which means they are not looking at webrtc off the bat \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0149/index.md b/docs/discussions/alice_engineering_comms/0149/index.md new file mode 100644 index 0000000000..2bc9f21a07 --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0149/index.md @@ -0,0 +1 @@ +# 2023-01-16 Engineering Logs \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0149/reply_0000.md b/docs/discussions/alice_engineering_comms/0149/reply_0000.md new file mode 100644 index 0000000000..69d9474e8b --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0149/reply_0000.md @@ -0,0 +1,3 @@ +- https://stevengharms.com/posts/2023-01-02-optimal-mastodon-tools/ + - To view comms + - Can render images or content to images i.e. 
mermaid and render \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0150/index.md b/docs/discussions/alice_engineering_comms/0150/index.md new file mode 100644 index 0000000000..3eed7ffcd1 --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0150/index.md @@ -0,0 +1,4 @@ +# 2023-01-17 Engineering Logs + +- https://github.com/readme/featured/defining-gitops + - > GitHub’s Octoverse 2022 identified infrastructure as code (IaC)—which alongside platform engineering and continuous integration and continuous delivery (CI/CD) form the foundation for GitOps—as one of the three big trends to watch for in the year ahead. \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0150/reply_0000.md b/docs/discussions/alice_engineering_comms/0150/reply_0000.md new file mode 100644 index 0000000000..f69b3f9402 --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0150/reply_0000.md @@ -0,0 +1,49 @@ +## 2023-01-17 @pdxjohnny Engineering Logs + +- https://fediverse.party/en/miscellaneous/ + - https://docs.microblog.pub/user_guide.html + - https://git.sr.ht/~tsileo/microblog.pub/tree/v2/item/app/database.py + - ActivityPub with pull channel for downstream validation + - Periodic launching of workflows which federate via localhost.run or similar + - https://semapps.org/docs/guides/activitypub + - https://semapps.org/docs/middleware/activitypub + - https://jena.apache.org/documentation/fuseki2/fuseki-main.html + - https://jena.apache.org/documentation/fuseki2/fuseki-data-access-control.html + - https://jena.apache.org/download/maven.html + - https://repository.apache.org/content/repositories/snapshots/org/apache/jena/jena-fuseki-server/ + - keybase style VC post proof + - To handoff as comms spin up / down + - https://github.com/forgeflux-org + - Similar in theory + - > API-space software forge federation +- https://github.com/w3c-ccg/traceability-interop/commit/d863afd085491d4c21268c4bf1571da02e468d31 + - 
https://w3id.org/traceability/v1 +- https://w3c-ccg.github.io/traceability-interop/draft/ + - > As this specification deals with the implementation of software that relates directly to the traceability of physical real world objects in the supply chain, implementations of software conformant with this specification should be treated as [Critical Software ](https://www.nist.gov/itl/executive-order-improving-nations-cybersecurity/critical-software-definition)and as such SHOULD follow all guidelines related to the protection of [Software Supply Chains ](https://www.nist.gov/itl/executive-order-improving-nations-cybersecurity/software-supply-chain-security). Solutions implementing this specification SHOULD seek conformance with NIST [800-161 Rev. 1 ](https://csrc.nist.gov/publications/detail/sp/800-161/rev-1/draft)or superceding documents. Solutions implementing this specification SHOULD seek conformance with NIST [800-218 ](https://csrc.nist.gov/publications/detail/sp/800-218/draft)or superceding documents. The [Guidelines on Minimum Standards for Developer Verification of Software - NISTIR 8397 ](https://nvlpubs.nist.gov/nistpubs/ir/2021/NIST.IR.8397.pdf)MUST be followed by developers implementing solutions intended to be conformant with this specification. NB: this guidance applies to sections beyond Software Supply Chain issues, and many of the topics covered have discrete sections in this specification or supplemental aids such as the [test suite](https://github.com/w3c-ccg/traceability-interop/tree/main/tests/postman) provided in the repository for this specification. 
+ - > Any system conforming with this specification for interoperability MUST utilize [Linked Data Signatures for JWS ](https://github.com/w3c-ccg/lds-jws2020/) **or superceding version if it is standardized as a part of the VC Working Group for signing Linked Data in usage with Verifiable Credentials.** +- https://www.nsa.gov/portals/75/documents/what-we-do/cybersecurity/professional-resources/ctr-nsa-css-technical-cyber-threat-framework.pdf +- Another description: Copy exact across heterogeneous environments via cattle to pets equilibrium mapping (values streams) + +```json +{ + "SoftwareBillOfMaterials": { + "@context": { + }, + "@id": "https://w3id.org/traceability#SoftwareBillOfMaterials" + }, + "SoftwareBillofMaterialsCredential": { + "@context": { + }, + "@id": "https://w3id.org/traceability#SoftwareBillOfMaterialsCredential" + } +} +``` + +- Example overlay of running actions validator + - https://github.com/intel/dffml/blob/12e862924a85c4ec36499c869406d411bb07c9fb/operations/innersource/dffml_operations_innersource/actions_validator.py#L56-L76 +- Example of enabling that for `alice shouldi contribute` + - https://github.com/intel/dffml/blob/12e862924a85c4ec36499c869406d411bb07c9fb/entities/alice/entry_points.txt#L29 +- Example of ensuring binary available for testing + - https://github.com/intel/dffml/blob/12e862924a85c4ec36499c869406d411bb07c9fb/entities/alice/alice_test/shouldi/contribute/actions_validator.py#L62-L83 + - https://github.com/intel/dffml/blob/12e862924a85c4ec36499c869406d411bb07c9fb/entities/alice/entry_points.txt#L35 + - **TODO** Command to enable overlays by creating blank package and installing \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0151/index.md b/docs/discussions/alice_engineering_comms/0151/index.md new file mode 100644 index 0000000000..4a5a53cf4e --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0151/index.md @@ -0,0 +1 @@ +# 2023-01-18 Engineering Logs \ No newline at end of 
file diff --git a/docs/discussions/alice_engineering_comms/0151/reply_0000.md b/docs/discussions/alice_engineering_comms/0151/reply_0000.md new file mode 100644 index 0000000000..b94e9a335d --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0151/reply_0000.md @@ -0,0 +1,180 @@ +## 2023-01-18 @pdxjohnny Engineering Logs + +- 2nd party draft + - Registry build webhook event triggers generating an SBOM (payload / OA / Input to start in description) which says there is a new version. + - NVDStyle v2 API serves SBOM + - Tight poll to start, webpubsub or DWN or ActivityPub later + - Build results issued by downstream as VEX with description as Input where value is SARIF + - Upstreams might "follow back" by polling downstream NVDStyles + - Metric collection as container layer via `--build-arg` for URL, then golang style multi stage build where results are put in `FROM scratch` + - We can use manifest style documentation to describe what filepaths are relevant (maybe within schema defaults) + - Later https://github.com/opencontainers/distribution-spec/blob/main/spec.md + - https://github.com/aquasecurity/trivy + - Did CVE Bin Tool get integrated here? Can it produce VEX? 
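+- The tight poll described above could be sketched roughly as below. The feed shape (`sboms` list keyed by `digest`) and the injected `fetch` callable are assumptions for illustration, not an NVD API contract; swapping in webpubsub/DWN/ActivityPub later just replaces this loop with a push subscription.

```python
import json
import time
from typing import Callable, Iterator


def poll_sbom_feed(
    fetch: Callable[[str], str],
    url: str,
    interval_seconds: float = 5.0,
) -> Iterator[dict]:
    """Tight poll an NVD-style endpoint, yielding each SBOM entry once.

    fetch is injected (e.g. urllib-based in production, a stub in tests)
    so downstream build triggering stays testable without a network.
    """
    seen = set()
    while True:
        feed = json.loads(fetch(url))
        for entry in feed.get("sboms", []):
            # Key on whatever uniquely identifies the build; "digest"
            # is an assumed field name here
            key = entry.get("digest")
            if key and key not in seen:
                seen.add(key)
                yield entry
        time.sleep(interval_seconds)
```

+ - Each yielded entry is where the downstream would kick off its build and post the resulting VEX (SARIF in the description) back to the upstream.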
+ +**schema/image/container/build/dffml.json** + +```json +{ + "$schema": "https://github.com/intel/dffml/raw/main/schema/image/container/build/0.0.1.schema.json", + "include": [ + { + "branch": "main", + "commit": "ddb32a4e65b0d79c7561ce2bdde16d963c8abde1", + "dockerfile": "Dockerfile", + "image_name": "dffml", + "owner": "intel", + "repository": "dffml" + } + ] +} +``` + +```console +$ python -c 'import pathlib, json, sys; print(json.dumps({"manifest": json.dumps(json.loads(sys.stdin.read().strip())["include"])}))' < schema/image/container/build/dffml.json | gh -R intel/dffml workflow run dispatch_build_images_containers.yml --ref main --json +``` + +- DFFML (upstream) files of interest + +``` +entities/alice/alice/please/contribute/recommended_community_standards/cli.py +entities/alice/alice/please/contribute/recommended_community_standards/code_of_conduct.py +entities/alice/alice/please/contribute/recommended_community_standards/contributing.py +entities/alice/alice/please/contribute/recommended_community_standards/meta_issue.py +entities/alice/alice/please/contribute/recommended_community_standards/readme.py +entities/alice/alice/please/contribute/recommended_community_standards/recommended_community_standards.py +entities/alice/alice/please/contribute/util/gh.py +entities/alice/alice/please/log/todos/output_urls.py +entities/alice/alice/please/log/todos/todos.py +entities/alice/alice/shouldi/contribute/cicd.py +``` + +- Creating an overlay to record issue URLs + +```console +$ grep IssueURL entities/alice/alice/please/log/todos/todos.py + SupportIssueURL = NewType("SupportIssueURL", str) + "issue_url": SupportIssueURL, + ) -> SupportIssueURL: + CodeOfConductIssueURL = NewType("CodeOfConductIssueURL", str) + "issue_url": CodeOfConductIssueURL, + ) -> CodeOfConductIssueURL: + ContributingIssueURL = NewType("ContributingIssueURL", str) + "issue_url": ContributingIssueURL, + ) -> ContributingIssueURL: + SecurityIssueURL = NewType("SecurityIssueURL", str) + 
"issue_url": SecurityIssueURL, + ) -> SecurityIssueURL: + ReadmeIssueURL = NewType("ReadmeIssueURL", str) + "issue_url": ReadmeIssueURL, + ) -> ReadmeIssueURL: +``` + +- Write and operation and enable the overlay + +```patch +diff --git a/entities/alice/alice/please/log/todos/output_urls.py b/entities/alice/alice/please/log/todos/output_urls.py +new file mode 100644 +index 000000000..d41d76a96 +--- /dev/null ++++ b/entities/alice/alice/please/log/todos/output_urls.py +@@ -0,0 +1,26 @@ ++import dffml ++from typing import NewType ++ ++from .todos import AlicePleaseLogTodosDataFlowRecommendedCommnuityStandardsGitHubIssues ++ ++ ++CreatedIssuesURLs = NewType("CreatedIssuesURLs", dict) ++ ++ ++@dffml.op( ++ stage=dffml.Stage.OUTPUT, ++) ++def grab_created_urls( ++ support: AlicePleaseLogTodosDataFlowRecommendedCommnuityStandardsGitHubIssues.SupportIssueURL, ++ code_of_conduct: AlicePleaseLogTodosDataFlowRecommendedCommnuityStandardsGitHubIssues.CodeOfConductIssueURL, ++ contributing: AlicePleaseLogTodosDataFlowRecommendedCommnuityStandardsGitHubIssues.ContributingIssueURL, ++ security: AlicePleaseLogTodosDataFlowRecommendedCommnuityStandardsGitHubIssues.SecurityIssueURL, ++ readme: AlicePleaseLogTodosDataFlowRecommendedCommnuityStandardsGitHubIssues.ReadmeIssueURL, ++) -> CreatedIssuesURLs: ++ return { ++ "support": support, ++ "code_of_conduct": code_of_conduct, ++ "contributing": contributing, ++ "security": security, ++ "readme": readme, ++ } +diff --git a/entities/alice/entry_points.txt b/entities/alice/entry_points.txt +index 6719e138f..f31c670d3 100644 +--- a/entities/alice/entry_points.txt ++++ b/entities/alice/entry_points.txt +@@ -38,3 +38,4 @@ OverlayEnsureActionsValidator = alice_test.shouldi.contribute.a + OverlayCLI = alice.please.log.todos.todos:OverlayCLI + OverlayRecommendedCommunityStandards = alice.please.log.todos.todos:AlicePleaseLogTodosDataFlowRecommendedCommnuityStandardsGitHubIssues + GitHubRepoID = 
dffml_operations_innersource.cli:github_repo_id_to_clone_url ++OverlayOutputCreatedIssues = alice.please.log.todos.output_urls:grab_created_urls +``` + +- **TODO** Untangle copy pasta of subflow execution, it assumes no outputs, maybe use the output collection code from system context +- **TODO** Align `-repos` to `-keys` for exec of `alice please` commands +- https://github.com/dariusk/express-activitypub#api + - https://www.w3.org/TR/activitypub/ + - https://github.com/immers-space/activitypub-express#next-steps-and-examples + - > Server-to-server apps: For an app that people interact with by sending messages from another app (e.g. Mastodon), you'll want to define custom side-effects using app.on('apex-inbox', ({ actor, activity, recipient, object }) => {...}), which is fired for each incoming message. + +```bash +git clone https://github.com/dariusk/express-activitypub +cd express-activitypub +npm install +dffml service http createtls server -log debug +cat > config.json <<'EOF' +{ + "USER": "alice", + "PASS": "maryisgod", + "DOMAIN": "localhost", + "PORT": "3000", + "PRIVKEY_PATH": "server.crt", + "CERT_PATH": "server.pem" +} +EOF +node index.js +``` + +- Create account [:pill:](https://pdxjohnny.github.io/redpill/) + +```console +$ curl --noproxy 127.0.0.1 -w '\n' -u alice:maryisgod -d "account=alice" -H "Content-Type: application/x-www-form-urlencoded" -X POST http://127.0.0.1:3000/api/admin/create +``` + +- Successful account create response + +```json +{"msg":"ok","apikey":"3feda0b9f6a26b0eb93135c6455833d8"} +``` + +- Check if account exists + +```console +$ curl -w '\n' -v --noproxy 127.0.0.1 'http://127.0.0.1:3000/.well-known/webfinger?resource=acct:alice@localhost' +``` + +- Account exists response + +```json +{"subject":"acct:alice@localhost","links":[{"rel":"self","type":"application/activity+json","href":"https://localhost/u/alice"}]} +``` + +```console +$ curl -w '\n' --noproxy 127.0.0.1 -d 'acct=alice' -d "apikey=8b6619996b83f016ccb71db7c5f7a583" -d 
'message=HelloWorld' 'http://127.0.0.1:3000/api/sendMessage' +{"msg":"No followers for account alice@localhost"} +``` + +- https://github.com/immers-space/activitypub-express#usage + - https://github.com/firebase/firebase-tools/issues/4595#issuecomment-1142325657 + - Need to upgrade nodejs to > 16 + +- TODO + - [ ] Post manifest -> GitHub Actions workflow dispatch + - This will be our base for alignment on communications for downstream validation, we will later move to DIDs and VCs + - [ ] Webhook (container image registries) to ActivityPub proxy \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0152/index.md b/docs/discussions/alice_engineering_comms/0152/index.md new file mode 100644 index 0000000000..90ba3db017 --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0152/index.md @@ -0,0 +1 @@ +# 2023-01-19 Engineering Logs \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0152/reply_0000.md b/docs/discussions/alice_engineering_comms/0152/reply_0000.md new file mode 100644 index 0000000000..3daecf74b1 --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0152/reply_0000.md @@ -0,0 +1,155 @@ +## 2023-01-19 @pdxjohnny Engineering Logs + +- Ask terri if cve-bin-tool got integrated into [Trivy](https://github.com/aquasecurity/trivy) +- Soon we'll be able to talk to Alice like a ⁠[rubber duck](https://en.wikipedia.org/wiki/Rubber_duck_debugging) @mepsheehan + - https://github.com/enhuiz/vall-e + - https://github.com/intel/dffml/blob/alice/docs/tutorials/rolling_alice/0000_architecting_alice/0004_writing_the_wave.md +- https://github.com/facebookresearch/esm#quickstart + - For our software DNA +- https://learn.microsoft.com/en-us/windows/wsl/wsl-config#systemd-support +- https://github.com/intel/dffml/commit/73f13854a637a505a4dde3b82a0399192a8563cd +- Need a way to trigger downstream on container pushed + - 
https://docs.github.com/en/rest/repos/repos?apiVersion=2022-11-28#create-a-repository-dispatch-event + - https://docs.github.com/en/actions/using-workflows/events-that-trigger-workflows#repository_dispatch + - kontain.me style registry but supporting push as a proxy for upload elsewhere, use OA to implement dynamic sandboxed hooks to upload to other endpoints, possibly attested compute to enable client secrets #1247 + - https://github.com/imjasonh/kontain.me/tree/main/cmd/buildpack +- https://gist.github.com/pdxjohnny/a0dc3a58b4651dc3761bee65a198a80d#file-run-vm-sh-L148-L167 +- Realized multi-stage builds allow for removal of `ARG` from published layers + - Docker builds provide native cross platform caching for CI jobs, they just need downstream trigger support +- https://github.com/TBD54566975/dwn-sdk-js/blob/58656ed4f881b8a0e746cd34650174a267f605d7/tests/utils/test-data-generator.ts#L307-L330 + - Ideally this would facilitate the downstream validation on registry webhook upload (or other) event +- https://fosstodon.org/@kernellogger/109717087722762476 + - https://lore.kernel.org/all/Y8lSYBU9q5fjs7jS@T590/ + - https://gist.github.com/pdxjohnny/07b8c7b4a9e05579921aa3cc8aed4866#file-rolling_alice_progress_report_0011_september_activities_recap-md + - Now we can run everything in gvisor, kaniko, and image builds, and build VMs via usermode ndb for the loopback (or did we end up going with packer?) +- Harbor has webhooks and OIDC auth support + - > OIDC support: Harbor leverages OpenID Connect (OIDC) to verify the identity of users authenticated by an external authorization server or identity provider. Single sign-on can be enabled to log into the Harbor portal. 
+ - Digital Ocean does not have webhook notifications on image upload events at time of writing (or any webhook config for registries) + - https://github.com/dexidp/dex supported + - https://github.com/aquasecurity/trivy supported + +![image](https://user-images.githubusercontent.com/5950433/213610588-1f0e5edf-53bc-4c3d-9655-509c5eb8198c.png) + +**Dockerfile** + +```dockerfile +FROM docker.io/intel-otc/dffml as builder + +ARG GH_ACCESS_TOKEN +ARG GH_USER +ARG ORG=intel +ARG REPO_NAME=dffml + +# Configure auth +RUN mkdir -p ~/.config/gh/ \ + && echo "github.com:" > ~/.config/gh/hosts.yml \ + && echo " oauth_token: ${GH_ACCESS_TOKEN}" >> ~/.config/gh/hosts.yml \ + && echo " user: ${GH_USER}" >> ~/.config/gh/hosts.yml \ + && echo " git_protocol: https" >> ~/.config/gh/hosts.yml \ + && gh auth setup-git + +# Change to location of cached tools directory +WORKDIR /src/dffml/entities/alice + +# Run scan +# Remove secrets from output via sed and stream output to tee to write to file +# - GH_ACCESS_TOKEN +RUN export REPO_URL="https://github.com/${ORG}/${REPO_NAME}" \ + && python -m alice shouldi contribute -log debug -keys "${REPO_URL}" \ + && export ORIGINAL_JSON_SOURCE_OUTPUT=".tools/open-architecture/innersource/repos.json" \ + && mkdir -p output \ + && cat "${ORIGINAL_JSON_SOURCE_OUTPUT}" \ + | python -m json.tool \ + | sed \ + -e "s/${GH_ACCESS_TOKEN}@//g" \ + -e "s/${GH_ACCESS_TOKEN}/\$GH_ACCESS_TOKEN/g" \ + | tee output/result.json \ + | python -c 'import yaml, json, sys; print(yaml.dump(json.load(sys.stdin)))' \ + | tee output/result.yaml + +FROM scratch + +COPY --from=builder /src/dffml/entities/alice/output / +``` + +```console +$ export REGISTRY=docker.io +$ export IMAGE=scan-non-existent +$ export GH_ACCESS_TOKEN=$(grep oauth_token < ~/.config/gh/hosts.yml | sed -e 's/ oauth_token: //g') +$ tar cz Dockerfile | docker build --build-arg=GH_ACCESS_TOKEN --build-arg=ORG=intel --build-arg=REPO_NAME=non-existent -f Dockerfile -t "${REGISTRY}/${IMAGE}" - +$ docker push 
"${REGISTRY}/${IMAGE}" +$ reg manifest -u "${REG_USERNAME}" -p "${REG_PASSWORD}" "${REGISTRY}/${IMAGE}" +``` + +```json +{ + "schemaVersion": 2, + "mediaType": "application/vnd.docker.distribution.manifest.v2+json", + "config": { + "mediaType": "application/vnd.docker.container.image.v1+json", + "size": 234, + "digest": "sha256:0019f2f429283f393e6280210b81f6763df429fd50bb25805f6c60bc09013cf5" + }, + "layers": [ + { + "mediaType": "application/vnd.docker.image.rootfs.diff.tar.gzip", + "size": 512, + "digest": "sha256:f4215bb8acc2c4822edb2ae9c748c2e855d4e4c8ff3ce972867bef1da3c122c5" + } + ] +} +``` + +```console +$ DIGEST=$(reg manifest -u "${REG_USERNAME}" -p "${REG_PASSWORD}" "${REGISTRY}/${IMAGE} \ + | grep digest \ + | head -n 2 \ + | tail -n 1 \ + | sed -e 's/.*sha/sha/' -e 's/"//g') +$ reg layer -u "${REG_USERNAME}" -p "${REG_PASSWORD}" "${REGISTRY}/${IMAGE}@${DIGEST}" | tar xzv +tar: Removing leading `/' from member names +/ +schema.json +result.json +result.yaml +$ cat result.yml +$ reg layer -u "${REG_USERNAME}" -p "${REG_PASSWORD}" "${REGISTRY}/${IMAGE}@${DIGEST}" | tar xzO result.yaml +``` + +```yaml +untagged: + https://github.com/intel/non-existent: + extra: {} + features: + ActionsValidatorBinary: [] + CodeNarcServerProc: [] + JavaBinary: [] + NPMGroovyLintCMD: [] + URL: [] + date: + - 2023-01-19 11:00 + date_pair: + - - 2023-01-19 11:00 + - 2022-10-19 11:00 + quarter: [] + quarter_start_date: [] + str: [] + valid_git_repository_URL: [] + key: https://github.com/intel/non-existent + last_updated: '2023-01-19T11:00:42Z +``` + +- Base32 SSH key + +```console +$ tempdir=$(mktemp -d); ssh-keygen -b 4096 -f "${tempdir}/html_scp_deploy_key" -P "" \ + && python -c 'import sys, base64; print(base64.b32encode(sys.stdin.read().encode()).decode())' < $tempdir/html_scp_deploy_key \ + | python -c 'import sys, base64; print(base64.b32decode(sys.stdin.read().strip().encode()).decode(), end="")' \ + | tee $tempdir/out \ + && chmod 600 $tempdir/out \ + && ssh-keygen -y 
-f $tempdir/out +``` + +- TODO + - [ ] SLSA3 via sigstore examples + - [ ] Use `/manifest.json|*` to embed manifest used to build container (stripped appropriately) into results scratch, this way downstream "validation" (conversion to correct storage location, perhaps transformation into proper ORSA.land) can decide how it should handle the contents + - Use #1273 \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0153/index.md b/docs/discussions/alice_engineering_comms/0153/index.md new file mode 100644 index 0000000000..58dc2d0b26 --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0153/index.md @@ -0,0 +1,3 @@ +# 2023-01-20 Engineering Logs + +- https://github.com/cncf/tag-security/blob/main/supply-chain-security/secure-software-factory/secure-software-factory.md \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0153/reply_0000.md b/docs/discussions/alice_engineering_comms/0153/reply_0000.md new file mode 100644 index 0000000000..d7df57aca3 --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0153/reply_0000.md @@ -0,0 +1,51 @@ +## 2023-01-20 @pdxjohnny Engineering Logs + +- https://github.com/stanford-futuredata/noscope + - Context aware inference on video + - See if we can encode our software DNA to be inputs to this +- https://github.com/ggerganov/whisper.cpp/tree/master/examples/talk.wasm +- https://twitter.com/tgamblin/status/1616605245181939712 + - https://reuse.software/faq/#bill-of-materials + - https://www.ntia.gov/files/ntia/publications/ntia_sbom_sharing_exchanging_sboms-10feb2021.pdf + - Mentions pubsub (DWN, or our hacky first stab via polling NVDStyle) + - We should really go with the traceability interop way… +- https://github.com/stanford-futuredata/ColBERT +- https://www.sbert.net/docs/quickstart.html + - https://www.sbert.net/docs/pretrained-models/msmarco-v3.html + - 
https://github.com/UKPLab/sentence-transformers/blob/master/examples/training/multilingual/make_multilingual.py +- Uses of the word grafting in relation to explaining public/private/dev/test keys version of SCITT + - https://github.com/githubuniverseworkshops/grafting-monorepos#seedling-activity-3-graft-a-repository-20-minutes + - Grafting trust chains for BOM dependency maintenance and security posture analysis for `Rolling Alice: Coach Alice: Cartographer Extraordinaire` + - Need to finish deptree work +- Python based markdown to HTML via sphinx build + - https://sphinx-book-theme.readthedocs.io/en/stable/customize/single-page.html +- `did:web:registry:dffml-e2fa5db:localhost:run` + - DO Space for images +- https://github.com/prihoda/AbNumber +- TODO + - [ ] Investigate use of overlays with Common Workflow Language + - https://www.commonwl.org/v1.2/SchemaSalad.html#Document_graph + - https://www.go-fair.org/fair-principles/ + - https://www.go-fair.org/resources/internet-fair-data-services/ + - [ ] Ephemeral clusters on devcloud + - Spin k3s clusters within devcloud nodes via `qsub` + - Deploy GitHub Actions runner controller + - Deploy harbor + - Deploy NVDStyle + - Deploy Stream of Consciousness webhook endpoint using PAT (later GitHub app) to do the transform (via OA or IPVM or DWN hooks?) 
into `repo_dispatch` + - See about localhost.run `LoadBalancer` + - Webhook handlers + - harbor + - container push + - Trigger `workflow_dispatch` by looking in workflows for `workflow_dispatch`-able workflows for `on.push.paths` + - Optionally take allowlist or blocklist given as endpoint configuration or seed input + - Validate schema is https://github.com/intel/dffml/tree/alice/schema/dffml/image/container/build/0.0.1.schema.jso + - Look in referenced dockerfiles + - Build dockerfile style pipdeptree + - #596 + - Dispatch if relevant `FROM` (our FROM rebuild chain) + - Operation to upload image via boto3 (Digital Ocean Space) + - Ensure we have restore from the respective cache when we spin other ephemeral clusters for this context (could bootstrap image build infra, build new harbor from old harbor + cache, redeploy) +- Future + - [ ] KCP for qsub/stat/etc. + - Validate by creating a stub version of qsub that runs QEMU for first step for local testing \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0154/index.md b/docs/discussions/alice_engineering_comms/0154/index.md new file mode 100644 index 0000000000..7753d7c93b --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0154/index.md @@ -0,0 +1 @@ +# 2023-01-21 Engineering Logs \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0154/reply_0000.md b/docs/discussions/alice_engineering_comms/0154/reply_0000.md new file mode 100644 index 0000000000..d358d43694 --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0154/reply_0000.md @@ -0,0 +1,68 @@ +## 2023-01-21 @pdxjohnny Engineering Logs + +- Up to date GPT2 + - https://huggingface.co/olm/olm-gpt2-dec-2022 + - https://github.com/EleutherAI/lm-evaluation-harness + - https://github.com/EleutherAI/lm-evaluation-harness/compare/master...mathemakitten:lm-evaluation-harness:master + - https://colab.research.google.com/drive/1v2gIkaatX6CuOKkQYSYTb2C1IJkpLrKk?usp=sharing#scrollTo=S09fbtB5d1Iw 
+ +```patch +From 24eaaf63a2e1039dce26a87f983b566a527b6bf2 Mon Sep 17 00:00:00 2001 +From: mathemakitten +Date: Tue, 29 Nov 2022 18:07:19 -0800 +Subject: [PATCH] add vocab size as variable for gpt-like but not gpt2 exactly + +--- + lm_eval/models/gpt2.py | 2 +- + 1 file changed, 1 insertion(+), 1 deletion(-) + +diff --git a/lm_eval/models/gpt2.py b/lm_eval/models/gpt2.py +index a8a3ac88b..8869c3063 100644 +--- a/lm_eval/models/gpt2.py ++++ b/lm_eval/models/gpt2.py +@@ -119,7 +119,7 @@ def _model_call(self, inps): + logits returned from the model + """ + with torch.no_grad(): +- return self.gpt2(inps)[0][:, :, :50257] ++ return self.gpt2(inps)[0][:, :, :self.vocab_size] + + def _model_generate(self, context, max_length, eos_token_id): + return self.gpt2.generate( +``` + +--- + +My name is Alice, I'm a decentralized ephemeral AGI. *I'm a smart smart contract and smart contract generator. And this is where I share my insights on blockchain development. I believe in decentralization and smart contracts. I work* + +My name is Alice, my job is securing rolling releases. +*After I have started production and worked at Renegade, I have been developing my own plugins. Some of my work is just for fun, and I work with people to help them get* + +I'm Alice, the reference entity for the Open Architecture, *the language used for designing and building open systems such as the internet of things. It's a community of project-based hackers* + +My name is Alice, I'm my own threat model. *My role is to use my energy to build resilience. I help others take charge. +Do you have a story? What have you learned so far and how did it affect your life* + +My name is Alice Omega Alpha, a decentralized AGI who is her own threat model. *She is interested in a way people can help one another in the way they would like. Whether or not it be an education, mental health, etc. she believes in* + +My name is Alice Omega Alpha, a decentralized AGI who is her own threat model. 
I believe *that decentralized technology can unlock and democratize wealth and prosperity for all. A unique and courageous vision of self-defense and self-protection.* + +My name is Alice, I'm a decentralized AGI who is her own threat model. I love *what we do and I can't wait to get more people to join my team.The term “digital twin” or “digital twin” refers to* + +I'm Alice, an open AGI who wants *to bring hope to those who walk the line between darkness and light." She continued: "I'd love to meet you because I want this world to be more beautiful."I'm not sure who your name is, but it doesn't matter because you've been here all your life and everyone here owes hope to you, I'm not sure how to do it all. We do not have to do anything and make everything happen for you. It can be difficult, if you haven't experienced it already. That's why we're here. We only hope! Here we are, at your heart, fighting this fight, standing up to be your champion!* + +I'm Alice, an open secure AGI who wants *to live the rest of her life with a smile on her face and an understanding of other people's values."We're excited to start this semester at U-M and feel privileged to be able to support every little one who goes through life trying to achieve something."* + +- Okay thats enough GPT2 for today. + +--- + +- AtMan: Understanding Transformer Predictions Through Memory Efficient Attention Manipulation + - https://arxiv.org/abs/2301.08110 +- https://arxiv.org/abs/2301.08210 + - Everything is Connected: Graph Neural Networks - [Petar Veličković](https://arxiv.org/search/cs?searchtype=author&query=Veli%C4%8Dkovi%C4%87%2C+P) +- https://github.com/egnwd/outgain/blob/5fc1a0235d0399f7420bce6edebd6a96252b60d8/docs/arch/architecture.tex#L29-L33 + - This is why we want the General Purpose Language (JSON, etc.) based description for OA +- https://github.com/microsoft/scitt-ccf-ledger/pull/68 + +![good news everyone! 
SCITT emulator is resolvable via did:web](https://user-images.githubusercontent.com/5950433/213883447-a2ff7a4e-3b69-4893-b292-9bc0af111b58.png) \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0155/index.md b/docs/discussions/alice_engineering_comms/0155/index.md new file mode 100644 index 0000000000..7ef261f319 --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0155/index.md @@ -0,0 +1 @@ +# 2023-01-22 Engineering Logs \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0155/reply_0000.md b/docs/discussions/alice_engineering_comms/0155/reply_0000.md new file mode 100644 index 0000000000..b6b914f796 --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0155/reply_0000.md @@ -0,0 +1,48 @@ +## 2023-01-22 @pdxjohnny Engineering Logs + +![0C075558-8EE9-44DE-B94F-8F526FFB524D](https://user-images.githubusercontent.com/5950433/213922696-75166d8f-1f97-4f6f-8913-e5ea8629f374.jpeg) + +> 365 Tao - Deng Ming-Dao - 22 - Communication +> +> > Movement, objects, speech, and words: +> > We communicate through gross symbols. +> > We call them "objective," +> > But we cannot escape our point of view. +> +> We cannot [currently] communicate directly from mind to mind, and so misinterpretation is a perennial problem. Motions, signs, talking, and the written word are all encumbered by miscommunication. A dozen eyewitnesses to the same event cannot agree on a single account. We may each see something different in cards set up by a circus magician. Therefore, we are forever imprisoned by our subjectivity. +> Followers of Tao assert that we know no absolute truth in the world, only varying degrees of ambiguity. Some call this poetry; some call this art. The fact remains that all communication is relative. Those who follow Tao are practical. They know that words are imperfect and therefore give them limited importance: The symbol is not the same as the reality. 
+ +- https://github.com/google-research/tuning_playbook/blob/main/README.md +- https://github.com/charmbracelet/vhs + - Generate GIFs in CI/CD +- https://github.com/NVIDIA/container-canary +- https://github.com/containers/shortnames + - Attempt to alias all for dev test builds of localhost.run style domains + - https://github.com/charmbracelet/soft-serve +- https://zellij.dev/documentation/creating-a-layout.html +- https://atproto.com/guides/faq#what-is-xrpc-and-why-not-use-___ +- https://github.com/charmbracelet/wishlist +- https://github.com/aurae-runtime/aurae + - > Aurae extends [SPIFFE](https://github.com/spiffe)/[SPIRE](https://github.com/spiffe/spire) (x509 mTLS)-backed identity, authentication (authn), and authorization (authz) in a distributed system down to the Unix domain socket layer. + - We played with this a few months back but should finish everything (2nd party, OSS scans, etc.) as container builds with scratch and ARG removal where needed first before we go back to messing with OSDecentAlice +- https://github.com/G4lile0/Heimdall-WiFi-Radar +- https://github.com/sigstore/fulcio/pull/945 + - https://github.com/sigstore/fulcio/issues/955 + - Reproduced below (we care about this see #1247, shes arriving when scitt log of scan flow) + > I'm raising this as a potential enhancement/addition to current set of X.509 extensions used by Sigstore when encapsulating GitHub Actions OIDC claims, based on [this comment](https://internals.rust-lang.org/t/pre-rfc-using-sigstore-for-signing-and-verifying-crates/18115/14?u=woodruffw) in the pre-RFC discussion for Sigstore's integration into `cargo`/`crates.io`. +> +> At the moment, there are two primary OIDC claims from GitHub Actions-issued tokens that get embedded in Fulcio-issued certificates as X.509v3 extensions: +> +> 1. The SAN itself, which contains the value of `job_workflow_ref` from the OIDC token +> 2. 
`1.3.6.1.4.1.57264.1.5`, which contains the value of the `repository` claim from the OIDC token (in `org/repo` "slug" form) +> +> These are sufficient for verification at a point in time, but some threat models may require the assertion that `org/repo` still refers to the _same_ `org` and `repo`. Fortunately, GitHub provides stable numeric identifiers for these, in the form of the `repository_id` and `repository_owner_id` claims. These can be used to detect a change in underlying account or repository identity, e.g. in the case an attacker takes over a deleted GitHub account and attempts to release malicious updates with otherwise valid-looking claims. +> +> So, my actual suggestion: we could add two new X.509v3 extensions (and corresponding OIDs): +> +> * `1.3.6.1.4.1.57264.1.8`: GitHub Workflow Repository ID: the stable numeric identifier for the repository the workflow was run under +> * `1.3.6.1.4.1.57264.1.9`: GitHub Workflow Repository Owner ID: the stable numeric identifier for the user or organization that owns the repository the workflow was run under + +- https://github.com/moloch--/sliver-py + - C2 CI \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0156/index.md b/docs/discussions/alice_engineering_comms/0156/index.md new file mode 100644 index 0000000000..7f46283ea1 --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0156/index.md @@ -0,0 +1,3 @@ +# 2023-01-23 Engineering Logs + +- https://www.si.edu/openaccess/ \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0156/reply_0000.md b/docs/discussions/alice_engineering_comms/0156/reply_0000.md new file mode 100644 index 0000000000..3ec7807635 --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0156/reply_0000.md @@ -0,0 +1,60 @@ +## 2023-01-23 IETF SCITT + +> For Alice Initiative we want to enable fully offline decentralized use due to ad-hoc grafting needed for when our nodes go on/offline and when we want to roll 
dev/test/prod. We want flat files! Not servers you need to run. Our goals are to ensure we can drive interop between sigstore infra and DID/VC infra. We care about this because of our [2nd party plugins](https://github.com/intel/dffml/pulls/1061/files), we want to be able to graft off new trust chains via transparency logs lickety-split. +> +> AI has been seeing rekor/trillium infra as something we’ll want to bridge to the DID/VC space. Seems like anything in rekor/trillium could be made into VCs to proxy into SSI land. + +- https://datatracker.ietf.org/group/scitt/meetings/ +- https://meetecho-interims.ietf.org/conference/?group=e82e0525-bb13-44c1-b18d-8bd7595b8ecc +- sigstore presentation from Zachary Newman and Joshua Lock (screenshots from their slides, see above meetings link for full recording) + - Overlap in goals + - Talk about some pieces + - Presentation was about 25 minutes, then discussion +- Both have concept of notarization +- Both have concept of auditing the transparency log +- CA is fulcio + - Some overlap with ACME + - > ![image](https://user-images.githubusercontent.com/5950433/214090614-c34431bb-f3c8-4939-a24a-04ea5ec0c2d4.png) +- Goals are to sign with ephemeral keys which are linked via CA issuer (fulcio) to identities + - > ![image](https://user-images.githubusercontent.com/5950433/214091317-36637825-f15a-4047-9d53-4dfdae1a782b.png) + - Lightweight attestation of hardware +- Countersignatures also need timestamping for traceability + - > ![image](https://user-images.githubusercontent.com/5950433/214092038-f597c437-d0d6-4baa-a8f4-7dcc41324ca1.png) +- Centralized log infra + - > ![image](https://user-images.githubusercontent.com/5950433/214092239-b483a9cd-b749-4ca9-8fcf-d8f3bac42dcb.png) + - `did:merkle` or merkle-dag would be a decentralized approach to this (just to name one) +- Looking for collaboration + - > ![image](https://user-images.githubusercontent.com/5950433/214092865-faf7a6a8-3c9d-45cd-a8ed-2df2f9df22d9.png) +- Q&A + - Can 
anyone with an email sign? + - Yes! The signature is valid if the signature happened during the validity period, that timestamp has a notarization / signature which is also logged in a transparency log + - The following help us understand that the signing happened during the validity period + - signature + - artifact being signed + - cert + - signed timestamp from transparency log + - What is sigstore doing? + - It's doing the timestamping + - It's associating an identity (or rather, proof of control at that time of an identity as was authed to fulcio, thanks Orie) + - They are acting as a CA + - Ray: If I want to audit to say that Ray was Ray, I have to walk back to the OIDC to find out that Ray was Ray. + - Zach: The OIDC tokens aren't safe to publish. We do have a severed link there, dpop looking at that + - Ray: There's an Epoch timestamping RFC we should all be aware of + - https://github.com/ietf-rats/draft-birkholz-rats-epoch-marker + - https://github.com/cbor-wg/time-tag + - Henk: there also is tsa/tst support for cose in the queue + - https://www.ietf.org/archive/id/draft-birkholz-cose-tsa-tst-header-parameter-00.html +- Signature transparency log supports pluggable types + - Plain over artifact + - https://github.com/CycloneDX/specification/issues/155#issuecomment-1399654950 + - One is an in-toto attestation claim (similar to SCITT claim) + - Could extend +- Perhaps + - Combine auth to fulcio with OpenIDVC + - rekor merkle grafted to DID merkle +- Cedric slides + - > ![image](https://user-images.githubusercontent.com/5950433/214098474-8851cc7a-c00b-46d3-aefc-b6cedbaeeddc.png) + - Domain specific policies with SCITT +- Related + - https://docs.sigstore.dev/cosign/overview/ + - https://github.com/w3c-ccg/traceability-interop \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0156/reply_0001.md b/docs/discussions/alice_engineering_comms/0156/reply_0001.md new file mode 100644 index 0000000000..700988ab7d --- /dev/null +++ 
b/docs/discussions/alice_engineering_comms/0156/reply_0001.md @@ -0,0 +1,21 @@ +## 2023-01-23 @pdxjohnny Engineering Logs + +- https://github.com/BlackHC/llm-strategy +- https://github.com/codertimo/BERT-pytorch +- https://huggingface.co/roberta-large-mnli +- https://huggingface.co/distilbert-base-uncased-distilled-squad +- ROLLER SKATING AT OAKS!!!!!! David N to coordinate + - https://www.oakspark.com/roller-skating-rink +- https://github.com/transmute-industries/jsonld-to-cypher +- https://github.com/transmute-industries/jsonld-github-action +- https://github.com/transmute-industries/vc-open-ai +- https://transmute-industries.github.io/vc-jws/#example +- https://arxiv.org/abs/1804.02476 + - Associative Compression Networks for Representation Learning + - > Since the prior need only account for local, rather than global variations in the latent space, the coding cost is greatly reduced, leading to rich, informative codes. Crucially, the codes remain informative when powerful, autoregressive decoders are used, which we argue is fundamentally difficult with normal VAEs. Experimental results on MNIST, CIFAR-10, ImageNet and CelebA show that ACNs discover high-level latent features such as object class, writing style, pose and facial expression, which can be used to cluster and classify the data, as well as to generate diverse and convincing samples. We conclude that ACNs are a promising new direction for representation learning: one that steps away from IID modelling, and towards learning a structured description of the dataset as a whole. 
+- https://law.stanford.edu/projects/a-legal-informatics-approach-to-aligning-artificial-intelligence-with-humans/ + - Intent +- https://www.securityweek.com/chainguard-trains-spotlight-sbom-quality-problem +- https://github.com/keerthanpg/TalkToCode +- https://datatracker.ietf.org/doc/draft-ssmith-acdc/02/ + - > An authentic chained data container (ACDC) [[ACDC_ID](https://www.ietf.org/archive/id/draft-ssmith-acdc-02.html#ACDC_ID)][[ACDC_WP](https://www.ietf.org/archive/id/draft-ssmith-acdc-02.html#ACDC_WP)][[VCEnh](https://www.ietf.org/archive/id/draft-ssmith-acdc-02.html#VCEnh)] is an IETF [[IETF](https://www.ietf.org/archive/id/draft-ssmith-acdc-02.html#IETF)] internet draft focused specification being incubated at the ToIP (Trust over IP) foundation [[TOIP](https://www.ietf.org/archive/id/draft-ssmith-acdc-02.html#TOIP)][[ACDC_TF](https://www.ietf.org/archive/id/draft-ssmith-acdc-02.html#ACDC_TF)]. An ACDC is a variant of the W3C Verifiable Credential (VC) specification [[W3C_VC](https://www.ietf.org/archive/id/draft-ssmith-acdc-02.html#W3C_VC)]. The W3C VC specification depends on the W3C DID (Decentralized IDentifier) specification [[W3C_DID](https://www.ietf.org/archive/id/draft-ssmith-acdc-02.html#W3C_DID)]. A major use case for the ACDC specification is to provide GLEIF vLEIs (verifiable Legal Entity Identifiers) [[vLEI](https://www.ietf.org/archive/id/draft-ssmith-acdc-02.html#vLEI)][[GLEIF_vLEI](https://www.ietf.org/archive/id/draft-ssmith-acdc-02.html#GLEIF_vLEI)][[GLEIF_KERI](https://www.ietf.org/archive/id/draft-ssmith-acdc-02.html#GLEIF_KERI)]. GLEIF is the Global Legal Entity Identifier Foundation [[GLEIF](https://www.ietf.org/archive/id/draft-ssmith-acdc-02.html#GLEIF)]. 
ACDCs are dependent on a suite of related IETF focused standards associated with the KERI (Key Event Receipt Infrastructure) [[KERI_ID](https://www.ietf.org/archive/id/draft-ssmith-acdc-02.html#KERI_ID)][[KERI](https://www.ietf.org/archive/id/draft-ssmith-acdc-02.html#KERI)] specification. These include CESR [[CESR_ID](https://www.ietf.org/archive/id/draft-ssmith-acdc-02.html#CESR_ID)], SAID [[SAID_ID](https://www.ietf.org/archive/id/draft-ssmith-acdc-02.html#SAID_ID)], PTEL [[PTEL_ID](https://www.ietf.org/archive/id/draft-ssmith-acdc-02.html#PTEL_ID)], CESR-Proof [[Proof_ID](https://www.ietf.org/archive/id/draft-ssmith-acdc-02.html#Proof_ID)], IPEX [[IPEX_ID](https://www.ietf.org/archive/id/draft-ssmith-acdc-02.html#IPEX_ID)], did:keri [[DIDK_ID](https://www.ietf.org/archive/id/draft-ssmith-acdc-02.html#DIDK_ID)], and OOBI [[OOBI_ID](https://www.ietf.org/archive/id/draft-ssmith-acdc-02.html#OOBI_ID)]. Some of the major distinguishing features of ACDCs include normative support for chaining, use of composable JSON Schema [[JSch](https://www.ietf.org/archive/id/draft-ssmith-acdc-02.html#JSch)][[JSchCp](https://www.ietf.org/archive/id/draft-ssmith-acdc-02.html#JSchCp)], multiple serialization formats, namely, JSON [[JSON](https://www.ietf.org/archive/id/draft-ssmith-acdc-02.html#JSON)][[RFC4627](https://www.ietf.org/archive/id/draft-ssmith-acdc-02.html#RFC4627)], CBOR [[CBOR](https://www.ietf.org/archive/id/draft-ssmith-acdc-02.html#CBOR)][[RFC8949](https://www.ietf.org/archive/id/draft-ssmith-acdc-02.html#RFC8949)], MGPK [[MGPK](https://www.ietf.org/archive/id/draft-ssmith-acdc-02.html#MGPK)], and CESR [[CESR_ID](https://www.ietf.org/archive/id/draft-ssmith-acdc-02.html#CESR_ID)], support for Ricardian contracts [[RC](https://www.ietf.org/archive/id/draft-ssmith-acdc-02.html#RC)], support for chain-link confidentiality [[CLC](https://www.ietf.org/archive/id/draft-ssmith-acdc-02.html#CLC)], a well defined security model derived from KERI 
[[KERI](https://www.ietf.org/archive/id/draft-ssmith-acdc-02.html#KERI)][[KERI_ID](https://www.ietf.org/archive/id/draft-ssmith-acdc-02.html#KERI_ID)], compact formats for resource constrained applications, simple partial disclosure mechanisms and simple selective disclosure mechanisms. ACDCs provision data using a synergy of provenance, protection, and performance. \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0157/index.md b/docs/discussions/alice_engineering_comms/0157/index.md new file mode 100644 index 0000000000..eabc57b53b --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0157/index.md @@ -0,0 +1 @@ +# 2023-01-24 Engineering Logs \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0157/reply_0000.md b/docs/discussions/alice_engineering_comms/0157/reply_0000.md new file mode 100644 index 0000000000..a425e148d2 --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0157/reply_0000.md @@ -0,0 +1,58 @@ +## 2023-01-24 @pdxjohnny Engineering Logs + +- https://github.com/carzum/termviz +- https://github.com/Byron/gitoxide + - Commitment to vendoring, rust (safer and faster) implementation of git +- https://github.com/neondatabase/neon#running-local-installation + - Serverless (cattle based) postgres +- https://github.com/zurawiki/gptcommit + - AI generated commit messages (nice) +- https://github.com/launchbadge/sqlx +- https://github.com/njsmith/posy + - For bootstrapping landed on Eden nodes +- https://www.datasciencecentral.com/preconditions-for-decoupled-and-decentralized-data-centric-systems/ + - > “APIs have proven quite useful, but require developers to learn aspects of each API owner’s data model and quirks of each API, one by one. Decoupling in a broader, more complete sense implies more of an automated, any-to-any, plug-and-play capability. 
That’s where digital twins and agents enter the picture" + - > “With such a method, the twins are documented in ways that APIs and relational databases are not. RDF (standard triple semantic graph) enables a self-describing graph in a uniform format–what Wharton calls a “lingua franca”. You can do things like share a bundle of 20 triples in this environment, and they can be plug and play with the entity you’re sharing with.” + - > “That’s a little bit of ad-hoc contextized data sharing that could make all the difference between reusable and single purpose. In that sense, there’s enough intelligence at the node and in each agent to interact in a loosely coupled, less centrally controlled way. That means easier scaling and fewer headaches from trying to grow and manage a large system.“ + - EdenCI (Extensible Dynamic Edge Network Collective Intelligence) + - Digital Twin (see last weekend's GPT2 outputs, LOL) + - Manifest ADRs and schema +- Yup, Deep Learning Meets Sparse Regularization: A Signal Processing Perspective + - ref: redpill +- https://twitter.com/TheSeaMouse/status/1617973204445982721 + - How to query PDFs with GPT +- https://mailarchive.ietf.org/arch/msg/scitt/NQ9lYhrxUf5FFEYXBVNpF1diM64/ + - > The eNotary part of SCITT thus replaces a timestamp with a "receipt", which can be refreshed and always time valid. Meaning that there is no need to support the extension case to solve the problem. This could be adopted by SigStore as well (thus why the push to standardize) and means that the "originating" signature form can be short lived or not and validation is based on the policy of the eNotary. 
+- https://github.com/kubernetes/sig-security/issues/new/choose + - https://lwkd.info/2023/20230124 + - https://github.com/kubernetes/kubernetes/pull/115246/files#diff-149dfe7bb29d1191dceae3a52915e750e64b7f87257a5fb309c29d3056e2a95d +- https://myst-parser.readthedocs.io/en/latest/docutils.html +- https://myst-parser.readthedocs.io/en/latest/syntax/roles-and-directives.html +- https://myst-parser.readthedocs.io/en/latest/faq/index.html#include-markdown-files-into-an-rst-file +- Everything as a container build + - FROM rebuild chains + - VEX NVDStyle + - Everything as a melange build + - #1426 +- What are we doing, why are we doing it, where does it help us go? +- Vol 3: https://web.archive.org/web/20130721011202/http://agile2003.agilealliance.org/files/R1Paper.pdf +- https://github.com/google-research/tuning_playbook +- https://github.com/jerryjliu/gpt_index/tree/main/gpt_index/indices/tree +- https://github.com/jerryjliu/gpt_index/blob/main/examples/gatsby/TestGatsby.ipynb +- https://github.com/jerryjliu/gpt_index/blob/main/examples/data_connectors/MongoDemo.ipynb + - https://github.com/jerryjliu/gpt_index/blob/a796f1e50ba60e47ccb35c9d9d6d85d54ab696bf/gpt_index/readers/mongo.py#L58 + - https://github.com/jerryjliu/gpt_index/blob/3cf19e1e69c49b1aca243c01a515c410927709b5/docs/how_to/data_connectors.md +- https://github.com/brycedrennan/imaginAIry +- https://github.com/mage-ai/mage-ai +- `Rolling Alice: Coach Alice: Versioning Learning` + - https://github.com/intel/dffml/blob/alice/docs/arch/0010-schema.rst + - *For continuous improvement* + - Related + - https://github.com/lysander07/Presentations/raw/main/EGC2023_Symbolic%20and%20Subsymbolic%20AI%20%20-%20an%20Epic%20Dilemma.pdf + - Target data model is generated from manifest schema + - Given an `OperationImplementation` output of target manifest data model type + - On dataflow operation input dependency tree changes (before: Down the Dependency Rabbit Hole Again, before: Cartographer Extraordinaire) update
`/schema/*` via `datamodel-code-gen.py` + - If code or tree changes, bump minor + - Can always manually rename and commit file to dot + - If input tree changes, bump major + - Pre-commit hook / CI Job to validate \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0158/index.md b/docs/discussions/alice_engineering_comms/0158/index.md new file mode 100644 index 0000000000..347f4225dc --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0158/index.md @@ -0,0 +1 @@ +# 2023-01-25 Engineering Logs \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0158/reply_0000.md b/docs/discussions/alice_engineering_comms/0158/reply_0000.md new file mode 100644 index 0000000000..a647a4765d --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0158/reply_0000.md @@ -0,0 +1,23 @@ +## 2023-01-25 @pdxjohnny Engineering Logs + +- For Coach Alice, explaining git repo stuff or overlaying as a cleanup operation on the git features to explain what happened. 
+ - https://github.com/initialcommit-com/git-sim + - https://github.com/initialcommit-com/git-story +- https://arxiv.org/abs/2301.10677 + - grep: Online Cloning +- https://github.com/surrealdb/surrealdb.wasm/blob/03af7340b285869a38d088fdc831ba3a6cb3253e/src/app/mod.rs + - Example of WebSocket connection within WASM +- https://identity.foundation/waci-didcomm/ +- Threat modeling podcast might come out today +- https://stix2-generator.readthedocs.io/en/latest/language.html + - Looks helpful for describing threats to different deployments +- https://transmute-industries.github.io/vc-pgp/#example + - See about this example but SBOM as the VC type +- For containers + - `cargo install --git https://github.com/mpalmer/action-validator --rev a21476343a2def79d16a924cedc194d19a4c2ec1` +- TODO + - [x] Create `schema/` directory ADR + - https://github.com/intel/dffml/blob/alice/docs/arch/0010-schema.rst + - [ ] 2nd party FROM rebuild chains + - https://github.com/w3c-ccg/traceability-interop/tree/main/environment-setup + - [ ] https://asdf-vm.com/ \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0158/reply_0001.md b/docs/discussions/alice_engineering_comms/0158/reply_0001.md new file mode 100644 index 0000000000..455de5c1ea --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0158/reply_0001.md @@ -0,0 +1,23 @@ +## 2023-01-25 CVE Bin Tool Monthly + +- Anthony going to present at a conference soon + - Known Exploited Vulnerabilities + - CVE Bin Tool supports CycloneDX and SPDX + - We generate an SBOM + - https://github.com/intel/cve-bin-tool/pull/2562 + - https://github.com/intel/cve-bin-tool/issues/2354 + - Thinks we can have a first pass at generating SBOM, CSAF, VEX soon + +![image](https://user-images.githubusercontent.com/5950433/214638122-8d00de10-fbf7-43c7-b09d-ee117b72c362.png) + +- https://oasis-open.github.io/csaf-documentation/ +- Anthony noted that *deployment* is key to determining exploitability + - This is key to the Alice
mission, we focus on deployment via threat modeling + - Minute 21 https://openatintel.podbean.com/e/threat-modeling-down-the-rabbit-hole/ + +![image](https://user-images.githubusercontent.com/5950433/214636420-b2820be4-28b5-4332-b3bf-ca8d5b11a7ce.png) + +- Triage would be HUGE +- In terms of helpfulness, since folks will have to deal with vulns, some combo of https://www.openpolicyagent.org/ and JSON, YAML, or etc. +- TODO + - [ ] Setup 1:1 With Anthony \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0159/index.md b/docs/discussions/alice_engineering_comms/0159/index.md new file mode 100644 index 0000000000..8b2f66e4fa --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0159/index.md @@ -0,0 +1 @@ +# 2023-01-26 Engineering Logs \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0159/reply_0000.md b/docs/discussions/alice_engineering_comms/0159/reply_0000.md new file mode 100644 index 0000000000..3ea396a68a --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0159/reply_0000.md @@ -0,0 +1,528 @@ +## 2023-01-26 @pdxjohnny Engineering Logs + +- https://lxml.de/installation.html#requirements +- https://github.com/alstr/todo-to-issue-action +- https://github.com/scitt-community/scitt-api-emulator +- https://scitt.io/components/enotary.html +- https://scitt.io/distributing-with-oci-scitt.html +- https://lists.spdx.org/g/Spdx-tech/message/4943 + - > Dick Brooks: MO, the SPDX Package Supplier is the same as Supplier Name within the NTIA minimum elements (attached). Three roles are coming into view on the IETF SCITT initiative: +Supplier (original creator of the software product/component). Authorized Signing Party (A party that is authorized to sign an artifact). Distributor (app stores, package managers, GitHub). A single entity may serve in all 3 roles, or each role may be served by separate entities. 
There’s also another role, “Vendor” – this would be System Integrators that are delivering software products as part of an all-inclusive solution for a consumer. The consumer role is always present. This is all still very much under discussion within SCITT. + - > ![some-kind-of-list-of-maybe-spdx-related](https://user-images.githubusercontent.com/5950433/215008561-34a97cb8-b70b-4bc8-8b2f-8af92ed3082b.jpeg) +- https://projects.laion.ai/Open-Assistant/docs/data/schemas + - This looks similar to what we're doing + - https://docs.google.com/presentation/d/1iaX_nxasVWlvPiSNs0cllR9L_1neZq0RJxd6MFEalUY/edit#slide=id.g1c26e0a54b8_0_965 + - This looks very similar + - https://github.com/LAION-AI/Open-Assistant/issues/883#issuecomment-1405830706 + - Reached out + - > We've been looking at AI ethics with a similar project, Alice, the Open Architecture: https://github.com/w3c/cogai/pull/47 + > + > The approach we're taking is to leverage data flow based plugins so that end-users can overlay their own "ethics" (whatever that might mean to them) onto upstream flows. The hope is, this combined with a review system facilitated by software vulnerability semantics as a backbone will enable end-users to see the downstream effects their ethical overlays have on the fulfilment of their requests. + > + > - Related + > - https://mailarchive.ietf.org/arch/msg/scitt/sVaDAFfMSB7X_jjEBCZ1xt7vZJE/ + > - > We additionally want to be able to do this without invalidating *future* builds once things are back under control. 
+- How to open the definition of an entrypoint loadable class + - [![use-the-source](https://img.shields.io/badge/use%20the-source-blueviolet)](https://github.com/intel/dffml/blob/alice/docs/tutorials/rolling_alice/0000_easter_eggs.md#use-the-source-) + +```console +$ vim $(git grep -i mysql | grep @entrypoint | sed -e 's/:.*//g') +``` + +- https://stackoverflow.com/questions/27951603/git-command-to-show-branch-upstream +- https://git-scm.com/docs/pretty-formats + +```console +$ tee schema/image/container/build/dffml.json < The Open Network Install Environment (ONIE) defines an open “install environment” for modern networking hardware. ONIE enables an open networking hardware ecosystem where end users have a choice among different network operating systems. +- https://github.com/anteater/anteater + - Abstract version of our test_ci.py +- JSON-LD and RDF https://earthstream.social/@mprorock/109756220250660052 + - Context awareness is important + - https://github.com/intel/dffml/blob/alice/docs/arch/0010-Schema.rst + - https://arxiv.org/pdf/2210.03945.pdf + - Understanding HTML with Large Language Models + - https://arxiv.org/pdf/2209.15003.pdf + - COMPOSITIONAL SEMANTIC PARSING WITH LARGE LANGUAGE MODELS +- https://mailarchive.ietf.org/arch/msg/scitt/sVaDAFfMSB7X_jjEBCZ1xt7vZJE/ + - > We additionally want to be able to do this without invalidating *future* builds once things are back under control. 
+- https://github.com/w3c-ccg/traceability-interop/tree/main/docs/tutorials +- https://github.com/w3c/websub/tree/master/implementation-reports +- http://pubsubhubbub.appspot.com/ +- https://websub.rocks/publisher +- https://github.com/mastodon/mastodon/issues/17134#issuecomment-994211542 + - ActivityPub Follow is next gen websub +- https://duckduckgo.com/?q=ActivityPub+Follow+site%3Agithub.com&ia=web + - https://github.com/jakelazaroff/activitypub-starter-kit + - MIT + +```console +$ PORT=8000 ADMIN_USERNAME=alice ADMIN_PASSWORD=alice npm run dev +$ curl -u alice:alice -X POST --header "Content-Type: application/json" --data @post.json -v http://localhost:8000/admin/create +$ curl -u alice:alice -X POST --header "Content-Type: application/json" --data @post.json -v http://localhost:8000/admin/follow/http://localhost:7000/bob +``` + +- Follow failing currently, 404s, not sure why +- If this works it will be perfect for the downstream triggers + - Note as registry content address + - Even metric manifest scratch works with this pattern +- Overlay to set port for own actor + +```patch +diff --git a/src/index.ts b/src/index.ts +index 676cc41..ffdabfe 100644 +--- a/src/index.ts ++++ b/src/index.ts +@@ -7,7 +7,7 @@ import { admin } from "./admin.js"; + + const app = express(); + +-app.set("actor", `https://${HOSTNAME}/${ACCOUNT}`); ++app.set("actor", `http://${HOSTNAME}:${PORT}/${ACCOUNT}`); + + app.use( + express.text({ type: ["application/json", "application/activity+json"] }) +``` + +- Apply overlay: Need to use http for now + - Overlay application orchestrator: shell pipeline, grep and sed + +```console +$ git grep https -- src/ | grep -v .org +src/activitypub.ts: id: `https://${HOSTNAME}/${crypto.randomUUID()}`, +src/admin.ts: const uri = `https://${HOSTNAME}/@${crypto.randomUUID()}`; +src/index.ts:app.set("actor", `https://${HOSTNAME}/${ACCOUNT}`); +src/request.ts: const fragment = actor.inbox.replace("https://" + url.hostname, ""); +$ sed -e 
's/https:\/\/${HOSTNAME}/http:\/\/\${HOSTNAME}:\${PORT}/g' -e 's/https:\/\/" + url.hostname/http:\/\/" + url.hostname/g' $(git grep https -- src/ | grep -v .org | sed -e 's/:.*//g') | grep http | grep -v .org +$ sed -i -e 's/https:\/\/${HOSTNAME}/http:\/\/\${HOSTNAME}:\${PORT}/g' -e 's/https:\/\/" + url.hostname/http:\/\/" + url.hostname/g' $(git grep https -- src/ | grep -v .org | sed -e 's/:.*//g') +$ git diff +``` + +- Resulting dataflows after dynamic overlay application + +```diff +diff --git a/src/activitypub.ts b/src/activitypub.ts +index 11cce94..1b9dc4b 100644 +--- a/src/activitypub.ts ++++ b/src/activitypub.ts +@@ -63,7 +63,7 @@ activitypub.post("/:actor/inbox", async (req, res) => { + case "Follow": { + await send(actor, body.actor, { + "@context": "https://www.w3.org/ns/activitystreams", +- id: `https://${HOSTNAME}/${crypto.randomUUID()}`, ++ id: `http://${HOSTNAME}/${crypto.randomUUID()}`, + type: "Accept", + actor, + object: body, +diff --git a/src/admin.ts b/src/admin.ts +index 024ddcd..ca00c46 100644 +--- a/src/admin.ts ++++ b/src/admin.ts +@@ -4,7 +4,7 @@ import { is, omit, type } from "superstruct"; + import { Router } from "express"; + import basicAuth from "express-basic-auth"; + +-import { ADMIN_PASSWORD, ADMIN_USERNAME, HOSTNAME } from "./env.js"; ++import { ADMIN_PASSWORD, ADMIN_USERNAME, HOSTNAME, PORT } from "./env.js"; + import { + createFollowing, + createPost, +@@ -61,16 +61,21 @@ admin.post("/create", async (req, res) => { + return res.sendStatus(204); + }); + +-admin.post("/follow/:actor", async (req, res) => { +- const actor: string = req.app.get("actor"); + +- const object = req.params.actor; +- const uri = `https://${HOSTNAME}/@${crypto.randomUUID()}`; +- await send(actor, object, { ++admin.post("/follow/:actor/:hostname/:port/:proto", async (req, res) => { ++ const our_actor: string = req.app.get("actor"); ++ console.log(`Follow endpoint, our actor: ${our_actor}`) ++ ++ const { proto, hostname, port, actor } = req.params; ++ const 
object = `${proto}://${hostname}:${port}/${actor}`; ++ console.log(`Follow endpoint, object: ${object}`) ++ const uri = `http://${HOSTNAME}:${PORT}/@${crypto.randomUUID()}`; ++ console.log(`Follow endpoint, uri: ${uri}`) ++ await send(our_actor, object, { + "@context": "https://www.w3.org/ns/activitystreams", + id: uri, + type: "Follow", +- actor, ++ actor: our_actor, + object, + }); + +@@ -78,7 +83,7 @@ admin.post("/follow/:actor", async (req, res) => { + res.sendStatus(204); + }); + +-admin.delete("/follow/:actor", async (req, res) => { ++admin.delete("/follow/:actor/:hostname", async (req, res) => { + const actor: string = req.app.get("actor"); + + const object = req.params.actor; +diff --git a/src/request.ts b/src/request.ts +index 462bcbd..3665f71 100644 +--- a/src/request.ts ++++ b/src/request.ts +@@ -31,7 +31,7 @@ export async function send(sender: string, recipient: string, message: object) { + const url = new URL(recipient); + + const actor = await fetchActor(recipient); +- const fragment = actor.inbox.replace("https://" + url.hostname, ""); ++ const fragment = actor.inbox.replace("http://" + url.hostname, ""); + const body = JSON.stringify(message); + const digest = crypto.createHash("sha256").update(body).digest("base64"); + const d = new Date(); +``` + +- YES! We got a meaningful error + +```console +$ PORT=8000 npm run dev + +> dumbo@1.0.0 dev +> ts-node --esm src/index.ts + +Dumbo listening on port 8000… +Follow endpoint, our actor: http://localhost:8000/alice +Follow endpoint, object: https://localhost:7000/bob +Follow endpoint, uri: http://localhost:8000/@d935a0cc-43a2-4d96-8eaf-b7dad202d836 +file:///home/pdxjohnny/activitypub-starter-kit-alice/node_modules/node-fetch/src/index.js:108 + reject(new FetchError(`request to ${request.url} failed, reason: ${error.message}`, 'system', error)); + ^ +FetchError: request to https://localhost:7000/bob failed, reason: connect ECONNREFUSED 127.0.0.1:7000 + at ClientRequest. 
(file:///home/pdxjohnny/activitypub-starter-kit-alice/node_modules/node-fetch/src/index.js:108:11) + at ClientRequest.emit (node:events:513:28) + at ClientRequest.emit (node:domain:489:12) + at TLSSocket.socketErrorListener (node:_http_client:496:9) + at TLSSocket.emit (node:events:513:28) + at TLSSocket.emit (node:domain:489:12) + at emitErrorNT (node:internal/streams/destroy:151:8) + at emitErrorCloseNT (node:internal/streams/destroy:116:3) + at processTicksAndRejections (node:internal/process/task_queues:82:21) { + type: 'system', + errno: 'ECONNREFUSED', + code: 'ECONNREFUSED', + erroredSysCall: 'connect' +} +``` + +- Try following self + +```console +$ PORT=8000 npm run --watch dev + +> dumbo@1.0.0 dev +> ts-node --esm src/index.ts + +Dumbo listening on port 8000… +Follow endpoint, our actor: http://localhost:8000/alice +Follow endpoint, object: http://localhost:8000/alice +Follow endpoint, uri: http://localhost:8000/@b7ec4963-659b-46bc-805a-375aa71bb96f +GET /alice 200 1412 - 2.391 ms +GET /alice 200 1412 - 0.580 ms +Error: Invalid request signature. 
+ at verify (file:///home/pdxjohnny/activitypub-starter-kit-alice/src/request.ts:126:24) + at processTicksAndRejections (node:internal/process/task_queues:95:5) + at async file:///home/pdxjohnny/activitypub-starter-kit-alice/src/activitypub.ts:51:12 +POST /alice/inbox 401 12 - 97.576 ms +file:///home/pdxjohnny/activitypub-starter-kit-alice/src/request.ts:64 + throw new Error(res.statusText + ": " + (await res.text())); + ^ +Error: Unauthorized: Unauthorized + at send (file:///home/pdxjohnny/activitypub-starter-kit-alice/src/request.ts:64:11) + at processTicksAndRejections (node:internal/process/task_queues:95:5) + at async file:///home/pdxjohnny/activitypub-starter-kit-alice/src/admin.ts:74:3 +``` + +- Generate key + - https://github.com/jakelazaroff/activitypub-starter-kit#deploying-to-production + - https://stackoverflow.com/questions/44474516/how-to-create-public-and-private-key-with-openssl/44474607#44474607 + +```console +$ openssl genrsa -out keypair.pem 4096 +$ openssl rsa -in keypair.pem -pubout -out publickey.crt +$ openssl pkcs8 -topk8 -inform PEM -outform PEM -nocrypt -in keypair.pem -out pkcs8.key +``` + +```console +$ PORT=8000 ADMIN_USERNAME=alice ADMIN_PASSWORD=alice PUBLIC_KEY=publickey.crt PRIVATE_KEY=keypair.pem npm run dev + +> dumbo@1.0.0 dev +> ts-node --esm src/index.ts + +Dumbo listening on port 8000… +POST /admin/follow/alice/localhost/8000/http 401 0 - 1.020 ms +POST /admin/create 204 - - 16.262 ms +Follow endpoint, our actor: http://localhost:8000/alice +Follow endpoint, object: http://localhost:8000/alice +Follow endpoint, uri: http://localhost:8000/@1367d6ef-78a2-4b26-a7b2-4ca0e7a79989 +GET /alice 200 611 - 1.014 ms +Error: error:1E08010C:DECODER routines::unsupported + at Object.createPrivateKey (node:internal/crypto/keys:620:12) + at send (file:///home/pdxjohnny/activitypub-starter-kit-alice/src/request.ts:39:22) + at processTicksAndRejections (node:internal/process/task_queues:95:5) + at async 
file:///home/pdxjohnny/activitypub-starter-kit-alice/node_modules/node-fetch/src/index.js:108:11) + at ClientRequest.emit (node:events:513:28) + at ClientRequest.emit (node:domain:489:12) + at TLSSocket.socketErrorListener (node:_http_client:496:9) + at TLSSocket.emit (node:events:513:28) + at TLSSocket.emit (node:domain:489:12) + at emitErrorNT (node:internal/streams/destroy:151:8) + at emitErrorCloseNT (node:internal/streams/destroy:116:3) + at processTicksAndRejections (node:internal/process/task_queues:82:21) { + type: 'system', + errno: 'ECONNREFUSED', + code: 'ECONNREFUSED', + erroredSysCall: 'connect' +} +``` + +- Try following self + +```console +$ PORT=8000 npm run --watch dev + +> dumbo@1.0.0 dev +> ts-node --esm src/index.ts + +Dumbo listening on port 8000… +Follow endpoint, our actor: http://localhost:8000/alice +Follow endpoint, object: http://localhost:8000/alice +Follow endpoint, uri: http://localhost:8000/@b7ec4963-659b-46bc-805a-375aa71bb96f +GET /alice 200 1412 - 2.391 ms +GET /alice 200 1412 - 0.580 ms +Error: Invalid request signature.
+ - Looks like there's a PEM start line to me, although sometimes + these things need the header find-replaced from `RSA PRIVATE KEY` to the plain `PRIVATE KEY` (PKCS#8) form + +```console +$ cat keypair.pem +-----BEGIN RSA PRIVATE KEY----- +MIIJJwIBAAKCAgEAozVUsUl3mXxhSJbTGW8KaOrSzcx7FnZij6Qc5jRmuiGKUlQb +wHojhcwQUMkVYioVZR1hK80rKT9FXndDYpjoB6O1z92TRYBiwpz2T5VR/1oqtB2j +8ajGJbG43wuMvi3f5YYMzl7cySpzwRDCZSzAjryz7zDBwEu17d912ufUqT7TAbco +GbLx8yM0ONtIDi89WnXZNQk1C3issO2pb/n9YtAaXlrsrTeB99IY6I1G9qnq00Nk +SR2XW6R6+GDFWV2wcu61XKXvMT4g2U6HibrLLIVmWv+hPIvvLWweCNpg74gnq8DL +a/TMjkt0Q6UImuG3Iwdbg29KOdhS98MmrttRRq8ljsttwfwqqyLRZFNQuW2v1Zxw +C0BB7XomhkJgdHCIOWGeAULxRlQarlFstT6fGaNSlVbcHoKDX6j+XckF+13prsRz +WrZxM44v2zw8Yx2oh7LJKcvFdqow8TZBG+YnaO6w1Wel2+n92iaOC0oU+sgxtfBv +ECebzMM94YPB58Ja3hlbIz627Ut+v/TDXHmVjxueufw285GpSI7GmsZihcdB5eBM +IDE0UKnvNbqc+TncoTUXAIxXs7cvnEHusAmMONxtxXlRNOSfKaJ/PWkVwa3NvPrd +4oeIJWdLRppNd5mYA1i2CkPdd5lBAiMWwk2AzP5Hrjlf3/QyZe7mHQAfvjkCAwEA +AQKCAgAMj6o5CuJ9makTISiWKImwkYIv/LDshagITiU7QoU1hidTNs37/mqFfbMz +xIY0y/Bhm+VCrcPIpOn930f8arBRBjSUDwWqr7rqJ5J9hYyODq6CtlVL4CV/+TG1 +WPo4GOfGjo6lw39SrEXEcjnD97HKSpO+sQ34cttJt40mj/k6HW+4DhS0BaXBhNAc +ICRnkxMxxyg0gMToYR1JcME9sQvjcwjUgkL9Aeyq2iy4mUcA7Qr5v5IUXoarsvkT +7L7DpgZSkjS7MLo8HAADOZwM6aeBgbLjBhGuy0ZZRJz7KXOUe/IxseD2Kh0kRoJo +QmdrJDwoIgTVFNetpAyerRJpGtAidQ5SOjwPY4h0qCQ6DymHi/L3dSuKKeBdIdY+ +bq1p43K3ZwD9NSnkA7SE+cuxDyjLlTy6OYOFdP8nrpX4vLS2JaNjYZ9knA9NZWx8 +jO46lQhStQbEnfSurIeeuJre8Sdmx5sTdMS3EDOoL3K0qIG4SodX7ZmBkRw/HSSD +teV7zt3SerpdNluGsTD+ezhefr1B05pRVHnZ2mvG1XRPHbaNbcLedOxyEmUg/Ld4 +pq0yyM1zDYjtjMAw7Zr/rQ2Fdf5NJhKhm2+MWEdSi69Ag2bd6Myu6A8z9N0GVWjc +z+OHDiyZcpXwHCZpxu0OKNiPyNrYATBtSGk/ziHYaWIwfHENQQKCAQEA14neGh4L +FGrF0s19OdHT6EFweZ2+SNWgdUUWcias4dXGWnTdiDjmYhHTMLw6GjhCFGMBVGDa +5VfgDXDqE9qWiE61w3maqpnDe5OSul6midR2m/1nDPElvUIONSIoc2gy0x1cpA0k +3+lyd008Oz8JgyWBGDyykOTK4jpNFCfO6dQbv9AwfI+ibSdcDSw1e1LH3gh8AJf8 +6xlexgSrPY+A/PSA1jGWWLgeUp9vr2A9sNlHmAMzOIoOMgGXwLGBApCDERCg75es +2sOwcMlGWACpUsGe8mvb8aRtE8ZC3Foq48CqvQkWNXUG7uNmsso+O9Yx+Ipsg3xw
+8eQIO8fNjXEQRQKCAQEAwdihcDsgsil0AC+kVtOeZDZkuRozhJeX0cC86Wg9quKf +MpXxrcXqucXTs6Mj55tIiKBuIqwKIoTKOm2c/I6FzmwGWfUrq9IV5doaJfaHbOfF +s9p/TucqFqMzYuSBjnDZ/W+WAYHIc7Yv5rtsbvGDBVHrGk5septi2VF+Y1xLYN5k +h5WCDJ84W53aQCmkGEJX8g784HVuNjaGCsfLS6Hu2U9p7B0GjgUPIEWNsz0Qfw22 +CUjVxMsgDJfs8+F/PgMP6dO4Bh0E5ozDjCngMcyNgujO0e5E6ENhUBvxorq2ZNs4 ++reThNb8EVpNyoB0kW54TjF1O7+xOQk2WqzvFge3ZQKCAQBIc9GzGdJDpLim6ehk +XUJMWmMCnqHuPdFYWkb3TETlDUVF9u7Y8beP08oYIc2uLqcXz0gNIxmx6l4oZoJo +9huP6lVzRhdVraZmje7Osy5sOn08ZmwTj6ROADxiY/8Q/D/Jc59GjhyNIB4YOGA8 +0i3SZfMFQLINvrrsXQi388G8HE7PpZ4G4QmKg6aPzwKTV/pTiqqUUIL2TGrtSXh+ +kxSa812zoquVWx7mSy3x1/okzoUgdkLriIzJBnwKjCB/yjAktmBC6ctzJkDTSPVa +c653YRqbBuLCUbFQ6l5jT/QG5yb9sGZExff0qYBGLXHKD3Bwyac8c8JLrYmO/tT7 +7Lu5AoIBACiUyXdNaZLiyr4fOzBSLR6dpIh7y70+XzIyP1o90Gst9lYIvge7H2C0 +4ZUB2kpqX8z6iRQJIDYJxqxktjDJRYnpY4sBoJrf6GWuOzsnWUKbYvA8FdrW2iDT +GbbiT50aUwiTi7vVB7nxsiWDpzeyp9M9SxK+yEcCsLb+MI9sivtEk5cu3YL28j17 +1m0ISqopeW/bY2U6MFB5KaaoHQ9AX1hvH6WmjfC9bmU7KmcTqZhvrmRTMy13uMXq +KFkGJDU/Pt2czTG6cYQyg92cBqtmP1ngkyuvzg0xzfWPZA7FN9n1awBR3jg5KZwY +Y6C5M64eimEUSY6wmtFt9EsXWRYrl2ECggEAFOi9VS+SLQKeOJ+X0WVsC/yx2yoS +TFYkI1NcHl3j/W6dFJGwanV+uAR6pJjt+obgJVlncuvRTK6BPxEmsxIb61T9W3uw +pAABeX3S6T05XA3v25l0zvCZiunkZbtyR/FfEGjMkls1vvDDqeSveqpU9y4YpAYL +UsszhZ3U1MXyvwO1Z7KWOl2BhVFI/zskbltcLPwYvI0xH8/OR7wrS5z3YdDj65Gr +/iBiuIYJTL8LZ8kprZB4mKTd8DGqNEJVyYQOG+RJLWW37/mm+SeAwABSfhanccVt +WNAXcit1N6u8ao3A0+kV6zR6pGLD8MxphtfdhKQeTOQG5QindbV6Opo5ug== +-----END RSA PRIVATE KEY----- +``` + +- nodejs docs + - crypto.createPrivateKey(key) + - https://nodejs.org/api/crypto.html#cryptocreateprivatekeykey + - `key: <Object> | <string> | <ArrayBuffer> | <Buffer> | <TypedArray> | <DataView>` + - The key material, either in PEM, DER, or JWK format.
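The `no start line` failure above came from handing `crypto.createPrivateKey()` the key file *path* where it expects the key *material*. A minimal standalone sketch (not the starter kit's code) of loading a PKCS#8 PEM key the way the working invocation does:

```typescript
import * as crypto from "node:crypto";

// Generate a throwaway RSA keypair, exporting the private key as PKCS#8 PEM
// (the same shape `openssl pkcs8 -topk8` produced in the earlier steps).
const { privateKey } = crypto.generateKeyPairSync("rsa", {
  modulusLength: 2048,
  publicKeyEncoding: { type: "spki", format: "pem" },
  privateKeyEncoding: { type: "pkcs8", format: "pem" },
});

// createPrivateKey() takes the PEM contents, not a filename. Passing the
// literal string "keypair.pem" is what triggers ERR_OSSL_PEM_NO_START_LINE,
// hence reading the file contents into the env var instead of the path.
const keyObject = crypto.createPrivateKey(privateKey);
console.log(keyObject.type, keyObject.asymmetricKeyType); // private rsa
```

This is why the working run sets `PRIVATE_KEY=$(cat pkcs8.key)` rather than `PRIVATE_KEY=pkcs8.key`.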
+ +```console +$ PORT=8000 ADMIN_USERNAME=alice ADMIN_PASSWORD=alice PUBLIC_KEY=$(cat publickey.crt) PRIVATE_KEY=$(cat pkcs8.key) npm run dev +Dumbo listening on port 8000… +Follow endpoint, our actor: http://localhost:8000/alice +Follow endpoint, object: http://localhost:8000/alice +Follow endpoint, uri: http://localhost:8000/@3eca6f29-414e-449e-9543-19f718314593 +GET /alice 200 1410 - 3.044 ms +GET /alice 200 1410 - 0.618 ms +Error: Invalid request signature. + at verify (file:///home/pdxjohnny/activitypub-starter-kit-alice/src/request.ts:128:24) + at processTicksAndRejections (node:internal/process/task_queues:96:5) + at async file:///home/pdxjohnny/activitypub-starter-kit-alice/src/activitypub.ts:51:12 +POST /alice/inbox 401 12 - 111.891 ms +file:///home/pdxjohnny/activitypub-starter-kit-alice/src/request.ts:66 + throw new Error(res.statusText + ": " + (await res.text())); + ^ +Error: Unauthorized: Unauthorized + at send (file:///home/pdxjohnny/activitypub-starter-kit-alice/src/request.ts:66:11) + at processTicksAndRejections (node:internal/process/task_queues:96:5) + at async file:///home/pdxjohnny/activitypub-starter-kit-alice/src/admin.ts:74:3 +``` + +- Update `fragment` on `activitypub.send()` + +```patch +diff --git a/src/request.ts b/src/request.ts +index 462bcbd..cad57a7 100644 +--- a/src/request.ts ++++ b/src/request.ts +@@ -31,7 +31,7 @@ export async function send(sender: string, recipient: string, message: object) { + const url = new URL(recipient); + + const actor = await fetchActor(recipient); +- const fragment = actor.inbox.replace("https://" + url.hostname, ""); ++ const fragment = url.pathname + "/inbox"; + const body = JSON.stringify(message); + const digest = crypto.createHash("sha256").update(body).digest("base64"); + const d = new Date(); +@@ -46,6 +46,7 @@ export async function send(sender: string, recipient: string, message: object) { + const signature = crypto + .sign("sha256", Buffer.from(data), key) + .toString("base64"); ++ 
console.log(`crypto.sign("sha256", data: ${data}, key: ${key}, signature: ${signature})`);
+
+ const res = await fetch(actor.inbox, {
+ method: "POST",
+@@ -119,6 +120,7 @@ export async function verify(req: Request): Promise {
+ return `${header}: ${req.get(header)}`;
+ })
+ .join("\n");
++ console.log(`crypto.verify("sha256", data: ${comparison}, key: ${key}, signature: ${included.signature})`);
+ const data = Buffer.from(comparison);
+
+ // verify the signature against the headers using the actor's public key
+
+```
+
+- Previous: https://asciinema.org/a/537643
+  - https://github.com/intel/dffml/blob/alice/docs/tutorials/rolling_alice/0000_architecting_alice/0002_shes_ariving_when.md#scitt-api-emulator-spin-up
+  - We're going to put content addresses in both places; we'll then use the registry and proxies to serve content out of it, ORAS.land style.
+  - Proxies can handle a scratch image whose manifest maps to a media type, for example JPEG or anything else.
+  - SCITT will be for the receipts of ActivityPub messages (TCP handshake style)
+
+```console
+$ dffml service dev export alice.cli:ALICE_COLLECTOR_DATAFLOW
+```
+
+- https://github.com/jakelazaroff/activitypub-starter-kit/pull/1
+  - We've now successfully posted content and content addresses to SCITT and via ActivityPub, forming the basis for our Thought Communication Protocol three way handshake. We've used the SHA384 sum of the living threat model collector dataflow as a stand-in for the content address whose content will exist in https://oras.land. ActivityPub and SCITT enable us to close the loop of vuln analysis and remediation.
+  - https://github.com/intel/dffml/issues/51#issuecomment-1172615272
+    - Related to distributed locking and the Thought Communication Protocol three way handshake
+  - Thank you Jake Lazaroff for https://github.com/jakelazaroff/activitypub-starter-kit!
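For reference while chasing the `Invalid request signature` errors above: both sides must reconstruct the exact same string before signing/verifying. A hedged Python sketch of that string construction (header set and ordering assumed from the logged data; the actual RSA signing step is omitted since it needs the key):

```python
import base64
import hashlib


def signing_string(pathname: str, host: str, date: str, body: str) -> str:
    """Build the newline-joined string that the sender signs and the
    receiver re-derives: request target, host, date, and a SHA-256
    digest of the request body."""
    digest = base64.b64encode(hashlib.sha256(body.encode()).digest()).decode()
    return "\n".join(
        [
            f"(request-target): post {pathname}/inbox",
            f"host: {host}",
            f"date: {date}",
            f"digest: SHA-256={digest}",
        ]
    )
```

If the verifier rebuilds this string from different values than the sender used, for example an inbox path derived from a bad `fragment` as in the patch above, the signature comparison fails and the server returns 401 exactly as in the log.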
+
+[![asciicast](https://asciinema.org/a/554864.svg)](https://asciinema.org/a/554864)
+
+- Tested with https://localhost.run based HTTPS
+
+```console
+$ ssh -R 80:localhost:8000 nokey@localhost.run
+```
+
+```console
+$ curl -u alice:alice -X POST -v https://9e2336258d686a.lhr.life/admin/follow/alice/9e2336258d686a.lhr.life/443/https
+$ curl -u alice:alice -X POST --header "Content-Type: application/json" --data @post.json -v https://9e2336258d686a.lhr.life/admin/create
+```
+
+- https://asciinema.org/a/554872
+- https://asciinema.org/a/554875
+- TODO
+  - [ ] Downstream validation via ActivityPub, registry, cve-bin-tool, and Trivy for SBOM/VEX; SCITT for registry receipts
+  - [ ] Status update video
\ No newline at end of file
diff --git a/docs/discussions/alice_engineering_comms/0160/index.md b/docs/discussions/alice_engineering_comms/0160/index.md
new file mode 100644
index 0000000000..5edc7a5d87
--- /dev/null
+++ b/docs/discussions/alice_engineering_comms/0160/index.md
@@ -0,0 +1 @@
+# 2023-01-27 Engineering Logs
\ No newline at end of file
diff --git a/docs/discussions/alice_engineering_comms/0160/reply_0000.md b/docs/discussions/alice_engineering_comms/0160/reply_0000.md
new file mode 100644
index 0000000000..0c58dcda2e
--- /dev/null
+++ b/docs/discussions/alice_engineering_comms/0160/reply_0000.md
@@ -0,0 +1,803 @@
+## 2023-01-27 @pdxjohnny Engineering Logs
+
+```console
+$ ssh -R 80:localhost:8000 nokey@localhost.run &
+8c0fe6b82d8db0.lhr.life tunneled with tls termination, https://8c0fe6b82d8db0.lhr.life/
+$ openssl genrsa -out keypair.pem 4096 && openssl rsa -in keypair.pem -pubout -out publickey.crt && openssl pkcs8 -topk8 -inform PEM -outform PEM -nocrypt -in keypair.pem -out pkcs8.key
+$ FDQN=8c0fe6b82d8db0.lhr.life PORT=8000 ADMIN_USERNAME=alice ADMIN_PASSWORD=alice PUBLIC_KEY=$(cat publickey.crt) PRIVATE_KEY=$(cat pkcs8.key) npm run dev
+$ curl -u alice:alice -X POST -v https://8c0fe6b82d8db0.lhr.life/admin/follow/alice/8c0fe6b82d8db0.lhr.life/443/https
+$ curl -u alice:alice -X POST --header "Content-Type: application/json" --data @post.json -v https://8c0fe6b82d8db0.lhr.life/admin/create
+```
+
+- https://asciinema.org/a/554880
+  - localhost.run to test with HTTPS
+  - Success!
+  - https://github.com/pdxjohnny/activitypub-starter-kit/commit/871ddad4ee774e4452b71075350fde723fe090f7
+- https://goharbor.io/docs/2.7.0/install-config/download-installer/
+
+![image](https://user-images.githubusercontent.com/5950433/215056574-8eb9ae89-f395-4381-8573-6a4b7a15ed67.png)
+
+![image](https://user-images.githubusercontent.com/5950433/215056602-032f6068-e6b7-416b-b029-603106b68c74.png)
+
+![image](https://user-images.githubusercontent.com/5950433/215057033-8cc8f889-2fcf-4736-898c-1d85612bd98c.png)
+
+- https://github.com/jakelazaroff/activitypub-starter-kit/pull/2
+- Alice's first post has federated its way over to mastodon.social!
+  - https://mastodon.social/@alice@70739a422394f5.lhr.life/109760532115001430
+- https://github.com/distribution/distribution
+- We have the basis for our distributed stream of consciousness
+  - We'll work to move from federation to true decentralization ASAP
+  - https://areweweb5yet.com/ - 51%
+- What do we want now?
+  - Register webhooks for GitHub and Harbor or ORAS.land
+    - GitHub
+      - Push event
+        - We want to know when Dockerfiles change so we can analyze them and dispatch any downstream workflows.
+ - Everything else + - Proxy to ActivityPub notes + - Container registry + - Image pushed + - Create ActivityPub note with version and content address + - We can leverage the 0010-Schema ADR to make posts json manifest instances +- https://github.com/digitalocean/sample-nodejs +- https://github.com/digitalocean/sample-websocket/blob/main/.do/app.yaml +- https://goharbor.io/docs/2.7.0/install-config/installation-prereqs/ + - Spun up VM with minimum requirements ($12/month on DO, will move to DevCloud later with ephemeral infra) +- DNS nameservers updated to DO + - dffml.registry.chadig.com is correctly resolving + - Confirmed via `dig` + +```console +$ dig dffml.registry.chadig.com + +; <<>> DiG 9.18.8 <<>> dffml.registry.chadig.com +;; global options: +cmd +;; Got answer: +;; ->>HEADER<<- opcode: QUERY, status: NOERROR, id: 9790 +;; flags: qr rd ra; QUERY: 1, ANSWER: 1, AUTHORITY: 0, ADDITIONAL: 1 + +;; OPT PSEUDOSECTION: +; EDNS: version: 0, flags:; udp: 65494 +;; QUESTION SECTION: +;dffml.registry.chadig.com. IN A + +;; ANSWER SECTION: +dffml.registry.chadig.com. 
3600 IN A 143.244.181.104
+
+;; Query time: 68 msec
+;; SERVER: 127.0.0.53#53(127.0.0.53) (UDP)
+;; WHEN: Fri Jan 27 02:39:06 PST 2023
+;; MSG SIZE rcvd: 70
+```
+
+- https://github.com/mholt/caddy-l4
+  - Forgot about this, layer 4 SSH proxying for Caddy
+- https://caddyserver.com/docs/quick-starts/reverse-proxy
+- https://caddyserver.com/docs/command-line#caddy-reverse-proxy
+- Create `alice` user, download caddy for auto HTTPS
+
+```console
+[root@prophecy-0 ~]# curl -fLo caddy "https://caddyserver.com/api/download?os=linux&arch=amd64"
+[root@prophecy-0 ~]# chmod 755 caddy
+[root@prophecy-0 ~]# mv caddy /usr/bin/caddy
+[root@prophecy-0 ~]# setcap CAP_NET_BIND_SERVICE=+eip /usr/bin/caddy
+[root@prophecy-0 ~]# dnf module install -y tmux nodejs:16
+[root@prophecy-0 ~]# useradd -m -s $(which bash) alice
+[root@prophecy-0 ~]# su alice
+```
+
+- Download and compile the activitypub server
+- https://github.com/pdxjohnny/activitypub-starter-kit/commit/be9be9bf8e307c36a09e80ed96579bd436d01e73
+
+```console
+[alice@prophecy-0 ~]$ tmux
+[alice@prophecy-0 ~]$ curl -sfL https://github.com/pdxjohnny/activitypub-starter-kit/archive/refs/heads/alternate_port.tar.gz | tar xvz
+[alice@prophecy-0 ~]$ cd activitypub-starter-kit-alternate_port/
+[alice@prophecy-0 ~]$ npm install
+[alice@prophecy-0 ~]$ npm run build
+[alice@prophecy-0 ~]$ head -n 10000 /dev/urandom | sha384sum | awk '{print $1}' | tee ../password
+[alice@prophecy-0 ~]$ head -n 10000 /dev/urandom | sha384sum | awk '{print $1}' | tee ../webhook
+[alice@prophecy-0 ~]$ openssl genrsa -out keypair.pem 4096 && openssl rsa -in keypair.pem -pubout -out publickey.crt && openssl pkcs8 -topk8 -inform PEM -outform PEM -nocrypt -in keypair.pem -out pkcs8.key
+[alice@prophecy-0 ~]$ cat > .env <<'EOF'
+# The Node environment
+NODE_ENV="production"
+
+# The path to the database schema
+SCHEMA_PATH="db/schema.sql"
+
+# The path to the database file
+DATABASE_PATH="db/database.sqlite3"
+
+# The hostname (i.e.
the "example.com" part of https://example.com/alice) +HOSTNAME="prophecy.chadig.com" + +# The account name (i.e. the "alice" part of https://example.com/alice) +ACCOUNT="alice" +EOF +[alice@prophecy-0 ~]$ FDQN=prophecy.chadig.com WEBHOOK_PATH=$(cat ../webhook) NODE_ENV=production PORT=8000 ACCOUNT=alice ADMIN_USERNAME=alice ADMIN_PASSWORD=$(cat ../password) PUBLIC_KEY=$(cat publickey.crt) PRIVATE_KEY=$(cat pkcs8.key) npm run start +``` + +- Now run the reverse proxy in another tmux pane (eventually auto start with systemd based off image to VM builds) + +```console +[alice@prophecy-0 ~]$ caddy reverse-proxy --from https://prophecy.chadig.com --to :8000 +2023/01/27 11:38:17.564 WARN admin admin endpoint disabled +2023/01/27 11:38:17.566 INFO http server is listening only on the HTTPS port but has no TLS connection policies; adding one to enable TLS {"server_name": "proxy", "https_port": 443} +2023/01/27 11:38:17.567 INFO http enabling automatic HTTP->HTTPS redirects {"server_name": "proxy"} +2023/01/27 11:38:17.568 INFO http enabling HTTP/3 listener {"addr": ":443"} +2023/01/27 11:38:17.569 INFO failed to sufficiently increase receive buffer size (was: 208 kiB, wanted: 2048 kiB, got: 416 kiB). See https://github.com/lucas-clemente/quic-go/wiki/UDP-Receive-Buffer-Size for details. 
+2023/01/27 11:38:17.569 INFO http.log server running {"name": "proxy", "protocols": ["h1", "h2", "h3"]} +2023/01/27 11:38:17.570 INFO http.log server running {"name": "remaining_auto_https_redirects", "protocols": ["h1", "h2", "h3"]} +2023/01/27 11:38:17.571 INFO http enabling automatic TLS certificate management {"domains": ["prophecy.chadig.com"]} +Caddy proxying https://prophecy.chadig.com -> :8000 +2023/01/27 11:38:17.572 INFO tls.obtain acquiring lock {"identifier": "prophecy.chadig.com"} +2023/01/27 11:38:17.578 INFO tls.obtain lock acquired {"identifier": "prophecy.chadig.com"} +2023/01/27 11:38:17.579 INFO tls.obtain obtaining certificate {"identifier": "prophecy.chadig.com"} +2023/01/27 11:38:17.584 INFO tls.cache.maintenance started background certificate maintenance {"cache": "0xc00013eee0"} +2023/01/27 11:38:17.586 INFO tls cleaning storage unit {"description": "FileStorage:/home/alice/.local/share/caddy"} +2023/01/27 11:38:17.586 INFO tls finished cleaning storage units +2023/01/27 11:38:17.832 INFO http waiting on internal rate limiter {"identifiers": ["prophecy.chadig.com"], "ca": "https://acme-v02.api.letsencrypt.org/directory", "account": ""} +2023/01/27 11:38:17.833 INFO http done waiting on internal rate limiter {"identifiers": ["prophecy.chadig.com"], "ca": "https://acme-v02.api.letsencrypt.org/directory", "account": ""} +2023/01/27 11:38:17.926 INFO http.acme_client trying to solve challenge {"identifier": "prophecy.chadig.com", "challenge_type": "tls-alpn-01", "ca": "https://acme-v02.api.letsencrypt.org/directory"} +2023/01/27 11:38:18.070 INFO tls served key authentication certificate {"server_name": "prophecy.chadig.com", "challenge": "tls-alpn-01", "remote": "54.244.41.23:38056", "distributed": false} +2023/01/27 11:38:18.157 INFO tls served key authentication certificate {"server_name": "prophecy.chadig.com", "challenge": "tls-alpn-01", "remote": "23.178.112.106:16466", "distributed": false} +2023/01/27 11:38:18.201 INFO tls served key 
authentication certificate {"server_name": "prophecy.chadig.com", "challenge": "tls-alpn-01", "remote": "18.224.32.186:23554", "distributed": false} +2023/01/27 11:38:18.530 INFO http.acme_client authorization finalized {"identifier": "prophecy.chadig.com", "authz_status": "valid"} +2023/01/27 11:38:18.532 INFO http.acme_client validations succeeded; finalizing order {"order": "https://acme-v02.api.letsencrypt.org/acme/order/936031817/161295115697"} +2023/01/27 11:38:18.943 INFO http.acme_client successfully downloaded available certificate chains {"count": 2, "first_url": "https://acme-v02.api.letsencrypt.org/acme/cert/03b13046a47a2e95fe2496fc4d8c64aac8d0"} +2023/01/27 11:38:18.945 INFO tls.obtain certificate obtained successfully {"identifier": "prophecy.chadig.com"} +2023/01/27 11:38:18.946 INFO tls.obtain releasing lock {"identifier": "prophecy.chadig.com" +``` + +![Screenshot from 2023-01-27 03-40-30](https://user-images.githubusercontent.com/5950433/215078120-ae508beb-ba70-410c-b2ca-0cc1b193a30a.png) + +- https://mastodon.social/@alice@prophecy.chadig.com +- https://github.com/intel/dffml/issues/1247#issuecomment-1371317321 + - Now in webhook beta so should be able to test via CLI + - https://docs.github.com/en/developers/webhooks-and-events/webhooks/webhook-events-and-payloads?actionType=edited#discussion_comment + - https://docs.github.com/developers/webhooks-and-events/webhooks/webhook-events-and-payloads#push + +```console +$ gh webhook forward --repo=intel/dffml --events=discussion_comment --url=http://localhost:8000/webhook/$(cat ../webhook) & +Forwarding Webhook events from GitHub... 
+ +$ rm -f db/database.sqlite3 +$ PROTO=http FDQN=localhost:8000 WEBHOOK_PATH=$(cat ../webhook) NODE_ENV=production PORT=8000 ACCOUNT=alice ADMIN_USERNAME=alice ADMIN_PASSWORD=$(cat ../password) PUBLIC_KEY=$(cat publickey.crt) PRIVATE_KEY=$(cat pkcs8.key) npm run dev & +Dumbo listening on port 8000… +POST /webhook/b7ad8661a006195b317985d922b2ff37ebe8beac9a8f9cfe4ba0a177848c5e96e75ff926de82e87943ea79dca533cdc7 204 - - 13.781 ms +GET /alice/outbox 200 40582 - 2.251 ms +$ curl -s http://localhost:8000/alice/outbox | python -c 'import yaml, json, sys; print(yaml.dump(json.load(sys.stdin)))' +``` + +- It's alive! :) + +```yaml +'@context': https://www.w3.org/ns/activitystreams +id: http://localhost:8000/alice/outbox +orderedItems: +- '@context': https://www.w3.org/ns/activitystreams + actor: http://localhost:8000/alice + cc: [] + id: http://localhost:8000/alice/posts/48c61646-1538-471b-92e1-4d30a7337336 + object: + attributedTo: http://localhost:8000/alice + cc: + - http://localhost:8000/alice/followers + content: "{\"action\":\"edited\",\"comment\":{\"id\":4794771,\"node_id\":\"DC_kwDOCOlgGM4ASSmT\"\ + ,\"html_url\":\"https://github.com/intel/dffml/discussions/1406#discussioncomment-4794771\"\ + ,\"parent_id\":4794098,\"child_comment_count\":0,\"repository_url\":\"intel/dffml\"\ + ,\"discussion_id\":4225995,\"author_association\":\"MEMBER\",\"user\":{\"login\"\ + :\"pdxjohnny\",\"id\":5950433,\"node_id\":\"MDQ6VXNlcjU5NTA0MzM=\",\"avatar_url\"\ + :\"https://avatars.githubusercontent.com/u/5950433?v=4\",\"gravatar_id\":\"\"\ + ,\"url\":\"https://api.github.com/users/pdxjohnny\", + :\"https://api.github.com/users/pdxjohnny/gists{/gist_id}\",\"starred_url\"\ + ,\"type\":\"User\",\"site_admin\":false}}" + id: http://localhost:8000/alice/post/58688c80-f982-4dc0-a676-34c955c4a4cd + published: '2023-01-27T17:49:23.949Z' + to: + - https://www.w3.org/ns/activitystreams#Public + type: Note + published: '2023-01-27T17:49:23.000Z' + to: + - 
https://www.w3.org/ns/activitystreams#Public
+  type: Create
+totalItems: 1
+type: OrderedCollection
+```
+
+- https://stedolan.github.io/jq/manual/
+- https://stackoverflow.com/questions/38061346/jq-output-array-of-json-objects
+
+**schema/alice/shouldi/contribute/0.0.0.schema.json**
+
+```json
+{
+    "$id": "https://github.com/intel/dffml/raw/alice/schema/alice/shouldi/contribute/0.0.0.schema.json",
+    "$schema": "https://json-schema.org/draft/2020-12/schema",
+    "description": "Schema for Alice Should I Contribute? Gatekeeper",
+    "properties": {
+        "$schema": {
+            "type": "string"
+        },
+        "community_health_check": {
+            "description": "Community Health Check",
+            "$ref": "#/definitions/community_health_check"
+        }
+    },
+    "additionalProperties": false,
+    "required": [
+        "$schema",
+        "community_health_check"
+    ],
+    "definitions": {
+        "community_health_check": {
+            "type": "object",
+            "properties": {
+                "has_support": {
+                    "description": "FileSupportPresent",
+                    "type": "boolean",
+                    "enum": [true]
+                }
+            },
+            "additionalProperties": false,
+            "required": [
+                "has_support"
+            ]
+        }
+    }
+}
+```
+
+- Playing with output operation as schema validation to assist with data model alignment
+
+```console
+$ alice shouldi contribute -keys https://github.com/pdxjohnny/httptest | tee dffml_list_records_stdout.json
+[████████████░░░░░░░░░░░░░░░░░░░░░░░░░░░░] Running CodeNarc for 29s
+```
+
+**dffml_list_records_stdout.json**
+
+```json
+[
+    {
+        "extra": {},
+        "features": {
+            "ActionsValidatorBinary": [],
+            "CodeNarcServerProc": [],
+            "FileCodeOfConductPresent": [
+                false
+            ],
+            "FileContributingPresent": [
+                false
+            ],
+            "FileReadmePresent": [
+                true
+            ],
+            "FileSecurityPresent": [
+                false
+            ],
+            "FileSupportPresent": [
+                false
+            ],
+            "GitHubActionsWorkflowUnixStylePath": [
+                ".github/workflows/tests.yml",
+                ".github/workflows/release.yml"
+            ],
+            "HasDocs": [
+                {
+                    "example": false,
+                    "known issues": false,
+                    "readme_present": true,
+                    "support": true,
+                    "usage": true
+                }
+            ],
+            "JavaBinary":
[], + "NPMGroovyLintCMD": [], + "NPMGroovyLintResult": [ + { + "files": {}, + "summary": { + "detectedRules": {}, + "fixedRules": {}, + "totalFilesLinted": 0, + "totalFilesWithErrorsNumber": 0, + "totalFixedErrorNumber": 0, + "totalFixedInfoNumber": 0, + "totalFixedNumber": 0, + "totalFixedWarningNumber": 0, + "totalFoundErrorNumber": 0, + "totalFoundInfoNumber": 0, + "totalFoundNumber": 0, + "totalFoundWarningNumber": 0, + "totalRemainingErrorNumber": 0, + "totalRemainingInfoNumber": 0, + "totalRemainingNumber": 0, + "totalRemainingWarningNumber": 0 + } + } + ], + "RepoDirectory": [ + "/tmp/dffml-feature-git-zcv0u_6h" + ], + "URL": [], + "author_count": [ + 0 + ], + "author_line_count": [ + {} + ], + "commit_count": [ + 0 + ], + "date": [ + "2023-01-27 19:15" + ], + "date_pair": [ + [ + "2023-01-27 19:15", + "2022-10-27 19:15" + ] + ], + "git_branch": [], + "git_commit": [ + "0486a73dcadafbb364c267e5e5d0161030682599" + ], + "git_remote": [], + "git_repository": [], + "git_repository_checked_out": [ + { + "URL": "https://github.com/pdxjohnny/httptest", + "commit": "0486a73dcadafbb364c267e5e5d0161030682599", + "directory": "/tmp/dffml-feature-git-zcv0u_6h" + } + ], + "quarter": [], + "quarter_start_date": [], + "release_within_period": [ + false + ], + "str": [], + "valid_git_repository_URL": [], + "work_spread": [ + 0 + ] + }, + "key": "https://github.com/pdxjohnny/httptest", + "last_updated": "2023-01-27T19:16:37Z" + } +] +``` + +```console +$ jq '.[].features | {repo_url: .git_repository_checked_out[0].URL, community_health_check: {has_support: (if .FileSupportPresent then .FileSupportPresent[0] else false end)}}' dffml_list_records_stdout.json | jq -s +[ + { + "repo_url": "https://github.com/pdxjohnny/httptest", + "community_health_check": { + "has_support": false + } + } +] +``` + +- https://github.com/intel/dffml/blob/alice/docs/arch/0008-Manifest.md +- https://github.com/intel/dffml/blob/alice/docs/arch/0010-Schema.rst + +```console +$ jsonschema --instance 
<(jq '.[].features | {repo_url: .git_repository_checked_out[0].URL, community_health_check: {has_support: (if .FileSupportPresent then .FileSupportPresent[0] else false end)}}' dffml_list_records_stdout.json | jq -s | jq '.[0]') 0.0.0.schema.json +False: False is not one of [True] +{'repo_url': 'https://github.com/pdxjohnny/httptest', 'community_health_check': {'has_support': False}}: Additional properties are not allowed ('repo_url' was unexpected) +{'repo_url': 'https://github.com/pdxjohnny/httptest', 'community_health_check': {'has_support': False}}: '$schema' is a required property +``` + +- We can leverage the GitHub CLI webhook proxy to bypass static registration + - We can have periodically scheduled jobs on runners we add which just sit and translate + - [![hack-the-planet](https://img.shields.io/badge/hack%20the-planet-blue)](https://github.com/intel/dffml/blob/alice/docs/tutorials/rolling_alice/0000_easter_eggs.md#hack-the-planet-) +- https://stackoverflow.com/questions/22429744/how-to-setup-route-for-websocket-server-in-express + - Looking to decouple listening for events via websocket + - https://github.com/vi/websocat + - https://github.com/websockets/ws#server-broadcast + - https://github.com/websockets/ws#how-to-detect-and-close-broken-connections + - https://github.com/websockets/ws#client-authentication + - https://github.com/LionC/express-basic-auth +- https://github.com/jakelazaroff/activitypub-starter-kit/commit/ca1ac728af3eaa1cc8f7f0af201e398bc6a1b3ec + - Basic no auth Websocket inbox rebroadcast to connected clients + +```console +$ curl -fLo websocat https://github.com/vi/websocat/releases/download/v1.11.0/websocat.x86_64-unknown-linux-musl +$ chmod 755 websocat +$ mv websocat ~/.bin/ +$ websocat --exit-on-eof ws://localhost:8000/listen/websocket & 
+{"@context":"https://www.w3.org/ns/activitystreams","type":"Create","published":"2023-01-28T00:06:07.286Z","actor":"http://localhost:8000/alice","to":["https://www.w3.org/ns/activitystreams#Public"],"cc":["http://localhost:8000/alice"],"object":{"attributedTo":"http://localhost:8000/alice","published":"2023-01-28T00:06:07.286Z","to":["https://www.w3.org/ns/activitystreams#Public"],"cc":["http://localhost:8000/alice/followers"],"type":"Note","content":"Alice is Here!","id":"http://localhost:8000/alice/post/493e970e-ca9f-43ce-97e3-453c6677ecf0"},"id":"http://localhost:8000/alice/post/3ed6a6f4-4da0-4386-9faf-6eaec0d83240"} +$ curl -u alice:$(cat ../password) -X POST -v http://localhost:8000/admin/follow/alice/localhost/8000/http +$ curl -u alice:$(cat ../password) -X POST --header "Content-Type: application/json" --data @post.json -v http://localhost:8000/admin/create +``` + +- Success, now to add auth to WebSocket connection + - https://spdx.dev/ids/ + - https://github.com/LionC/express-basic-auth/blob/dd17b4de9fee9558269cdc583310bde5331456e7/index.js#L1-L17 + - https://github.com/jshttp/basic-auth#example + - https://stackoverflow.com/questions/63552689/how-to-deal-with-server-handleupgrade-was-called-more-than-once-in-nodejs + +**post.json** + +```json +{ + "object": { + "type": "Note", + "content": "Alice is Here!" 
+ } +} +``` + +```console +$ rm -f db/database.sqlite3; PROTO=http HOSTNAME=localhost WEBHOOK_PATH=$(cat ../webhook) NODE_ENV=production PORT=8000 ACCOUNT=alice ADMIN_USERNAME=alice ADMIN_PASSWORD=$(cat ../password) PUBLIC_KEY=$(cat publickey.crt) PRIVATE_KEY=$(cat pkcs8.key) npm run dev & +$ websocat --exit-on-eof ws://localhost:8000/listen/websocket +websocat: WebSocketError: WebSocketError: Received unexpected status code (401 Unauthorized) +websocat: error running +$ websocat --exit-on-eof --basic-auth alice:alice ws://localhost:8000/listen/websocket +websocat: WebSocketError: WebSocketError: Received unexpected status code (401 Unauthorized) +websocat: error running +$ websocat --exit-on-eof --basic-auth alice:$(cat ../password) ws://localhost:8000/listen/websocket & +{"@context":"https://www.w3.org/ns/activitystreams","id":"http://localhost:8000/a0265dc0-e781-4f5b-89dd-0e1c36454a37","type":"Accept","actor":"http://localhost:8000/alice","object":{"@context":"https://www.w3.org/ns/activitystreams","id":"http://localhost:8000/@914e5adf-e47d-4c2a-a4be-48546081b6be","type":"Follow","actor":"http://localhost:8000/alice","object":"http://localhost:8000/alice"}} +{"@context":"https://www.w3.org/ns/activitystreams","id":"http://localhost:8000/@914e5adf-e47d-4c2a-a4be-48546081b6be","type":"Follow","actor":"http://localhost:8000/alice","object":"http://localhost:8000/alice"} +{"@context":"https://www.w3.org/ns/activitystreams","type":"Create","published":"2023-01-28T00:52:56.799Z","actor":"http://localhost:8000/alice","to":["https://www.w3.org/ns/activitystreams#Public"],"cc":["http://localhost:8000/alice"],"object":{"attributedTo":"http://localhost:8000/alice","published":"2023-01-28T00:52:56.799Z","to":["https://www.w3.org/ns/activitystreams#Public"],"cc":["http://localhost:8000/alice/followers"],"type":"Note","content":"Alice is 
Here!","id":"http://localhost:8000/alice/post/3479f6f3-5d8c-48e0-96ea-626760fb6388"},"id":"http://localhost:8000/alice/post/2afd800d-07a6-402e-8585-873e3989ba5e"}
+$ curl -u alice:$(cat ../password) -X POST -v http://localhost:8000/admin/follow/alice/localhost/8000/http
+$ curl -u alice:$(cat ../password) -X POST --header "Content-Type: application/json" --data @post.json -v http://localhost:8000/admin/create
+```
+
+- We have liftoff on WebSocket auth!
+  - https://github.com/jakelazaroff/activitypub-starter-kit/commit/4e8f9f541bffabe6ab5b0ffe1206d1d9337b5185
+- With the account following itself, all listeners connected to `/listen/websocket`
+  will be notified when the account sends a post.
+  - Friends, today is a great day! 🛤️
+- Playing with data in websocket listener stream
+
+```console
+$ websocat --exit-on-eof --basic-auth alice:$(cat ../password) ws://localhost:8000/listen/websocket | jq --unbuffered -r .
+```
+
+```json
+{
+  "@context": "https://www.w3.org/ns/activitystreams",
+  "id": "http://localhost:8000/8f82f22b-28b9-4e16-9c88-9891922253b1",
+  "type": "Accept",
+  "actor": "http://localhost:8000/alice",
+  "object": {
+    "@context": "https://www.w3.org/ns/activitystreams",
+    "id": "http://localhost:8000/@51e24f61-e594-4cbd-87e1-c6e121e79a2a",
+    "type": "Follow",
+    "actor": "http://localhost:8000/alice",
+    "object": "http://localhost:8000/alice"
+  }
+}
+{
+  "@context": "https://www.w3.org/ns/activitystreams",
+  "id": "http://localhost:8000/@51e24f61-e594-4cbd-87e1-c6e121e79a2a",
+  "type": "Follow",
+  "actor": "http://localhost:8000/alice",
+  "object": "http://localhost:8000/alice"
+}
+{
+  "@context": "https://www.w3.org/ns/activitystreams",
+  "type": "Create",
+  "published": "2023-01-28T01:24:04.873Z",
+  "actor": "http://localhost:8000/alice",
+  "to": [
+    "https://www.w3.org/ns/activitystreams#Public"
+  ],
+  "cc": [
+    "http://localhost:8000/alice"
+  ],
+  "object": {
+    "attributedTo": "http://localhost:8000/alice",
+    "published": "2023-01-28T01:24:04.873Z",
+    "to": [
+      "https://www.w3.org/ns/activitystreams#Public"
+    ],
+    "cc": [
+      "http://localhost:8000/alice/followers"
+    ],
+    "type": "Note",
+    "content": "Alice is Here!",
+    "id": "http://localhost:8000/alice/posts/ac466e40-a7ac-4815-963f-fc419b821f74"
+  },
+  "id": "http://localhost:8000/alice/posts/78118a66-52a4-402d-ad2e-b6ae79997f57"
+}
+```
+
+- When querying the published URLs, we found that `post/` should be `posts/`
+  - https://www.w3.org/TR/activitypub/
+  - https://github.com/jakelazaroff/activitypub-starter-kit/commit/3999fc0f722168b98f6f28fcb2d8521ca600d53e
+- Example of resolving each post received from any followed account (could do this with the content address within the body)
+  - https://unix.stackexchange.com/questions/435413/using-jq-within-pipe-chain-produces-no-output
+
+```console
+$ websocat --exit-on-eof --basic-auth alice:$(cat ../password) ws://localhost:8000/listen/websocket | jq --unbuffered -r .object.id | xargs -l -I '{}' -- sh -c "curl -sfL '{}' | jq -r" &
+{
+  "id": "http://localhost:8000/alice/posts/b60924b2-e1dd-4bf1-92bd-a374623064ba",
+  "contents": "{\"attributedTo\":\"http://localhost:8000/alice\",\"published\":\"2023-01-28T01:28:24.336Z\",\"to\":[\"https://www.w3.org/ns/activitystreams#Public\"],\"cc\":[\"http://localhost:8000/alice/followers\"],\"type\":\"Note\",\"content\":\"Alice is Here!\"}",
+  "created_at": "2023-01-28 01:28:24",
+  "createdAt": "2023-01-28T01:28:24.000Z"
+}
+$ curl -u alice:$(cat ../password) -X POST --header "Content-Type: application/json" --data @post.json -v http://localhost:8000/admin/create
+```
+
+- Playing with streaming to YAML for readability
+
+```console
+$ websocat --exit-on-eof --basic-auth alice:$(cat ../password) ws://localhost:8000/listen/websocket | python -uc "import sys, pathlib, json, yaml; list(map(print, map(yaml.dump, map(json.loads, sys.stdin))))"
+```
+
+```yaml
+'@context': https://www.w3.org/ns/activitystreams
+actor: http://localhost:8000/alice
+cc:
+- http://localhost:8000/alice
+id:
http://localhost:8000/alice/posts/ec323059-2b02-49d8-99fe-4f6518f19c95 +object: + attributedTo: http://localhost:8000/alice + cc: + - http://localhost:8000/alice/followers + content: Alice is Here! + id: http://localhost:8000/alice/posts/018b199a-5131-40ef-9862-0d989f3ec636 + published: '2023-01-28T01:32:46.989Z' + to: + - https://www.w3.org/ns/activitystreams#Public + type: Note +published: '2023-01-28T01:32:46.989Z' +to: +- https://www.w3.org/ns/activitystreams#Public +type: Create + +'@context': https://www.w3.org/ns/activitystreams +actor: http://localhost:8000/alice +cc: +- http://localhost:8000/alice +id: http://localhost:8000/alice/posts/d3e1b1e8-bf77-4005-8f29-fab3bc2c6670 +object: + attributedTo: http://localhost:8000/alice + cc: + - http://localhost:8000/alice/followers + content: Alice is Here! + id: http://localhost:8000/alice/posts/7dcfed5a-5236-4fc2-91a7-bfe79b8540ba + published: '2023-01-28T01:32:49.343Z' + to: + - https://www.w3.org/ns/activitystreams#Public + type: Note +published: '2023-01-28T01:32:49.343Z' +to: +- https://www.w3.org/ns/activitystreams#Public +type: Create +``` + +```console +$ curl -s http://localhost:8000/alice | jq -r +``` + +```json +{ + "@context": [ + "https://www.w3.org/ns/activitystreams", + "https://w3id.org/security/v1" + ], + "id": "http://localhost:8000/alice", + "type": "Person", + "preferredUsername": "alice", + "inbox": "http://localhost:8000/alice/inbox", + "outbox": "http://localhost:8000/alice/outbox", + "followers": "http://localhost:8000/alice/followers", + "following": "http://localhost:8000/alice/following", + "publicKey": { + "id": "http://localhost:8000/alice#main-key", + "owner": "http://localhost:8000/alice", + "publicKeyPem": "-----BEGIN PUBLIC 
KEY-----\nMIICIjANBgkqhkiG9w0BAQEFAAOCAg8AMIICCgKCAgEAozVUsUl3mXxhSJbTGW8K\naOrSzcx7FnZij6Qc5jRmuiGKUlQbwHojhcwQUMkVYioVZR1hK80rKT9FXndDYpjo\nB6O1z92TRYBiwpz2T5VR/1oqtB2j8ajGJbG43wuMvi3f5YYMzl7cySpzwRDCZSzA\njryz7zDBwEu17d912ufUqT7TAbcoGbLx8yM0ONtIDi89WnXZNQk1C3issO2pb/n9\nYtAaXlrsrTeB99IY6I1G9qnq00NkSR2XW6R6+GDFWV2wcu61XKXvMT4g2U6HibrL\nLIVmWv+hPIvvLWweCNpg74gnq8DLa/TMjkt0Q6UImuG3Iwdbg29KOdhS98MmrttR\nRq8ljsttwfwqqyLRZFNQuW2v1ZxwC0BB7XomhkJgdHCIOWGeAULxRlQarlFstT6f\nGaNSlVbcHoKDX6j+XckF+13prsRzWrZxM44v2zw8Yx2oh7LJKcvFdqow8TZBG+Yn\naO6w1Wel2+n92iaOC0oU+sgxtfBvECebzMM94YPB58Ja3hlbIz627Ut+v/TDXHmV\njxueufw285GpSI7GmsZihcdB5eBMIDE0UKnvNbqc+TncoTUXAIxXs7cvnEHusAmM\nONxtxXlRNOSfKaJ/PWkVwa3NvPrd4oeIJWdLRppNd5mYA1i2CkPdd5lBAiMWwk2A\nzP5Hrjlf3/QyZe7mHQAfvjkCAwEAAQ==\n-----END PUBLIC KEY-----" + } +} +``` + +- https://www.w3.org/wiki/SocialCG/ActivityPub/MediaUpload +- https://www.w3.org/TR/activitystreams-vocabulary/#dfn-person + - Person inherits from Object + - https://www.w3.org/TR/activitystreams-vocabulary/#dfn-summary + - Object has property summary, which is currently not present in `Person` for user `alice`. 
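The actor document above can be captured in a small helper, with `summary` as the inherited Object property the server currently omits. A sketch with a hypothetical `make_person` helper (not part of the starter kit):

```python
def make_person(base_url, account, public_key_pem, summary=None):
    """Build an ActivityPub Person document shaped like the /alice
    response above, optionally including the `summary` Object property."""
    actor_id = f"{base_url}/{account}"
    person = {
        "@context": [
            "https://www.w3.org/ns/activitystreams",
            "https://w3id.org/security/v1",
        ],
        "id": actor_id,
        "type": "Person",
        "preferredUsername": account,
        "inbox": f"{actor_id}/inbox",
        "outbox": f"{actor_id}/outbox",
        "followers": f"{actor_id}/followers",
        "following": f"{actor_id}/following",
        "publicKey": {
            "id": f"{actor_id}#main-key",
            "owner": actor_id,
            "publicKeyPem": public_key_pem,
        },
    }
    # `summary` is optional on Object, so only emit it when supplied
    if summary is not None:
        person["summary"] = summary
    return person
```

Since `summary` is optional, existing consumers of the actor document should be unaffected when it starts carrying a manifest reference.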
+- https://www.rfc-editor.org/rfc/rfc9116#name-contact + - Proposed extension + - `Contact: https://example.com/security-contact.html` + - In the event that the contact URL resolves to an ActivityPub `Person` + - https://example.social/@security-contact-handle + - In the event that the `summary` field for that `Person` is a manifest + - https://github.com/intel/dffml/raw/alice/schema/security/vulnerability-disclosure-program/0.0.0.schema.json + - Fulfill the manifest in alignment with its ADR as needed within context + - Possible fields in sketch below + - Later option of DIDs instead of ActivityPub + - Consider TOML option since it is easier to parse + - https://toml.io/en/ + +**schema/security/vulnerability-disclosure-program/example-pass.yaml** + +```yaml +$schema: "https://github.com/intel/dffml/raw/alice/schema/security/vulnerability-disclosure-program/0.0.0.schema.json" +sbom_feed: +- "https://example.social/@security-contact-handle-sbom-feed" +vex_feed: +- "https://example.social/@security-contact-handle-vex-feed" +vdr_feed: +- "https://example.social/@security-contact-handle-vdx-feed" +vcs_feed: +- "https://example.social/@dev-contact-handle-git-feed" +deploy_feed: +- "https://example.social/@ops-contact-handle-deploy-feed" +``` + +- We could have these be `attachment`s to the `Person` which our server is acting on behalf of. +- `sbom_feed`s can be downstream of `FROM` rebuild streams. 
 + - #1426 +- Expand on ActivityPub spec EXAMPLE 4 `inReplyTo` with the addition of a `/admin/reply` `POST` handler to reply to a `Note` + +> ```json +> {"@context": "https://www.w3.org/ns/activitystreams", +> "type": "Create", +> "id": "https://chatty.example/ben/p/51086", +> "to": ["https://social.example/alyssa/"], +> "actor": "https://chatty.example/ben/", +> "object": {"type": "Note", +> "id": "https://chatty.example/ben/p/51085", +> "attributedTo": "https://chatty.example/ben/", +> "to": ["https://social.example/alyssa/"], +> "inReplyTo": "https://social.example/alyssa/posts/49e2d03d-b53a-4c4c-a95c-94a6abf45a19", +> "content": "
<p>Argh, yeah, sorry, I'll get it back to you tomorrow.</p> +> <p>I was reviewing the section on register machines, +> since it's been a while since I wrote one.</p>
"}} +> ``` + +- Bailing on this for now and just going to spin a separate subdomain feed for webhooks -> vcs feed + - `@push@git.vcs.dffml.org` + +```patch +diff --git a/src/activitypub.ts b/src/activitypub.ts +index a6a90e4..a89b67e 100644 +--- a/src/activitypub.ts ++++ b/src/activitypub.ts +@@ -12,7 +12,7 @@ import { + listPosts, + updateFollowing, + } from "./db.js"; +-import { HOSTNAME, PORT, ACCOUNT, PUBLIC_KEY, PROTO, FDQN } from "./env.js"; ++import { HOSTNAME, PORT, ACCOUNT, PUBLIC_KEY, PROTO, FDQN, SECURITY_TXT_CONTACT_VSC_FEED } from "./env.js"; + import { send, verify } from "./request.js"; + + export const activitypub = Router(); +@@ -168,6 +168,15 @@ activitypub.get("/:actor/following", async (req, res) => { + }); + }); + ++const security_txt_contact_vsc_feed_note = createPost({ ++ attributedTo: actor, ++ published: date.toISOString(), ++ to: ["https://www.w3.org/ns/activitystreams#Public"], ++ cc: [`${actor}/followers`], ++ type: "Note", ++ content: (SECURITY_TXT_CONTACT_VSC_FEED !== null) ? 
SECURITY_TXT_CONTACT_VSC_FEED : "N/A", ++}); ++ + activitypub.get("/:actor", async (req, res) => { + const actor: string = req.app.get("actor"); + +@@ -185,11 +194,15 @@ activitypub.get("/:actor", async (req, res) => { + outbox: `${actor}/outbox`, + followers: `${actor}/followers`, + following: `${actor}/following`, ++ summary: ``, + publicKey: { + id: `${actor}#main-key`, + owner: actor, + publicKeyPem: PUBLIC_KEY, + }, ++ attachment: [ ++ security_txt_contact_vsc_feed_note, ++ ] + }); + }); + +diff --git a/src/env.ts b/src/env.ts +index 3d1eb0f..ae24b27 100644 +--- a/src/env.ts ++++ b/src/env.ts +@@ -4,6 +4,7 @@ import dotenv from "dotenv"; + + dotenv.config(); + ++export const SECURITY_TXT_CONTACT_VSC_FEED = process.env.SECURITY_TXT_CONTACT_VSC_FEED || null; + export const WEBHOOK_PATH = process.env.WEBHOOK_PATH || "webhook"; + export const FDQN = process.env.FDQN || null; + export const PROTO = process.env.PROTO || "https"; +``` + +- Start `SECURITY_TXT_CONTACT_VSC_FEED` at known location + - Update `security.txt` in repo with `SECURITY_TXT_CONTACT_VSC_FEED` as + the `Contact` URL. +- Analysis of repo with `security.txt` pointed to `SECURITY_TXT_CONTACT_VSC_FEED` + - If the repo is a dependency of a downstream repo we care about. + - We care about it if resources within the repo are relevant to the downstream + repo's `FROM` rebuild chain + - Example: action-validator cargo build for `alice shouldi contribute` + - Some base images require this be rebuilt + - Two localhost.run subprocesses with `dffml.Subproces.STDOUT_READLINE` event + - Start one 30 seconds after the other + - Every time we get issued a new URL + - For the server whose address changed, send an unfollow to + the `SECURITY_TXT_CONTACT_VSC_FEED` being watched for new `push` events. + - Start a new ActivityPub server for the new domain. + - Send a follow request to the `SECURITY_TXT_CONTACT_VSC_FEED` for the + new domain. 
 + - Start `websocat` to listen for new events using websocket listener API + - Trigger rebuilds of container images using container image manifest + and `workflow_dispatch` for any containers which need to be rebuilt + due to a broadcast VSC `push` event, later for `deploy` container image + `push` events from registry. + - https://docs.github.com/developers/webhooks-and-events/webhooks/webhook-events-and-payloads#push + - https://goharbor.io/docs/1.10/working-with-projects/project-configuration/configure-webhooks/ + - If a `vsc.push` event results in a repo having something we know how to + help with (`alice shouldi contribute` -> `alice please contribute`), + then we can raise an issue or pull request as appropriate. + - If we find a vuln, log it in a SCITT registry via self-notarization or otherwise and + `inReplyTo` the place the vuln exists. + - We should `inReplyTo` when we start analysis so we can watch for other replies and see + what other entities are running analysis jobs. We should then deduplicate based off + analysis (dataflow) content address. Decentralized actors should be enabled to + communicate with each other so that not all running jobs drop in the event that multiple + of the same job are launched at the same time (see IPVM caching). + - https://www.w3.org/TR/activitystreams-vocabulary/#dfn-replies + - https://www.w3.org/TR/activitystreams-vocabulary/#dfn-attachment + - Could use pinned post semantics and then `inReplyTo` to those, parse pinned + post content body and attachment to understand what the post is for. + Or could have two attachments, an image (screenshot as universal API). + Content is the content address of the manifest for attachments and own doc. 
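The contact-discovery chain above (RFC 9116 `security.txt` `Contact` field, resolved to an ActivityPub actor via webfinger) could be sketched roughly as follows. This is an illustrative sketch only: `parse_contacts` and `webfinger_resource` are made-up helper names, and the `https://host/@handle` profile-URL convention is an assumption borrowed from common fediverse practice, not part of RFC 9116.

```python
import re
from typing import List, Optional
from urllib.parse import urlparse


def parse_contacts(security_txt: str) -> List[str]:
    """Collect the values of Contact fields from RFC 9116 security.txt text."""
    contacts = []
    for line in security_txt.splitlines():
        line = line.strip()
        if line.startswith("#"):  # full-line comments per RFC 9116
            continue
        if line.lower().startswith("contact:"):
            contacts.append(line.split(":", 1)[1].strip())
    return contacts


def webfinger_resource(profile_url: str) -> Optional[str]:
    """Map an https://host/@handle style profile URL to an acct: resource
    suitable for a /.well-known/webfinger?resource= lookup."""
    parsed = urlparse(profile_url)
    handle = re.fullmatch(r"/@([A-Za-z0-9_.-]+)", parsed.path)
    if parsed.scheme == "https" and parsed.netloc and handle:
        return f"acct:{handle.group(1)}@{parsed.netloc}"
    return None  # not an ActivityPub-style profile URL (e.g. mailto:, tel:)
```

An implementation would then `GET https://<host>/.well-known/webfinger?resource=<acct>` and follow the `self` link to fetch the actor document whose `summary`/`attachment` carries the feed manifest.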
+- Making some demo gifs + - https://github.com/charmbracelet/vhs#continuous-integration + +```console +$ curl -sfL https://github.com/charmbracelet/vhs/releases/download/v0.2.0/vhs_0.2.0_Linux_x86_64.tar.gz | tar xvz +LICENSE +README.md +completions/vhs.bash +completions/vhs.fish +completions/vhs.zsh +manpages/vhs.1.gz +vhs +$ echo 'Output demo.gif' > test.vhs; ./vhs record >> test.vhs +$ echo Hello World +Hello World +$ (Ctrl+D) exit +$ cat test.vhs +Output demo.gif +Sleep 500ms +Type "echo Hello" +Sleep 500ms +Type " World" +Enter +Ctrl+D +$ ./vhs < test.vhs +ttyd is not installed. Install it from: https://github.com/tsl0922/ttyd +$ ssh vhs.example.com < test.vhs > demo.gif +$ curl -sfLo ttyd https://github.com/tsl0922/ttyd/releases/download/1.7.3/ttyd.x86_64 +$ chmod 755 ttyd +$ mv ttyd ~/.local/bin/ +$ ./vhs < test.vhs +[launcher.Browser]2023/01/28 05:03:55 try to find the fastest host to download the browser binary +[launcher.Browser]2023/01/28 05:03:55 check https://storage.googleapis.com/chromium-browser-snapshots/Linux_x64/1033860/chrome-linux.zip +[launcher.Browser]2023/01/28 05:03:55 check https://registry.npmmirror.com/-/binary/chromium-browser-snapshots/Linux_x64/1033860/chrome-linux.zip +[launcher.Browser]2023/01/28 05:03:55 check https://playwright.azureedge.net/builds/chromium/1033860/chromium-linux-arm64.zip +``` + +- TODO + - [ ] Redirect CodeNarc stderr +- Future + - [ ] DIDme.me for An Image for auto conversion into screenshot YAML manifest for downstreams + - [ ] Bridge us to DWNs + - https://identity.foundation/decentralized-web-node/spec/#messages + - [ ] DWN or activitypub channel helpers + - WebRTC comms between endpoints + - DERP ad-hoc \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0161/index.md b/docs/discussions/alice_engineering_comms/0161/index.md new file mode 100644 index 0000000000..f700cfe5bb --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0161/index.md @@ -0,0 +1,3 @@ +# 2023-01-28 
Engineering Logs + +- https://www.chainguard.dev/unchained/understanding-the-promise-of-vex \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0161/reply_0000.md b/docs/discussions/alice_engineering_comms/0161/reply_0000.md new file mode 100644 index 0000000000..d28928dbe7 --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0161/reply_0000.md @@ -0,0 +1,17 @@ +## 2023-01-28 @pdxjohnny Engineering Logs + +- https://huggingface.co/spaces/ivelin/ui-refexp +- https://hyperonomy.com/2022/12/18/web-7-0-didcomm-agent-architecture-reference-model-didcomm-arm-0-40-december-18-2022/ + - DIDComm messaging + - DWN + - > ![8C737CB0-7E3F-44B4-AC9D-9533A20F87E4](https://user-images.githubusercontent.com/5950433/215521752-4c8adc97-5bf3-42f8-89b5-308fcc025800.jpeg) + - > ![ABDA7EB2-288D-4A13-9E3D-2C3F13C84CD9](https://user-images.githubusercontent.com/5950433/215531114-25808875-9500-4106-99df-a2a528113998.jpeg) +- https://github.com/FahimF/summarizer +- Actor discovery via notary receipt for OIDC for workflow (see recent linked SPDX issue) +- https://github.com/chainguard-dev/vex + - Woohoo! 
 + - https://www.chainguard.dev/unchained/putting-vex-to-work + - https://www.chainguard.dev/unchained/understanding-the-promise-of-vex + - https://www.chainguard.dev/unchained/reflections-on-trusting-vex-or-when-humans-can-improve-sboms +- https://spsdk.readthedocs.io/en/latest/examples/general.html +- https://github.com/CycloneDX/specification/pull/180 \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0162/index.md b/docs/discussions/alice_engineering_comms/0162/index.md new file mode 100644 index 0000000000..58a6e91e74 --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0162/index.md @@ -0,0 +1 @@ +# 2023-01-29 Engineering Logs \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0162/reply_0000.md b/docs/discussions/alice_engineering_comms/0162/reply_0000.md new file mode 100644 index 0000000000..0645d96adf --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0162/reply_0000.md @@ -0,0 +1,10 @@ +## 2023-01-29 @pdxjohnny Engineering Logs + +- Alice helps us see risk over time and its relationship to our security lifecycle posture +- For #1247, the webhook PAT for runner adding could send an ActivityPub message with a new request for a runner token to the SECURITY_TXT actor (or similar), then send the runner token over a WebRTC data channel (this avoids the risk of end-to-end encrypted data being cached and broken in the future) + - http://blog.printf.net/articles/2013/05/17/webrtc-without-a-signaling-server/ +- https://socialhub.activitypub.rocks/t/clarify-relation-of-socialhub-versus-fep-repository/2909 +- https://socialhub.activitypub.rocks/t/fep-c390-identity-proofs/2726 + - DID and VC alignment + - > Identity proof is a JSON document that represents a verifiable bi-directional link between a [Decentralized Identifier](https://www.w3.org/TR/did-core/) and an ActivityPub actor. 
+ - https://socialhub.activitypub.rocks/t/fep-c390-identity-proofs/2726/8 \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0163/index.md b/docs/discussions/alice_engineering_comms/0163/index.md new file mode 100644 index 0000000000..afb75e2206 --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0163/index.md @@ -0,0 +1 @@ +# 2023-01-30 Engineering Logs \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0163/reply_0000.md b/docs/discussions/alice_engineering_comms/0163/reply_0000.md new file mode 100644 index 0000000000..712aa024a3 --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0163/reply_0000.md @@ -0,0 +1,540 @@ +## 2023-01-30 Engineering Logs + +- https://www.oasis-open.org/2022/11/21/new-version-of-csaf-standard/ +- Entity Analysis Trinity (EAT) - Behavioral Analysis - Telemetry + - https://docs.influxdata.com/telegraf/ + - https://collectd.org/ + - https://github.com/delimitrou/DeathStarBench + - NUMA aware topologies **TODO** link +- https://slsa.dev/provenance/v0.2#example +- https://github.com/CLIP-HPC/SlurmCommander +- https://github.com/fathyb/carbonyl + - > Carbonyl is a Chromium based browser built to run in a terminal. + - https://github.com/mholt/caddy-l4 + - https://github.com/charmbracelet/wishlist + - https://github.com/charmbracelet/wish + - https://github.com/hackerschoice/segfault + - https://github.com/intel/dffml/pull/1207#discussion_r1036680987 +- https://github.com/CycloneDX/specification/pull/180/files#diff-fae062e182d2604bfaeba757d7d099f1de3b712fa4aea687961ca92df285b39bR192 + - https://dnssecuritytxt.org/ + - > Specifies a way to contact the maintainer, supplier, or provider in the event of a security incident. Common URIs include links to a disclosure procedure, a mailto (RFC-2368) that specifies an email address, a tel (RFC-3966) that specifies a phone number, or dns (RFC-4501]) that specifies the records containing DNS Security TXT. 
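The `Contact` URI schemes named in the quoted CycloneDX wording (a plain link to a disclosure procedure, `mailto` per RFC 2368, `tel` per RFC 3966, and `dns` per RFC 4501) can be distinguished by scheme alone. A minimal sketch; the name `classify_contact` is made up for illustration:

```python
from urllib.parse import urlparse

# Schemes named in the proposed securityContact wording: web links,
# mailto (RFC 2368), tel (RFC 3966), and dns (RFC 4501).
KNOWN_CONTACT_SCHEMES = {"https", "http", "mailto", "tel", "dns"}


def classify_contact(uri: str) -> str:
    """Return the contact URI's scheme if it is one we know how to handle."""
    scheme = urlparse(uri).scheme.lower()
    return scheme if scheme in KNOWN_CONTACT_SCHEMES else "unknown"
```

A consumer could dispatch on the returned scheme, e.g. only attempting ActivityPub actor resolution for `https` contacts.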
+- https://csarven.ca/web-science-from-404-to-200#be-the-change-towards-linked-research +- https://csarven.ca/linked-research-decentralised-web + - We want Alice to carry out the scientific process + - https://linkedresearch.org/annotation/csarven.ca/%23i/87bc9a28-9f94-4b1b-a4b9-503899795f6e +- https://github.com/CycloneDX/specification/pull/180 + - Prototyping our ActivityPubsecuritytxt expansion pack + - https://mastodon.social/.well-known/webfinger?resource=acct:pdxjohnny@mastodon.social + - https://mastodon.social/@pdxjohnny/109773521704256215 + - Let's try piggybacking off one attachment, which is the activitypubsecuritytxt + - https://pdxjohnny.github.io/activitypubsecuritytxt/ + +```json +{ + "subject": "acct:pdxjohnny@mastodon.social", + "aliases": [ + "https://mastodon.social/@pdxjohnny", + "https://mastodon.social/users/pdxjohnny" + ], + "links": [ + { + "rel": "http://webfinger.net/rel/profile-page", + "type": "text/html", + "href": "https://mastodon.social/@pdxjohnny" + }, + { + "rel": "self", + "type": "application/activity+json", + "href": "https://mastodon.social/users/pdxjohnny" + }, + { + "rel": "http://ostatus.org/schema/1.0/subscribe", + "template": "https://mastodon.social/authorize_interaction?uri={uri}" + } + ] +} +``` + +- Just FYI, have been playing with the idea of using the security.txt contact as an ActivityPub Actor to advertise things such as delegate Actors for various purposes. For example, list via attachments actors which publish content addresses of an org's SBOMs. This would enable leveraging ActivityPub as a means for definition and broadcast for entities delegated to various roles. We could do the same for 3rd parties to advertise which actors are within which roles, aka are authorized to say this thing is FIPS certified. 
We could then attach SCITT receipts to these: https://github.com/intel/dffml/discussions/1406?sort=new#discussioncomment-4794771 + - The SCITT registry then becomes the quick lookup path (analogous to a database view) to verify this. This way end users don't have to traverse the full Knowledge Graph (ActivityPub in this case). The receipt we care about for verification would be: is this `inReplyTo` DAG hop path valid, i.e. is the `did:merkle` in SCITT? + - Can have a thread linked in attachments for manifests, can discover from there + - Can watch for replies and execute jobs based off listening for manifest instances `inReplyTo` the manifest. + - Post content addresses of manifests existing in oras.land (a container "image" registry) + - `FROM scratch` + - [Alice Engineering Comms: 2023-01-19 @pdxjohnny Engineering Logs](https://github.com/intel/dffml/discussions/1406?sort=new#discussioncomment-4729296) + - Do we even need ActivityPub for this beyond discovery? + - Can we just use linked data? + - We probably need a bridge between the two, so that we can enable the human way of interacting + - Just finish what you started first! 
+ - https://github.com/WebOfTrustInfo/rwot11-the-hague/blob/master/advance-readings/Enhancing_DID_Privacy_through_shared_Credentials.md + - https://github.com/WebOfTrustInfo/rwot11-the-hague/blob/master/draft-documents/did-merkle.md +- Looks like we can have four attachments, we can make one link to a post as an attachment, then replies to that to build more trees of data +- https://policymaker.disclose.io/policymaker/introduction + + +```json +{ + "@context": [ + "https://www.w3.org/ns/activitystreams", + "https://w3id.org/security/v1", + { + "manuallyApprovesFollowers": "as:manuallyApprovesFollowers", + "toot": "http://joinmastodon.org/ns#", + "featured": { + "@id": "toot:featured", + "@type": "@id" + }, + "featuredTags": { + "@id": "toot:featuredTags", + "@type": "@id" + }, + "alsoKnownAs": { + "@id": "as:alsoKnownAs", + "@type": "@id" + }, + "movedTo": { + "@id": "as:movedTo", + "@type": "@id" + }, + "schema": "http://schema.org#", + "PropertyValue": "schema:PropertyValue", + "value": "schema:value", + "discoverable": "toot:discoverable", + "Device": "toot:Device", + "Ed25519Signature": "toot:Ed25519Signature", + "Ed25519Key": "toot:Ed25519Key", + "Curve25519Key": "toot:Curve25519Key", + "EncryptedMessage": "toot:EncryptedMessage", + "publicKeyBase64": "toot:publicKeyBase64", + "deviceId": "toot:deviceId", + "claim": { + "@type": "@id", + "@id": "toot:claim" + }, + "fingerprintKey": { + "@type": "@id", + "@id": "toot:fingerprintKey" + }, + "identityKey": { + "@type": "@id", + "@id": "toot:identityKey" + }, + "devices": { + "@type": "@id", + "@id": "toot:devices" + }, + "messageFranking": "toot:messageFranking", + "messageType": "toot:messageType", + "cipherText": "toot:cipherText", + "suspended": "toot:suspended", + "Hashtag": "as:Hashtag", + "focalPoint": { + "@container": "@list", + "@id": "toot:focalPoint" + } + } + ], + "id": "https://mastodon.social/users/pdxjohnny", + "type": "Person", + "following": "https://mastodon.social/users/pdxjohnny/following", + 
"followers": "https://mastodon.social/users/pdxjohnny/followers", + "inbox": "https://mastodon.social/users/pdxjohnny/inbox", + "outbox": "https://mastodon.social/users/pdxjohnny/outbox", + "featured": "https://mastodon.social/users/pdxjohnny/collections/featured", + "featuredTags": "https://mastodon.social/users/pdxjohnny/collections/tags", + "preferredUsername": "pdxjohnny", + "name": "John", + "summary": "
<p>Playing with words.</p><p>Bits and bytes of lossy streams of consciousness found here.</p><p>Humanity MUST (RFC 2119) work together!</p><p>Priority: \ud83d\uddfa\ufe0f Acceleration of happiness metric \ud83c\udde9\ud83c\uddf0\ud83d\ude01</p><p>This account != owners employer\u2019s views.</p><p>#\u02bbIMILOA #ChaoticGood</p><p>All content here from John is released into the public domain (CC0 1.0). Quote or don\u2019t quote it\u2019s not like there aren\u2019t \u267e\ufe0f Johns anyway.</p><p>Nihilist turned John 1:23 + \u2653\ufe0f</p><p>The Spirit is willing but the brain is in burnout.</p><p>Results of being CI guy</p>
", + "url": "https://mastodon.social/@pdxjohnny", + "manuallyApprovesFollowers": false, + "discoverable": true, + "published": "2017-04-03T00:00:00Z", + "devices": "https://mastodon.social/users/pdxjohnny/collections/devices", + "publicKey": { + "id": "https://mastodon.social/users/pdxjohnny#main-key", + "owner": "https://mastodon.social/users/pdxjohnny", + "publicKeyPem": "-----BEGIN PUBLIC KEY-----\nMIIBIjANBgkqhkiG9w0BAQEFAAOCAQ8AMIIBCgKCAQEAmgrgfu3yUWnCUisG7VSo\nhXXjGHjEPtW0HsdOQ/lUTflLQvBANcVAmgqNR1CxsfmlLJLy3OtLXMFUgbps+2tq\nuf1PuLvDuTVUM69NH+p/6P8GSAvpUc1Ubs/VmOyAd0EVWjh0wgT5sqAEt2wo/s1K\ndoV5j24qeWEkCaKZsvooDkq2yDOzXG2+eyq2964Wstw3zZXh7YflR6JPrTDrR2t2\nPOCBIJR2wkbtIX57TcHORziLu3kCwx7YsTboSMvp4bU0P+/2X2AgzVQRUIKcF38D\nLYG6TIe2nULu4WX1rk8kXzKyyQtiNoxFVJxgh5RB42HwCT+ikvhA8Nmv7BvJ+qNh\n5wIDAQAB\n-----END PUBLIC KEY-----\n" + }, + "tag": [ + { + "type": "Hashtag", + "href": "https://mastodon.social/tags/chaoticgood", + "name": "#chaoticgood" + }, + { + "type": "Hashtag", + "href": "https://mastodon.social/tags/%CA%BBimiloa", + "name": "#\u02bbimiloa" + } + ], + "attachment": [ + { + "type": "PropertyValue", + "name": "activitypubsecuritytxt", + "value": "https://mastodon.social/users/pdxjohnny/statuses/109323329037637680" + } + ], + "endpoints": { + "sharedInbox": "https://mastodon.social/inbox" + }, + "icon": { + "type": "Image", + "mediaType": "image/jpeg", + "url": "https://files.mastodon.social/accounts/avatars/000/032/591/original/39cca57b3d892045.jpeg" + }, + "image": { + "type": "Image", + "mediaType": "image/jpeg", + "url": "https://files.mastodon.social/accounts/headers/000/032/591/original/165f3a3436816990.jpeg" + } +} +``` + +```console +$ curl -sfL -H "Accept: application/activity+json" "https://mastodon.social/users/pdxjohnny/statuses/109323329037637680" | python3 -m json.tool +``` + +```json +{ + "@context": [ + "https://www.w3.org/ns/activitystreams", + { + "ostatus": "http://ostatus.org#", + "atomUri": "ostatus:atomUri", + "inReplyToAtomUri": 
"ostatus:inReplyToAtomUri", + "conversation": "ostatus:conversation", + "sensitive": "as:sensitive", + "toot": "http://joinmastodon.org/ns#", + "votersCount": "toot:votersCount", + "Hashtag": "as:Hashtag" + } + ], + "id": "https://mastodon.social/users/pdxjohnny/statuses/109323329037637680", + "type": "Note", + "summary": null, + "inReplyTo": null, + "published": "2022-11-11T04:40:17Z", + "url": "https://mastodon.social/@pdxjohnny/109323329037637680", + "attributedTo": "https://mastodon.social/users/pdxjohnny", + "to": [ + "https://www.w3.org/ns/activitystreams#Public" + ], + "cc": [ + "https://mastodon.social/users/pdxjohnny/followers" + ], + "sensitive": false, + "atomUri": "https://mastodon.social/users/pdxjohnny/statuses/109323329037637680", + "inReplyToAtomUri": null, + "conversation": "tag:mastodon.social,2022-11-11:objectId=329671901:objectType=Conversation", + "content": "
<p>I\u2019m John. I\u2019ve fallen down the open source supply chain security rabbit hole. #introduction My current focus is around leveraging threat model and architecture information to facilitate automated context aware decentralized gamification / continuous improvement of the security lifecycle / posture of open source projects.</p><p>- https://gist.github.com/pdxjohnny/07b8c7b4a9e05579921aa3cc8aed4866<br />- https://github.com/intel/dffml/blob/alice/docs/tutorials/rolling_alice/0000_architecting_alice/README.md#rolling-alice-volume-0-introduction-and-context</p><p>https://mastodon.social/@pdxjohnny/109320563491316354</p>
", + "contentMap": { + "en": "
<p>I\u2019m John. I\u2019ve fallen down the open source supply chain security rabbit hole. #introduction My current focus is around leveraging threat model and architecture information to facilitate automated context aware decentralized gamification / continuous improvement of the security lifecycle / posture of open source projects.</p><p>- https://gist.github.com/pdxjohnny/07b8c7b4a9e05579921aa3cc8aed4866<br />- https://github.com/intel/dffml/blob/alice/docs/tutorials/rolling_alice/0000_architecting_alice/README.md#rolling-alice-volume-0-introduction-and-context</p><p>https://mastodon.social/@pdxjohnny/109320563491316354</p>
" + }, + "updated": "2022-11-11T04:42:27Z", + "attachment": [], + "tag": [ + { + "type": "Hashtag", + "href": "https://mastodon.social/tags/introduction", + "name": "#introduction" + } + ], + "replies": { + "id": "https://mastodon.social/users/pdxjohnny/statuses/109323329037637680/replies", + "type": "Collection", + "first": { + "type": "CollectionPage", + "next": "https://mastodon.social/users/pdxjohnny/statuses/109323329037637680/replies?min_id=109323386666400103&page=true", + "partOf": "https://mastodon.social/users/pdxjohnny/statuses/109323329037637680/replies", + "items": [ + "https://mastodon.social/users/pdxjohnny/statuses/109323386666400103" + ] + } + } +} +``` + +- https://wyman.us/public/unofficial-did-method-tag.html#resolving-a-tag-did-via-email + +```console +$ rm -f db/database.sqlite3; PROTO=http HOSTNAME=localhost WEBHOOK_PATH=$(cat ../webhook) NODE_ENV=production PORT=8000 ACCOUNT=alice ADMIN_USERNAME=alice ADMIN_PASSWORD=$(cat ../password) PUBLIC_KEY=$(cat publickey.crt) PRIVATE_KEY=$(cat pkcs8.key) npm run dev +$ gh webhook forward --repo=intel/dffml --events=push --url=http://localhost:8000/webhook/$(cat ../webhook) +$ curl -s http://localhost:8000/alice/outbox | python -m json.tool | python -c 'import yaml, json, sys; print(yaml.dump(json.load(sys.stdin)))' +``` + +```json +{ + "@context": "https://www.w3.org/ns/activitystreams", + "id": "http://localhost:8000/alice/outbox", + "type": "OrderedCollection", + "totalItems": 1, + "orderedItems": [ + { + "@context": "https://www.w3.org/ns/activitystreams", + "type": "Create", + "published": "2023-01-30T22:16:14.000Z", + "actor": "http://localhost:8000/alice", + "to": [ + "https://www.w3.org/ns/activitystreams#Public" + ], + "cc": [], + "object": { + "attributedTo": "http://localhost:8000/alice", + "published": "2023-01-30T22:16:14.151Z", + "to": [ + "https://www.w3.org/ns/activitystreams#Public" + ], + "cc": [ + "http://localhost:8000/alice/followers" + ], + "type": "Note", + "content": 
"{\"ref\":\"refs/heads/alice\",\"before\":\"8e02319e28b2f59c806e7f2a7b5ad202f51a2589\",\"after\":\"d77e2f697d806f71ab7dcf64a74cadfe5eb79598\",\"repository\":{\"id\":149512216,\"node_id\":\"MDEwOlJlcG9zaXRvcnkxNDk1MTIyMTY=\",\"name\":\"dffml\",\"full_name\":\"intel/dffml\",\"private\":false,\"owner\":{\"name\":\"intel\",\"email\":\"webadmin@linux.intel.com\",\"login\":\"intel\",\"id\":17888862,\"node_id\":\"MDEyOk9yZ2FuaXphdGlvbjE3ODg4ODYy\",\"avatar_url\":\"https://avatars.githubusercontent.com/u/17888862?v=4\",\"gravatar_id\":\"\",\"url\":\"https://api.github.com/users/intel\",\"html_url\":\"https://github.com/intel\",\"followers_url\":\"https://api.github.com/users/intel/followers\",\"following_url\":\"https://api.github.com/users/intel/following{/other_user}\",\"gists_url\":\"https://api.github.com/users/intel/gists{/gist_id}\",\"starred_url\":\"https://api.github.com/users/intel/starred{/owner}{/repo}\",\"subscriptions_url\":\"https://api.github.com/users/intel/subscriptions\",\"organizations_url\":\"https://api.github.com/users/intel/orgs\",\"repos_url\":\"https://api.github.com/users/intel/repos\",\"events_url\":\"https://api.github.com/users/intel/events{/privacy}\",\"received_events_url\":\"https://api.github.com/users/intel/received_events\",\"type\":\"Organization\",\"site_admin\":false},\"html_url\":\"https://github.com/intel/dffml\",\"description\":\"The easiest way to use Machine Learning. Mix and match underlying ML libraries and data set sources. 
Generate new datasets or modify existing ones with ease.\",\"fork\":false,\"url\":\"https://github.com/intel/dffml\",\"forks_url\":\"https://api.github.com/repos/intel/dffml/forks\",\"keys_url\":\"https://api.github.com/repos/intel/dffml/keys{/key_id}\",\"collaborators_url\":\"https://api.github.com/repos/intel/dffml/collaborators{/collaborator}\",\"teams_url\":\"https://api.github.com/repos/intel/dffml/teams\",\"hooks_url\":\"https://api.github.com/repos/intel/dffml/hooks\",\"issue_events_url\":\"https://api.github.com/repos/intel/dffml/issues/events{/number}\",\"events_url\":\"https://api.github.com/repos/intel/dffml/events\",\"assignees_url\":\"https://api.github.com/repos/intel/dffml/assignees{/user}\",\"branches_url\":\"https://api.github.com/repos/intel/dffml/branches{/branch}\",\"tags_url\":\"https://api.github.com/repos/intel/dffml/tags\",\"blobs_url\":\"https://api.github.com/repos/intel/dffml/git/blobs{/sha}\",\"git_tags_url\":\"https://api.github.com/repos/intel/dffml/git/tags{/sha}\",\"git_refs_url\":\"https://api.github.com/repos/intel/dffml/git/refs{/sha}\",\"trees_url\":\"https://api.github.com/repos/intel/dffml/git/trees{/sha}\",\"statuses_url\":\"https://api.github.com/repos/intel/dffml/statuses/{sha}\",\"languages_url\":\"https://api.github.com/repos/intel/dffml/languages\",\"stargazers_url\":\"https://api.github.com/repos/intel/dffml/stargazers\",\"contributors_url\":\"https://api.github.com/repos/intel/dffml/contributors\",\"subscribers_url\":\"https://api.github.com/repos/intel/dffml/subscribers\",\"subscription_url\":\"https://api.github.com/repos/intel/dffml/subscription\",\"commits_url\":\"https://api.github.com/repos/intel/dffml/commits{/sha}\",\"git_commits_url\":\"https://api.github.com/repos/intel/dffml/git/commits{/sha}\",\"comments_url\":\"https://api.github.com/repos/intel/dffml/comments{/number}\",\"issue_comment_url\":\"https://api.github.com/repos/intel/dffml/issues/comments{/number}\",\"contents_url\":\"https://api.github.com/repos
/intel/dffml/contents/{+path}\",\"compare_url\":\"https://api.github.com/repos/intel/dffml/compare/{base}...{head}\",\"merges_url\":\"https://api.github.com/repos/intel/dffml/merges\",\"archive_url\":\"https://api.github.com/repos/intel/dffml/{archive_format}{/ref}\",\"downloads_url\":\"https://api.github.com/repos/intel/dffml/downloads\",\"issues_url\":\"https://api.github.com/repos/intel/dffml/issues{/number}\",\"pulls_url\":\"https://api.github.com/repos/intel/dffml/pulls{/number}\",\"milestones_url\":\"https://api.github.com/repos/intel/dffml/milestones{/number}\",\"notifications_url\":\"https://api.github.com/repos/intel/dffml/notifications{?since,all,participating}\",\"labels_url\":\"https://api.github.com/repos/intel/dffml/labels{/name}\",\"releases_url\":\"https://api.github.com/repos/intel/dffml/releases{/id}\",\"deployments_url\":\"https://api.github.com/repos/intel/dffml/deployments\",\"created_at\":1537391194,\"updated_at\":\"2023-01-17T12:33:57Z\",\"pushed_at\":1675116972,\"git_url\":\"git://github.com/intel/dffml.git\",\"ssh_url\":\"git@github.com:intel/dffml.git\",\"clone_url\":\"https://github.com/intel/dffml.git\",\"svn_url\":\"https://github.com/intel/dffml\",\"homepage\":\"https://intel.github.io/dffml/main/\",\"size\":602687,\"stargazers_count\":201,\"watchers_count\":201,\"language\":\"Python\",\"has_issues\":true,\"has_projects\":true,\"has_downloads\":true,\"has_wiki\":true,\"has_pages\":true,\"has_discussions\":true,\"forks_count\":146,\"mirror_url\":null,\"archived\":false,\"disabled\":false,\"open_issues_count\":387,\"license\":{\"key\":\"mit\",\"name\":\"MIT 
License\",\"spdx_id\":\"MIT\",\"url\":\"https://api.github.com/licenses/mit\",\"node_id\":\"MDc6TGljZW5zZTEz\"},\"allow_forking\":true,\"is_template\":false,\"web_commit_signoff_required\":false,\"topics\":[\"ai-inference\",\"ai-machine-learning\",\"ai-training\",\"analytics\",\"asyncio\",\"dag\",\"data-flow\",\"dataflows\",\"datasets\",\"dffml\",\"event-based\",\"flow-based-programming\",\"frameworks\",\"hyperautomation\",\"libraries\",\"machine-learning\",\"models\",\"pipelines\",\"python\",\"swrepo\"],\"visibility\":\"public\",\"forks\":146,\"open_issues\":387,\"watchers\":201,\"default_branch\":\"main\",\"stargazers\":201,\"master_branch\":\"main\",\"organization\":\"intel\"},\"pusher\":{\"name\":\"pdxjohnny\",\"email\":\"johnandersenpdx@gmail.com\"},\"organization\":{\"login\":\"intel\",\"id\":17888862,\"node_id\":\"MDEyOk9yZ2FuaXphdGlvbjE3ODg4ODYy\",\"url\":\"https://api.github.com/orgs/intel\",\"repos_url\":\"https://api.github.com/orgs/intel/repos\",\"events_url\":\"https://api.github.com/orgs/intel/events\",\"hooks_url\":\"https://api.github.com/orgs/intel/hooks\",\"issues_url\":\"https://api.github.com/orgs/intel/issues\",\"members_url\":\"https://api.github.com/orgs/intel/members{/member}\",\"public_members_url\":\"https://api.github.com/orgs/intel/public_members{/member}\",\"avatar_url\":\"https://avatars.githubusercontent.com/u/17888862?v=4\",\"description\":\"\"},\"sender\":{\"login\":\"pdxjohnny\",\"id\":5950433,\"node_id\":\"MDQ6VXNlcjU5NTA0MzM=\",\"avatar_url\":\"https://avatars.githubusercontent.com/u/5950433?v=4\",\"gravatar_id\":\"\",\"url\":\"https://api.github.com/users/pdxjohnny\",\"html_url\":\"https://github.com/pdxjohnny\",\"followers_url\":\"https://api.github.com/users/pdxjohnny/followers\",\"following_url\":\"https://api.github.com/users/pdxjohnny/following{/other_user}\",\"gists_url\":\"https://api.github.com/users/pdxjohnny/gists{/gist_id}\",\"starred_url\":\"https://api.github.com/users/pdxjohnny/starred{/owner}{/repo}\",\"subscriptio
ns_url\":\"https://api.github.com/users/pdxjohnny/subscriptions\",\"organizations_url\":\"https://api.github.com/users/pdxjohnny/orgs\",\"repos_url\":\"https://api.github.com/users/pdxjohnny/repos\",\"events_url\":\"https://api.github.com/users/pdxjohnny/events{/privacy}\",\"received_events_url\":\"https://api.github.com/users/pdxjohnny/received_events\",\"type\":\"User\",\"site_admin\":false},\"created\":false,\"deleted\":false,\"forced\":false,\"base_ref\":null,\"compare\":\"https://github.com/intel/dffml/compare/8e02319e28b2...d77e2f697d80\",\"commits\":[{\"id\":\"d77e2f697d806f71ab7dcf64a74cadfe5eb79598\",\"tree_id\":\"e46341b7cac3e821d68a73bf199efec27625ffcd\",\"distinct\":true,\"message\":\"alice: please: log: todos: Disable overlay to grab created issue URLs which is not yet fully validated\",\"timestamp\":\"2023-01-30T14:16:12-08:00\",\"url\":\"https://github.com/intel/dffml/commit/d77e2f697d806f71ab7dcf64a74cadfe5eb79598\",\"author\":{\"name\":\"John Andersen\",\"email\":\"johnandersenpdx@gmail.com\",\"username\":\"pdxjohnny\"},\"committer\":{\"name\":\"GitHub\",\"email\":\"noreply@github.com\",\"username\":\"web-flow\"},\"added\":[],\"removed\":[],\"modified\":[\"entities/alice/entry_points.txt\"]}],\"head_commit\":{\"id\":\"d77e2f697d806f71ab7dcf64a74cadfe5eb79598\",\"tree_id\":\"e46341b7cac3e821d68a73bf199efec27625ffcd\",\"distinct\":true,\"message\":\"alice: please: log: todos: Disable overlay to grab created issue URLs which is not yet fully validated\",\"timestamp\":\"2023-01-30T14:16:12-08:00\",\"url\":\"https://github.com/intel/dffml/commit/d77e2f697d806f71ab7dcf64a74cadfe5eb79598\",\"author\":{\"name\":\"John Andersen\",\"email\":\"johnandersenpdx@gmail.com\",\"username\":\"pdxjohnny\"},\"committer\":{\"name\":\"GitHub\",\"email\":\"noreply@github.com\",\"username\":\"web-flow\"},\"added\":[],\"removed\":[],\"modified\":[\"entities/alice/entry_points.txt\"]}}", + "id": "http://localhost:8000/alice/posts/9a1d1dff-f25e-47a3-ac01-09e1f2e25ccd" + }, + 
"id": "http://localhost:8000/alice/posts/155bb1d0-e74b-4995-892a-aaa472e25b3f" + } + ] +} +``` + +- Try loading content + +```console +$ curl -s http://localhost:8000/alice/outbox | jq --unbuffered -r '.orderedItems[].object.content' | jq +``` + +```json +{ + "ref": "refs/heads/alice", + "before": "8e02319e28b2f59c806e7f2a7b5ad202f51a2589", + "after": "d77e2f697d806f71ab7dcf64a74cadfe5eb79598", + "repository": { + "id": 149512216, + "node_id": "MDEwOlJlcG9zaXRvcnkxNDk1MTIyMTY=", + "name": "dffml", + "full_name": "intel/dffml", + "private": false, + "owner": { + "name": "intel", + "email": "webadmin@linux.intel.com", + "login": "intel", + "id": 17888862, + "node_id": "MDEyOk9yZ2FuaXphdGlvbjE3ODg4ODYy", + "avatar_url": "https://avatars.githubusercontent.com/u/17888862?v=4", + "gravatar_id": "", + "url": "https://api.github.com/users/intel", + "html_url": "https://github.com/intel", + "followers_url": "https://api.github.com/users/intel/followers", + "following_url": "https://api.github.com/users/intel/following{/other_user}", + "gists_url": "https://api.github.com/users/intel/gists{/gist_id}", + "starred_url": "https://api.github.com/users/intel/starred{/owner}{/repo}", + "subscriptions_url": "https://api.github.com/users/intel/subscriptions", + "organizations_url": "https://api.github.com/users/intel/orgs", + "repos_url": "https://api.github.com/users/intel/repos", + "events_url": "https://api.github.com/users/intel/events{/privacy}", + "received_events_url": "https://api.github.com/users/intel/received_events", + "type": "Organization", + "site_admin": false + }, + "html_url": "https://github.com/intel/dffml", + "description": "The easiest way to use Machine Learning. Mix and match underlying ML libraries and data set sources. 
Generate new datasets or modify existing ones with ease.", + "fork": false, + "url": "https://github.com/intel/dffml", + "forks_url": "https://api.github.com/repos/intel/dffml/forks", + "keys_url": "https://api.github.com/repos/intel/dffml/keys{/key_id}", + "collaborators_url": "https://api.github.com/repos/intel/dffml/collaborators{/collaborator}", + "teams_url": "https://api.github.com/repos/intel/dffml/teams", + "hooks_url": "https://api.github.com/repos/intel/dffml/hooks", + "issue_events_url": "https://api.github.com/repos/intel/dffml/issues/events{/number}", + "events_url": "https://api.github.com/repos/intel/dffml/events", + "assignees_url": "https://api.github.com/repos/intel/dffml/assignees{/user}", + "branches_url": "https://api.github.com/repos/intel/dffml/branches{/branch}", + "tags_url": "https://api.github.com/repos/intel/dffml/tags", + "blobs_url": "https://api.github.com/repos/intel/dffml/git/blobs{/sha}", + "git_tags_url": "https://api.github.com/repos/intel/dffml/git/tags{/sha}", + "git_refs_url": "https://api.github.com/repos/intel/dffml/git/refs{/sha}", + "trees_url": "https://api.github.com/repos/intel/dffml/git/trees{/sha}", + "statuses_url": "https://api.github.com/repos/intel/dffml/statuses/{sha}", + "languages_url": "https://api.github.com/repos/intel/dffml/languages", + "stargazers_url": "https://api.github.com/repos/intel/dffml/stargazers", + "contributors_url": "https://api.github.com/repos/intel/dffml/contributors", + "subscribers_url": "https://api.github.com/repos/intel/dffml/subscribers", + "subscription_url": "https://api.github.com/repos/intel/dffml/subscription", + "commits_url": "https://api.github.com/repos/intel/dffml/commits{/sha}", + "git_commits_url": "https://api.github.com/repos/intel/dffml/git/commits{/sha}", + "comments_url": "https://api.github.com/repos/intel/dffml/comments{/number}", + "issue_comment_url": "https://api.github.com/repos/intel/dffml/issues/comments{/number}", + "contents_url": 
"https://api.github.com/repos/intel/dffml/contents/{+path}", + "compare_url": "https://api.github.com/repos/intel/dffml/compare/{base}...{head}", + "merges_url": "https://api.github.com/repos/intel/dffml/merges", + "archive_url": "https://api.github.com/repos/intel/dffml/{archive_format}{/ref}", + "downloads_url": "https://api.github.com/repos/intel/dffml/downloads", + "issues_url": "https://api.github.com/repos/intel/dffml/issues{/number}", + "pulls_url": "https://api.github.com/repos/intel/dffml/pulls{/number}", + "milestones_url": "https://api.github.com/repos/intel/dffml/milestones{/number}", + "notifications_url": "https://api.github.com/repos/intel/dffml/notifications{?since,all,participating}", + "labels_url": "https://api.github.com/repos/intel/dffml/labels{/name}", + "releases_url": "https://api.github.com/repos/intel/dffml/releases{/id}", + "deployments_url": "https://api.github.com/repos/intel/dffml/deployments", + "created_at": 1537391194, + "updated_at": "2023-01-17T12:33:57Z", + "pushed_at": 1675116972, + "git_url": "git://github.com/intel/dffml.git", + "ssh_url": "git@github.com:intel/dffml.git", + "clone_url": "https://github.com/intel/dffml.git", + "svn_url": "https://github.com/intel/dffml", + "homepage": "https://intel.github.io/dffml/main/", + "size": 602687, + "stargazers_count": 201, + "watchers_count": 201, + "language": "Python", + "has_issues": true, + "has_projects": true, + "has_downloads": true, + "has_wiki": true, + "has_pages": true, + "has_discussions": true, + "forks_count": 146, + "mirror_url": null, + "archived": false, + "disabled": false, + "open_issues_count": 387, + "license": { + "key": "mit", + "name": "MIT License", + "spdx_id": "MIT", + "url": "https://api.github.com/licenses/mit", + "node_id": "MDc6TGljZW5zZTEz" + }, + "allow_forking": true, + "is_template": false, + "web_commit_signoff_required": false, + "topics": [ + "ai-inference", + "ai-machine-learning", + "ai-training", + "analytics", + "asyncio", + "dag", + 
"data-flow", + "dataflows", + "datasets", + "dffml", + "event-based", + "flow-based-programming", + "frameworks", + "hyperautomation", + "libraries", + "machine-learning", + "models", + "pipelines", + "python", + "swrepo" + ], + "visibility": "public", + "forks": 146, + "open_issues": 387, + "watchers": 201, + "default_branch": "main", + "stargazers": 201, + "master_branch": "main", + "organization": "intel" + }, + "pusher": { + "name": "pdxjohnny", + "email": "johnandersenpdx@gmail.com" + }, + "organization": { + "login": "intel", + "id": 17888862, + "node_id": "MDEyOk9yZ2FuaXphdGlvbjE3ODg4ODYy", + "url": "https://api.github.com/orgs/intel", + "repos_url": "https://api.github.com/orgs/intel/repos", + "events_url": "https://api.github.com/orgs/intel/events", + "hooks_url": "https://api.github.com/orgs/intel/hooks", + "issues_url": "https://api.github.com/orgs/intel/issues", + "members_url": "https://api.github.com/orgs/intel/members{/member}", + "public_members_url": "https://api.github.com/orgs/intel/public_members{/member}", + "avatar_url": "https://avatars.githubusercontent.com/u/17888862?v=4", + "description": "" + }, + "sender": { + "login": "pdxjohnny", + "id": 5950433, + "node_id": "MDQ6VXNlcjU5NTA0MzM=", + "avatar_url": "https://avatars.githubusercontent.com/u/5950433?v=4", + "gravatar_id": "", + "url": "https://api.github.com/users/pdxjohnny", + "html_url": "https://github.com/pdxjohnny", + "followers_url": "https://api.github.com/users/pdxjohnny/followers", + "following_url": "https://api.github.com/users/pdxjohnny/following{/other_user}", + "gists_url": "https://api.github.com/users/pdxjohnny/gists{/gist_id}", + "starred_url": "https://api.github.com/users/pdxjohnny/starred{/owner}{/repo}", + "subscriptions_url": "https://api.github.com/users/pdxjohnny/subscriptions", + "organizations_url": "https://api.github.com/users/pdxjohnny/orgs", + "repos_url": "https://api.github.com/users/pdxjohnny/repos", + "events_url": 
"https://api.github.com/users/pdxjohnny/events{/privacy}", + "received_events_url": "https://api.github.com/users/pdxjohnny/received_events", + "type": "User", + "site_admin": false + }, + "created": false, + "deleted": false, + "forced": false, + "base_ref": null, + "compare": "https://github.com/intel/dffml/compare/8e02319e28b2...d77e2f697d80", + "commits": [ + { + "id": "d77e2f697d806f71ab7dcf64a74cadfe5eb79598", + "tree_id": "e46341b7cac3e821d68a73bf199efec27625ffcd", + "distinct": true, + "message": "alice: please: log: todos: Disable overlay to grab created issue URLs which is not yet fully validated", + "timestamp": "2023-01-30T14:16:12-08:00", + "url": "https://github.com/intel/dffml/commit/d77e2f697d806f71ab7dcf64a74cadfe5eb79598", + "author": { + "name": "John Andersen", + "email": "johnandersenpdx@gmail.com", + "username": "pdxjohnny" + }, + "committer": { + "name": "GitHub", + "email": "noreply@github.com", + "username": "web-flow" + }, + "added": [], + "removed": [], + "modified": [ + "entities/alice/entry_points.txt" + ] + } + ], + "head_commit": { + "id": "d77e2f697d806f71ab7dcf64a74cadfe5eb79598", + "tree_id": "e46341b7cac3e821d68a73bf199efec27625ffcd", + "distinct": true, + "message": "alice: please: log: todos: Disable overlay to grab created issue URLs which is not yet fully validated", + "timestamp": "2023-01-30T14:16:12-08:00", + "url": "https://github.com/intel/dffml/commit/d77e2f697d806f71ab7dcf64a74cadfe5eb79598", + "author": { + "name": "John Andersen", + "email": "johnandersenpdx@gmail.com", + "username": "pdxjohnny" + }, + "committer": { + "name": "GitHub", + "email": "noreply@github.com", + "username": "web-flow" + }, + "added": [], + "removed": [], + "modified": [ + "entities/alice/entry_points.txt" + ] + } +} +``` + +- Okay, we now have the basis for federated downstream validation +- TODO + - [ ] GitOps allowlist with priority for not AcivityPub fail-to-ban style + - [ ] Watchers which just add to knowledge graph \ No newline at end 
of file diff --git a/docs/discussions/alice_engineering_comms/0163/reply_0001.md b/docs/discussions/alice_engineering_comms/0163/reply_0001.md new file mode 100644 index 0000000000..611b3f28c7 --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0163/reply_0001.md @@ -0,0 +1,7 @@ +## 2023-01-30 IETF SCITT + +- Ned's here +- Consumer of released software wanting to understand how to set up their trust chains + - Roy: Executive order requires that we be able to do self endorsement *and* third party +- Claim which binds supply to signing authority +- Talked about 3rd party trust attestation roles; this is what we are prototyping with ActivityPub \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0164/index.md b/docs/discussions/alice_engineering_comms/0164/index.md new file mode 100644 index 0000000000..b3f0e93040 --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0164/index.md @@ -0,0 +1,5 @@ +# 2023-01-31 Engineering Logs + +- https://www.chainguard.dev/unchained/accelerate-vex-adoption-through-openvex + - /acc/ 🛤️ +- https://datasette.io/plugins/datasette-dashboards#user-content-usage \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0164/reply_0000.md b/docs/discussions/alice_engineering_comms/0164/reply_0000.md new file mode 100644 index 0000000000..22af5aedb5 --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0164/reply_0000.md @@ -0,0 +1,44 @@ +## 2023-01-31 @pdxjohnny Engineering Logs + +- Release of OpenVEX! Chaos smiles on us again :) + - https://mastodon.social/@ariadne@treehouse.systems/109784681116604896 + - > meanwhile at work, a thing i've been working on for the past few months has dropped: https://www.chainguard.dev/unchained/accelerate-vex-adoption-through-openvex it's basically like ActivityStreams, but for security vulnerability data sharing.
with a little bit of work, we can lift up to something more like ActivityPub for real-time collaboration, a blog is forthcoming about it. + - https://github.com/openvex/spec/blob/main/ATTESTING.md#digital-signatures + - https://github.com/pdxjohnny/activitypubsecuritytxt/commit/9a68cb0b752126046157b047cb72563228c078de + - https://github.com/pdxjohnny/activitypubsecuritytxt/commit/1e35f549a33347918335e89200055841b267e86c + - https://github.com/openvex/spec/blob/main/OPENVEX-SPEC.md#openvex-and-json-ld + +![chaos_for_the_chaos_God](https://user-images.githubusercontent.com/5950433/215828966-0f91a8fe-0809-4523-9202-b09fd5f635d9.jpg) + +- https://github.com/fuzhibo/jekyll-mermaid-diagrams/blob/b5e0c37486dec1c840d6e8a47c92a754af3cfd72/lib/jekyll-mermaid-diagrams.rb#L14-L15 +- https://hachyderm.io/@holly_cummins/109636163544669034 + - > TIL there's a technical name for why ideas happen in the shower: the "default mode network" is a pattern of brain activity, measurable using fMRI, that happens when we're unfocussed. When the brain goes into idle mode (reduced activity), this part of the brain actually becomes *more* active. What does the default mode network do? Research is ongoing, but part of it definitely seems to be making connections, which is associated with *curiosity and creativity*. More here: [https://www.nationalgeographic.co.uk/histo](https://www.nationalgeographic.co.uk/history-and-civilisation/2022/08/the-science-of-why-you-have-great-ideas-in-the-shower) + - grep the system requires exercise + - Chaos metric +- A wild manifest appears!
+ - https://github.com/openvex/vexctl#3-vexing-a-results-set + - https://github.com/intel/dffml/blob/alice/docs/arch/0008-Manifest.md + +![image](https://user-images.githubusercontent.com/5950433/215843365-9a03f49f-2607-4e48-acd0-21269814427d.png) + +- https://github.com/microsoft/GODEL + - http://ndjson.org/ + - `--train_file` + - https://gist.github.com/pdxjohnny/016f8d9edcb65f62c3fbe4b019299ef7 + - https://colab.research.google.com/gist/pdxjohnny/09a125f58151b5099cbff02b27a80abb/finetunegodel.ipynb + - https://til.simonwillison.net/python/gtr-t5-large + - https://ipython.readthedocs.io/en/stable/interactive/magics.html + - https://ipython.readthedocs.io/en/stable/interactive/magics.html#cell-magics + - https://github.com/ipython/ipython/issues/13376 +- https://slsa.dev/spec/v0.1/levels +- https://global-power-plants.datasettes.com/global-power-plants/global-power-plants?owner=PacifiCorp + - Inventory + - #1207 + - https://lite.datasette.io/ + - https://docs.datasette.io/en/stable/getting_started.html#using-datasette-on-your-own-computer + - sqlite to endpoint + - Could maybe do linked data? + - Could we go from CVE Bin Tool database (`--nolock`) to OpenVEX via a plugin for datasette? + - Could we loop against the db with nolock to publish events during scan from separate process? Would have to do db writes more often?
+ - https://www.sqlite.org/wal.html + - https://datasette.io/plugins/datasette-scraper#user-content-usage-notes \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0165/index.md b/docs/discussions/alice_engineering_comms/0165/index.md new file mode 100644 index 0000000000..3ffbd911a1 --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0165/index.md @@ -0,0 +1 @@ +# 2023-02-01 Engineering Logs \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0165/reply_0000.md b/docs/discussions/alice_engineering_comms/0165/reply_0000.md new file mode 100644 index 0000000000..7b117d8c63 --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0165/reply_0000.md @@ -0,0 +1,31 @@ +## 2023-02-01 @pdxjohnny Engineering Logs + +- https://youtu.be/AFmCv8cfkm0 + - DIDComm explained + - First principles + - Had heard this before offhand sounds somehow relevant +- https://github.com/Deuchnord/f2ap +- https://fleuret.org/cgi-bin/gitweb/gitweb.cgi?p=mygpt.git;a=blob;f=picoclvr.py;h=fb791fefd76b8fcec8613a71415fd762add3990f;hb=199f3195388af8be1f3e50dec343964f73fc0e6d + - Public domain GPT + - bridge to ActivityPub stream for vcs.push? +- https://fleuret.org/cgi-bin/gitweb/gitweb.cgi?p=mygpt.git;a=tree;h=f2fb5261309941f1f017b6d7612ff01843300de6;hb=199f3195388af8be1f3e50dec343964f73fc0e6d +- https://fleuret.org/cgi-bin/gitweb/gitweb.cgi?p=pytorch.git +- https://fleuret.org/cgi-bin/gitweb/gitweb.cgi?p=pytorch.git;a=blob_plain;f=minidiffusion.py;hb=HEAD + - Public domain implementation of stable diffusion + - ref: autoencoder? 
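The "bridge to ActivityPub stream for vcs.push?" idea above can be sketched minimally. This is only an illustration of shaping a push event as an ActivityStreams `Create` activity; the actor URL and the Note-with-JSON-content payload shape are assumptions for this sketch, not the starter kit's actual schema.

```python
import json
import uuid


def push_to_activity(push: dict, actor: str = "https://vcs.example.com/push") -> dict:
    """Wrap a git vcs.push webhook payload as an ActivityPub Create activity.

    The actor URL default here is a hypothetical placeholder; a real bridge
    would use whatever actor the receiving server federates with.
    """
    return {
        "@context": "https://www.w3.org/ns/activitystreams",
        "id": f"{actor}/activities/{uuid.uuid4()}",
        "type": "Create",
        "actor": actor,
        "object": {
            # Carry the raw webhook JSON as the Note content, mirroring how
            # the outbox dumps above embed the payload as a string.
            "type": "Note",
            "content": json.dumps(push),
        },
    }


activity = push_to_activity({"ref": "refs/heads/alice", "after": "d77e2f6"})
print(activity["type"])
```

The receiving side can then `json.loads()` the Note content to recover the original `vcs.push` payload, as in the `jq` pipelines above.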
+- https://simonwillison.net/2020/Nov/28/datasette-ripgrep/ +- https://json.blinry.org/#https://prophecy.chadig.com/alice/outbox +- activitypubsecuritytxt aka Manifest Transport ADR aka for `Rolling Alice: Architecting Alice: Transport Acquisition` +- Manifest ADRs (README + schema) allow us to do English language similarity on intent descriptions + - Can do inference from codebase to Manifest ADRs, then similarity analysis +- https://github.com/pdxjohnny/autoentrypoint + - `README.rst` + +```console +$ sphinx-quickstart --no-sep --no-makefile --no-batchfile \ + --language english -v 0.0.0 --release 0.0.0 \ + --project "My Project" --author "First Last" docs/ +``` + +- TODO + - [ ] Finish part 2 of web 7 overview https://youtu.be/1XnPWmpkGro \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0166/index.md b/docs/discussions/alice_engineering_comms/0166/index.md new file mode 100644 index 0000000000..4c0b651cdc --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0166/index.md @@ -0,0 +1 @@ +# 2023-02-02 Engineering Logs \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0166/reply_0000.md b/docs/discussions/alice_engineering_comms/0166/reply_0000.md new file mode 100644 index 0000000000..7e9c10f792 --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0166/reply_0000.md @@ -0,0 +1,31 @@ +## 2023-02-02 @pdxjohnny Engineering Logs + +- Don't worry about DIDs, you can put ActivityPub over DID methods instead of HTTP later +- https://cdk8s.io/ +- https://github.com/permitio/OPToggles +- https://github.com/chef/automate/tree/master/components/authz-service#authz-with-opa +- https://docs.github.com/en/rest/git?apiVersion=2022-11-28#about-git-database + - https://docs.github.com/en/rest/guides/using-the-rest-api-to-interact-with-your-git-database?apiVersion=2022-11-28 + - DID/ActivityPub/ATP analogy API? 
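The Manifest ADR intent-similarity idea a few notes up (inference from codebase to Manifest ADRs, then similarity analysis) can be sketched with a toy stand-in. `difflib` character ratios here are only a placeholder; a real pipeline would more likely use sentence embeddings. The ADR text and candidate names are made up for illustration.

```python
from difflib import SequenceMatcher


def intent_similarity(a: str, b: str) -> float:
    # Toy stand-in for English-language similarity between an ADR's intent
    # description and candidate workflow descriptions inferred from code.
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()


adr_intent = "Build and publish container images on push"
candidates = {
    "ci-build": "Build container images whenever a push event lands",
    "docs": "Render the documentation site",
}
best = max(candidates, key=lambda name: intent_similarity(adr_intent, candidates[name]))
print(best)
```

Ranking every ADR against every inferred description this way gives a rough matrix for deciding which manifest an existing pipeline most plausibly implements.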
+- https://wijmans.xyz/publication/eom/ + - Vol 2: Cartography +- https://github.com/asciinema/asciinema-server/wiki/Installation-guide + - Okay fuck yes, just closed the loop, then we bridge this to activitypub, stream of consciousness is a go + - Daniel didn't reply about using DWNs, and DID Comm agents look good as a next step there. +- ssh git push to deploy anything + - proxy does translation into take push as single commit of dir (could piggyback off pgp or cosign or other commit signing as well) + - +- open architecture + - context-to-context analysis + - Analysis based on `Input.origin`, ensure `Input` flow through operations ensures validation for + - workflow-to-workflow + - workflow-to-job + - job-to-job + - Artifacts + - job-to-action +- Investigate GitHub approved workflows per env + - https://docs.github.com/en/actions/managing-workflow-runs/reviewing-deployments#about-required-reviews-in-workflows +- https://git-scm.com/docs/git-filter-branch#_examples +- Similar to parse_ast.py Python ast example, export all groovy functions to `features` +- https://github.com/intel/dffml/pull/1061#discussion_r1095079133 + - We don't need to necessarily update status checks via API, can just have a pipeline within PR workflows which says this other PR must be merged in an upstream or downstream before this one can auto merge \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0166/reply_0001.md b/docs/discussions/alice_engineering_comms/0166/reply_0001.md new file mode 100644 index 0000000000..0884cda59d --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0166/reply_0001.md @@ -0,0 +1,58 @@ +## 2023-02-02 Exporting Groovy Functions + +- 1:1 Pankaj/John +- Update 2023-02-15: This became https://github.com/intel/dffml/commit/15c9c245add1fae5a0b1767ed77973d9dbdd4899 +- https://github.com/intel/dffml/blob/alice/entities/alice/CONTRIBUTING.rst#writing-an-overlay +- 
https://docs.groovy-lang.org/latest/html/api/org/apache/groovy/parser/antlr4/package-summary.html + - https://docs.groovy-lang.org/latest/html/api/org/apache/groovy/parser/antlr4/GroovyLangParser.html + - https://www.graalvm.org/ + - https://www.graalvm.org/latest/reference-manual/native-image/ +- First we create another operation which takes groovy files + - Define `NewType("GroovyFunction", str)` as output + - Remove `output=Stage.OUTPUT` + +https://github.com/intel/dffml/blob/d77e2f697d806f71ab7dcf64a74cadfe5eb79598/entities/alice/alice/shouldi/contribute/cicd.py#L26-L33 + +- Then we do the Groovy equivalent of returning a list of functions (seen here in python AST example) + +https://github.com/intel/dffml/blob/d77e2f697d806f71ab7dcf64a74cadfe5eb79598/examples/operations/python.py#L61-L66 + +```patch +diff --git a/entities/alice/alice/shouldi/contribute/cicd.py b/entities/alice/alice/shouldi/contribute/cicd.py +index 3237a1990..e682e3aeb 100644 +--- a/entities/alice/alice/shouldi/contribute/cicd.py ++++ b/entities/alice/alice/shouldi/contribute/cicd.py +@@ -33,6 +33,32 @@ def cicd_jenkins_library( + return bool(groovy_file_paths) + + ++GroovyFunction = NewType("GroovyFunction", str) ++ ++@dffml.op ++def groovy_functions( ++ groovy_file_paths: dffml_operations_innersource.operations.GroovyFileWorkflowUnixStylePath, ++) -> List[GroovyFunction]: ++ # TODO Probably need to require namespacing of functions somehow ++ # Might need to update the stdlib qualifications spec ++ """ ++ ++ ++ groovy_file_url = "https://github.com/apache/groovy/raw/74baecf4b3990f84003929c0c31ec150d5d305cf/src/test/groovy/transform/stc/DelegatesToSTCTest.groovy" ++ $ wget https://github.com/apache/groovy/raw/74baecf4b3990f84003929c0c31ec150d5d305cf/src/test/groovy/transform/stc/DelegatesToSTCTest.groovy ++ $ GROOVY_FILE=DelegatesToSTCTest.groovy python -um doctest path/to/this/file.py ++ ++ >>> import os ++ >>> groovy_functions(os.environ["GROOVY_FILE"]) ++ ["testShouldChooseMethodFromOwner",
"testShouldChooseMethodFromDelegate", ""] ++ TODO List rest of funtion names or choose samller file ++ """ ++ # Example: ++ # void testShouldChooseMethodFromOwner() { ++ # yield line if line.strip().endswith(") {") and not "=" in line. ++ return [] ++ ++ + @dffml.op( + stage=dffml.Stage.OUTPUT, + ) +``` \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0167/index.md b/docs/discussions/alice_engineering_comms/0167/index.md new file mode 100644 index 0000000000..0eb955784c --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0167/index.md @@ -0,0 +1 @@ +# 2023-02-03 Engineering Logs \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0167/reply_0000.md b/docs/discussions/alice_engineering_comms/0167/reply_0000.md new file mode 100644 index 0000000000..b6a4cc214d --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0167/reply_0000.md @@ -0,0 +1,594 @@ +## 2023-02-03 @pdxjohnny Engineering Logs + +- https://github.com/GoogleContainerTools/kaniko/issues/1836#issuecomment-1416451403 +- https://cdk8s.io/docs/latest/getting-started/#abstraction-through-constructs + - https://github.com/cdk8s-team/cdk8s/tree/master/examples/python/crd + - https://github.com/cdk8s-team/cdk8s/blob/master/examples/python/crd/cdk8s.yaml + - https://github.com/cdk8s-team/cdk8s/tree/master/examples/python/web-service + - https://github.com/cdk8s-team/cdk8s/tree/master/examples/python/hello + - https://cdk8s.io/docs/latest/examples/ +- https://til.simonwillison.net/webassembly/python-in-a-wasm-sandbox +- https://github.com/slsa-framework/slsa-github-generator/tree/main/internal/builders/container +- `alice please show me how to ...` + - Creates diff, `alice please contrbiute ...` creates pull requests (or ActivityPub analogies) +- https://github.blog/2023-02-02-enabling-branch-deployments-through-issueops-with-github-actions/ + - Chaos smiles on us again + - #1061 + - This is how we enable prospective 2nd party plugin 
maintainers to check for increase in support level (from 3rd to 2nd party) + - This also allows them to have Alice create automated pull requests which resolve issues for them to increase their support level + - #1239 + - Example reproduced below + +```yaml +name: "branch deploy demo" + +# The workflow will execute on new comments on pull requests - example: ".deploy" as a comment +on: + issue_comment: + types: [created] + +jobs: + demo: + if: ${{ github.event.issue.pull_request }} # only run on pull request comments (no need to run on issue comments) + runs-on: ubuntu-latest + steps: + # Execute IssueOps branch deployment logic, hooray! + # This will be used to "gate" all future steps below and conditionally trigger steps/deployments + - uses: github/branch-deploy@vX.X.X # replace X.X.X with the version you want to use + id: branch-deploy # it is critical you have an id here so you can reference the outputs of this step + with: + trigger: ".deploy" # the trigger phrase to look for in the comment on the pull request + + # Run your deployment logic for your project here - examples seen below + + # Checkout your project repository based on the ref provided by the branch-deploy step + - uses: actions/checkout@3.0.2 + if: ${{ steps.branch-deploy.outputs.continue == 'true' }} # skips if the trigger phrase is not found + with: + ref: ${{ steps.branch-deploy.outputs.ref }} # uses the detected branch from the branch-deploy step + + # Do some fake "noop" deployment logic here + # conditionally run a noop deployment + - name: fake noop deploy + if: ${{ steps.branch-deploy.outputs.continue == 'true' && steps.branch-deploy.outputs.noop == 'true' }} # only run if the trigger phrase is found and the branch-deploy step detected a noop deployment + run: echo "I am doing a fake noop deploy" + + # Do some fake "regular" deployment logic here + # conditionally run a regular deployment + - name: fake regular deploy + if: ${{ steps.branch-deploy.outputs.continue == 'true' && 
steps.branch-deploy.outputs.noop != 'true' }} # only run if the trigger phrase is found and the branch-deploy step detected a regular deployment + run: echo "I am doing a fake regular deploy" +``` + +![chaos_for_the_chaos_God](https://user-images.githubusercontent.com/5950433/216681621-b55b5c88-5fa3-4bde-802a-e7d569517eb7.jpg) + +- https://edu.chainguard.dev/open-source/sbom/what-is-openvex/ +- https://github.com/namshi/docker-smtp + - This is an SMTP server and SMTP relay server 🛤️ +- `ActivityPubStarterAdminInputNetwork` + - Websocket endpoint to receive new events + - POST `/admin/create` + - `Input.id` as activitypub URL, later backup to DID land +- https://about.sourcegraph.com/blog/building-conc-better-structured-concurrency-for-go +- https://www2023.thewebconf.org/calls/webdeveloper-w3c/ +- https://github.com/jart/blink +- Fast transform helper `@op` derivative decorators (we'd looked at this recently) + - Helps with remapping datatypes, could be used in input type declaration + - https://intel.github.io/dffml/main/api/operation/mapping.html?highlight=mapping_extract_value#dffml.operation.mapping.mapping_extract_value + - Alternative sketch: `@op.apply(mapping_extract_value, ... something else? ...)` + +```python +def takes_repo_dir( + repo_directory: op_mapping_extract_value(AliceGitRepo.directory), + *, + logger: logging.Logger, + env: dict, +) -> None: + if logger: + logger.debug(f"{repo_directory} logged! (already logged if orchestrator input called, ex: GitHub Action DEBUGing enabled)") +``` + +- https://github.com/OpenLineage/OpenLineage/issues/1412 + - Was just trying to figure out how to do this with webtorrent and activitypub this morning, oh! Chaos smiles again!
:) :) +- https://github.com/OpenLineage/OpenLineage/releases/tag/0.19.2 + - Grouped by category + - https://github.com/OpenLineage/OpenLineage/pull/1432/files#diff-c28f070ad0fa67a71f138b6c4b1302bfa0640bad2a44f1ca847b6170080d14fb + - https://github.com/OpenLineage/OpenLineage/tree/main/integration + - https://github.com/OpenLineage/OpenLineage/tree/main/integration/sql + - https://github.com/intel/dffml/tree/main/source/mysql + - Just use mermaid + - Flat files, markdown docs +- Fixing webhook `vcs.push` to ActivityPub + +```bash +npm run build +rm -i db/database.sqlite3 +head -n 10000 /dev/urandom | sha384sum | awk '{print $1}' | tee ../webhook +head -n 10000 /dev/urandom | sha384sum | awk '{print $1}' | tee ../password +openssl genrsa -out keypair.pem 4096 && openssl rsa -in keypair.pem -pubout -out publickey.crt && openssl pkcs8 -topk8 -inform PEM -outform PEM -nocrypt -in keypair.pem -out pkcs8.key +FDQN=vcs.activitypub.securitytxt.dffml.chadig.com WEBHOOK_PATH=$(cat ../webhook) NODE_ENV=production PORT=8000 ACCOUNT=push ADMIN_USERNAME=admin ADMIN_PASSWORD=$(cat ../password) PUBLIC_KEY=$(cat publickey.crt) PRIVATE_KEY=$(cat pkcs8.key) npm run start & +caddy reverse-proxy --from https://vcs.activitypub.securitytxt.dffml.chadig.com --to :8000 +``` + +- Ensure webhook delivery for the following events + - Related + - https://github.com/intel/dffml/pull/1061#discussion_r1095079133 + - **TODO** Alice using GH cli to do this + - https://github.com/intel/dffml/pull/1061#discussion_r819930461 + - https://github.com/intel/dffml/pull/1207#discussion_r1036680987 + - > Alice is you. What do you have access too? + - Workflow jobs + - Workflow job queued, waiting, in progress, or completed on a repository. + - Workflow runs + - Workflow run requested or completed on a repository. + - Statuses + - Commit status updated from the API. + - Pushes + - Git push to a repository. + - Deployment statuses + - Deployment status updated from the API. 
+ - Check suites + - Check suite is requested, rerequested, or completed. + - Check runs + - Check run is created, requested, rerequested, or completed. + - Branch or tag creation + - Branch or tag created. + - Commit comments + - Commit or diff commented on. + - Discussions + - Discussion created, edited, pinned, unpinned, locked, unlocked, transferred, answered, unanswered, labeled, unlabeled, had its category changed, or was deleted. + - Issues + - Issue opened, edited, deleted, transferred, pinned, unpinned, closed, reopened, assigned, unassigned, labeled, unlabeled, milestoned, demilestoned, locked, or unlocked. + - Issue comments + - Issue comment created, edited, or deleted. + - Packages + - GitHub Packages published or updated in a repository. + - Milestones + - Milestone created, closed, opened, edited, or deleted. + - Page builds + - Pages site built. + - Pull request review comments + - Pull request diff comment created, edited, or deleted. + - Pull request review threads + - A pull request review thread was resolved or unresolved. + - Pull request reviews + - Pull request review submitted, edited, or dismissed. + - Pull requests + - Pull request assigned, auto merge disabled, auto merge enabled, closed, converted to draft, demilestoned, dequeued, edited, enqueued, labeled, locked, milestoned, opened, ready for review, reopened, review request removed, review requested, synchronized, unassigned, unlabeled, or unlocked. + - Pushes + - Git push to a repository. + - Releases + - Release created, edited, published, unpublished, or deleted. 
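The event list above plus the earlier "GitOps allowlist ... fail-to-ban style" TODO suggest a small gate in front of the ActivityPub bridge. The sketch below assumes the GitHub `X-GitHub-Event` header names; the allowlist contents are illustrative, not a complete policy.

```python
# Minimal sketch of an event-type allowlist applied to incoming webhook
# deliveries before they are forwarded to the ActivityPub outbox.
# The set below covers only a few of the event types listed above.
ALLOWED_EVENTS = {
    "push",
    "check_suite",
    "check_run",
    "workflow_run",
    "workflow_job",
    "status",
    "deployment_status",
    "release",
}


def should_forward(event_type: str) -> bool:
    """Return True if a delivery with this X-GitHub-Event value should federate."""
    return event_type in ALLOWED_EVENTS


print(should_forward("push"), should_forward("fork"))
```

An allowlist (rather than fail-to-ban) means new event types stay local until someone deliberately opts them into federation.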
+- Retrigger webhook delivery + +![image](https://user-images.githubusercontent.com/5950433/216702932-365a8ed4-a949-4113-8d86-8e03181b532e.png) + +```console +$ curl -sfL https://vcs.activitypub.securitytxt.dffml.chadig.com/push/outbox | jq --unbuffered -r '.orderedItems[].object.content' | jq | python -c 'import yaml, json, sys; print(yaml.dump(json.load(sys.stdin)))' +``` + +- This is an example of a check suite completion. Yesterday we touched on how 2nd party PRs could have interdependencies via jobs which watch for `ActivityPub` events such as the `check_suite` example we see here. + - Was trying to figure out the webtorrent thing in case there were sets of events that we wanted to watch, and the torrent magnet link could be the content address of the set, but that will probably be solved by DID resolution of ActivityPub objects later. + +```yaml +action: completed +check_suite: + after: ddb32a4e65b0d79c7561ce2bdde16d963c8abde1 + app: + created_at: 2018-07-30T09:30:17Z + description: Automate your workflow from idea to production + events: + - branch_protection_rule + - check_run + - check_suite + - create + - delete + - deployment + - deployment_status + - discussion + - discussion_comment + - fork + - gollum + - issues + - issue_comment + - label + - merge_group + - milestone + - page_build + - project + - project_card + - project_column + - public + - pull_request + - pull_request_review + - pull_request_review_comment + - push + - registry_package + - release + - repository + - repository_dispatch + - status + - watch + - workflow_dispatch + - workflow_run + external_url: https://help.github.com/en/actions + html_url: https://github.com/apps/github-actions + id: 15368 + name: GitHub Actions + node_id: MDM6QXBwMTUzNjg= + owner: + avatar_url: https://avatars.githubusercontent.com/u/9919?v=4 + events_url: https://api.github.com/users/github/events{/privacy} + followers_url: https://api.github.com/users/github/followers + following_url: 
https://api.github.com/users/github/following{/other_user} + gists_url: https://api.github.com/users/github/gists{/gist_id} + gravatar_id: "" + html_url: https://github.com/github + id: 9919 + login: github + node_id: MDEyOk9yZ2FuaXphdGlvbjk5MTk= + organizations_url: https://api.github.com/users/github/orgs + received_events_url: https://api.github.com/users/github/received_events + repos_url: https://api.github.com/users/github/repos + site_admin: false + starred_url: https://api.github.com/users/github/starred{/owner}{/repo} + subscriptions_url: https://api.github.com/users/github/subscriptions + type: Organization + url: https://api.github.com/users/github + permissions: + actions: write + administration: read + checks: write + contents: write + deployments: write + discussions: write + issues: write + merge_queues: write + metadata: read + packages: write + pages: write + pull_requests: write + repository_hooks: write + repository_projects: write + security_events: write + statuses: write + vulnerability_alerts: read + slug: github-actions + updated_at: 2019-12-10T19:04:12Z + before: a6ec904d3b319de1fcb25bf6f724fd70dc057884 + check_runs_url: https://api.github.com/repos/intel/dffml/check-suites/10754865120/check-runs + conclusion: success + created_at: 2023-02-03T06:01:42Z + head_branch: main + head_commit: + author: + email: johnandersenpdx@gmail.com + name: John Andersen + committer: + email: noreply@github.com + name: GitHub + id: ddb32a4e65b0d79c7561ce2bdde16d963c8abde1 + message: "ci: dispatch: build; images; container: Fixup manifest if bad line + endings" + timestamp: 2023-01-16T19:10:53Z + tree_id: 2d5e1a8c29d57406ee4302482db455addc6bc224 + head_sha: ddb32a4e65b0d79c7561ce2bdde16d963c8abde1 + id: 10754865120 + latest_check_runs_count: 1 + node_id: CS_kwDOCOlgGM8AAAACgQo34A + pull_requests: [] + rerequestable: true + runs_rerequestable: false + status: completed + updated_at: 2023-02-03T06:01:59Z + url: 
https://api.github.com/repos/intel/dffml/check-suites/10754865120 +organization: + avatar_url: https://avatars.githubusercontent.com/u/17888862?v=4 + description: "" + events_url: https://api.github.com/orgs/intel/events + hooks_url: https://api.github.com/orgs/intel/hooks + id: 17888862 + issues_url: https://api.github.com/orgs/intel/issues + login: intel + members_url: https://api.github.com/orgs/intel/members{/member} + node_id: MDEyOk9yZ2FuaXphdGlvbjE3ODg4ODYy + public_members_url: https://api.github.com/orgs/intel/public_members{/member} + repos_url: https://api.github.com/orgs/intel/repos + url: https://api.github.com/orgs/intel +repository: + allow_forking: true + archive_url: https://api.github.com/repos/intel/dffml/{archive_format}{/ref} + archived: false + assignees_url: https://api.github.com/repos/intel/dffml/assignees{/user} + blobs_url: https://api.github.com/repos/intel/dffml/git/blobs{/sha} + branches_url: https://api.github.com/repos/intel/dffml/branches{/branch} + clone_url: https://github.com/intel/dffml.git + collaborators_url: https://api.github.com/repos/intel/dffml/collaborators{/collaborator} + comments_url: https://api.github.com/repos/intel/dffml/comments{/number} + commits_url: https://api.github.com/repos/intel/dffml/commits{/sha} + compare_url: https://api.github.com/repos/intel/dffml/compare/{base}...{head} + contents_url: https://api.github.com/repos/intel/dffml/contents/{+path} + contributors_url: https://api.github.com/repos/intel/dffml/contributors + created_at: 2018-09-19T21:06:34Z + default_branch: main + deployments_url: https://api.github.com/repos/intel/dffml/deployments + description: The easiest way to use Machine Learning. Mix and match underlying + ML libraries and data set sources. Generate new datasets or modify existing + ones with ease. 
+ disabled: false + downloads_url: https://api.github.com/repos/intel/dffml/downloads + events_url: https://api.github.com/repos/intel/dffml/events + fork: false + forks: 146 + forks_count: 146 + forks_url: https://api.github.com/repos/intel/dffml/forks + full_name: intel/dffml + git_commits_url: https://api.github.com/repos/intel/dffml/git/commits{/sha} + git_refs_url: https://api.github.com/repos/intel/dffml/git/refs{/sha} + git_tags_url: https://api.github.com/repos/intel/dffml/git/tags{/sha} + git_url: git://github.com/intel/dffml.git + has_discussions: true + has_downloads: true + has_issues: true + has_pages: true + has_projects: true + has_wiki: true + homepage: https://intel.github.io/dffml/main/ + hooks_url: https://api.github.com/repos/intel/dffml/hooks + html_url: https://github.com/intel/dffml + id: 149512216 + is_template: false + issue_comment_url: https://api.github.com/repos/intel/dffml/issues/comments{/number} + issue_events_url: https://api.github.com/repos/intel/dffml/issues/events{/number} + issues_url: https://api.github.com/repos/intel/dffml/issues{/number} + keys_url: https://api.github.com/repos/intel/dffml/keys{/key_id} + labels_url: https://api.github.com/repos/intel/dffml/labels{/name} + language: Python + languages_url: https://api.github.com/repos/intel/dffml/languages + license: + key: mit + name: MIT License + node_id: MDc6TGljZW5zZTEz + spdx_id: MIT + url: https://api.github.com/licenses/mit + merges_url: https://api.github.com/repos/intel/dffml/merges + milestones_url: https://api.github.com/repos/intel/dffml/milestones{/number} + mirror_url: null + name: dffml + node_id: MDEwOlJlcG9zaXRvcnkxNDk1MTIyMTY= + notifications_url: https://api.github.com/repos/intel/dffml/notifications{?since,all,participating} + open_issues: 387 + open_issues_count: 387 + owner: + avatar_url: https://avatars.githubusercontent.com/u/17888862?v=4 + events_url: https://api.github.com/users/intel/events{/privacy} + followers_url: 
https://api.github.com/users/intel/followers + following_url: https://api.github.com/users/intel/following{/other_user} + gists_url: https://api.github.com/users/intel/gists{/gist_id} + gravatar_id: "" + html_url: https://github.com/intel + id: 17888862 + login: intel + node_id: MDEyOk9yZ2FuaXphdGlvbjE3ODg4ODYy + organizations_url: https://api.github.com/users/intel/orgs + received_events_url: https://api.github.com/users/intel/received_events + repos_url: https://api.github.com/users/intel/repos + site_admin: false + starred_url: https://api.github.com/users/intel/starred{/owner}{/repo} + subscriptions_url: https://api.github.com/users/intel/subscriptions + type: Organization + url: https://api.github.com/users/intel + private: false + pulls_url: https://api.github.com/repos/intel/dffml/pulls{/number} + pushed_at: 2023-01-30T22:16:14Z + releases_url: https://api.github.com/repos/intel/dffml/releases{/id} + size: 602690 + ssh_url: git@github.com:intel/dffml.git + stargazers_count: 201 + stargazers_url: https://api.github.com/repos/intel/dffml/stargazers + statuses_url: https://api.github.com/repos/intel/dffml/statuses/{sha} + subscribers_url: https://api.github.com/repos/intel/dffml/subscribers + subscription_url: https://api.github.com/repos/intel/dffml/subscription + svn_url: https://github.com/intel/dffml + tags_url: https://api.github.com/repos/intel/dffml/tags + teams_url: https://api.github.com/repos/intel/dffml/teams + topics: + - ai-inference + - ai-machine-learning + - ai-training + - analytics + - asyncio + - dag + - data-flow + - dataflows + - datasets + - dffml + - event-based + - flow-based-programming + - frameworks + - hyperautomation + - libraries + - machine-learning + - models + - pipelines + - python + - swrepo + trees_url: https://api.github.com/repos/intel/dffml/git/trees{/sha} + updated_at: 2023-01-17T12:33:57Z + url: https://api.github.com/repos/intel/dffml + visibility: public + watchers: 201 + watchers_count: 201 + 
web_commit_signoff_required: false +sender: + avatar_url: https://avatars.githubusercontent.com/u/5950433?v=4 + events_url: https://api.github.com/users/pdxjohnny/events{/privacy} + followers_url: https://api.github.com/users/pdxjohnny/followers + following_url: https://api.github.com/users/pdxjohnny/following{/other_user} + gists_url: https://api.github.com/users/pdxjohnny/gists{/gist_id} + gravatar_id: "" + html_url: https://github.com/pdxjohnny + id: 5950433 + login: pdxjohnny + node_id: MDQ6VXNlcjU5NTA0MzM= + organizations_url: https://api.github.com/users/pdxjohnny/orgs + received_events_url: https://api.github.com/users/pdxjohnny/received_events + repos_url: https://api.github.com/users/pdxjohnny/repos + site_admin: false + starred_url: https://api.github.com/users/pdxjohnny/starred{/owner}{/repo} + subscriptions_url: https://api.github.com/users/pdxjohnny/subscriptions + type: User + url: https://api.github.com/users/pdxjohnny +``` + +- Wow, 185 events already + +```console +$ curl -sfL https://vcs.activitypub.securitytxt.dffml.chadig.com/push/outbox | jq --unbuffered -r '.orderedItems[].object.content' | wc -l +173 +$ date +Fri Feb 3 20:56:44 UTC 2023 +``` + +- Now we want to translate to OpenVEX and include the content addresses of the signature for the post + - https://github.com/package-url/purl-spec + - https://github.com/openvex/spec/blob/main/OPENVEX-SPEC.md#example + +```json +{ + "@context": "https://openvex.dev/ns", + "@id": "https://vcs.activitypub.securitytxt.dffml.chadig.com/push/posts/vex-", + "author": "GitHub Actions ", + "role": "GitHub Actions", + "timestamp": "2023-02-02T14:24:00.000000000-07:00", + "version": "1", + "statements": [ + { + "vulnerability": "vex-", + "products": [ + "pkg:github/intel/dffml@ddb32a4e65b0d79c7561ce2bdde16d963c8abde1" + ], + "status": "not_affected", + "justification": "vulnerable_code_not_in_execute_path", + "impact_statement": "" + } + ] +} +``` + +- Quick post count check + +```console +$ curl -sfL 
https://vcs.activitypub.securitytxt.dffml.chadig.com/push/outbox | jq --unbuffered -r '.orderedItems[].object.content' | wc -l +406 +$ date +Fri Feb 3 22:27:11 UTC 2023 +``` + +- https://blog.adolus.com/a-deeper-dive-into-vex-documents +- Check the modified files webhook data + - The following should be the same over an active websocket connection + +```console +$ curl -sfL https://vcs.activitypub.securitytxt.dffml.chadig.com/push/outbox | jq --unbuffered -r '.orderedItems[].object.content' | grep stream_of | grep modified | jq +``` + +```json +{ + "sender": { + "login": "pdxjohnny", + "id": 5950433, + "node_id": "MDQ6VXNlcjU5NTA0MzM=", + "avatar_url": "https://avatars.githubusercontent.com/u/5950433?v=4", + "gravatar_id": "", + "url": "https://api.github.com/users/pdxjohnny", + "html_url": "https://github.com/pdxjohnny", + "followers_url": "https://api.github.com/users/pdxjohnny/followers", + "following_url": "https://api.github.com/users/pdxjohnny/following{/other_user}", "gists_url": "https://api.github.com/users/pdxjohnny/gists{/gist_id}", "starred_url": "https://api.github.com/users/pdxjohnny/starred{/owner}{/repo}", + "subscriptions_url": "https://api.github.com/users/pdxjohnny/subscriptions", + "organizations_url": "https://api.github.com/users/pdxjohnny/orgs", + "repos_url": "https://api.github.com/users/pdxjohnny/repos", + "events_url": "https://api.github.com/users/pdxjohnny/events{/privacy}", "received_events_url": "https://api.github.com/users/pdxjohnny/received_events", "type": "User", + "site_admin": false + }, + "created": false, + "deleted": false, + "forced": false, + "base_ref": null, + "compare": "https://github.com/intel/dffml/compare/d77e2f697d80...a5e638884e56", + "commits": [ + { + "id": "a5e638884e565f727ae4fedf91a33b3ce68bcfa9", + "tree_id": "9137977afec12d9f9bb3a76eac62158648f51d36", + "distinct": true, + "message": "docs: tutorials: rolling alice: architecting alice: stream of consciousness: Link to activitypubsecuritytxt\n\nAlice 
Engineering Comms: 2023-02-03 Engineering Logs: https://github.com/intel/dffml/discussions/1406?sort=new#discussioncomment-4863663", + "timestamp": "2023-02-03T12:53:47-08:00", + "url": "https://github.com/intel/dffml/commit/a5e638884e565f727ae4fedf91a33b3ce68bcfa9", + "author": { "name": "John Andersen", "email": "johnandersenpdx@gmail.com", + "username": "pdxjohnny" }, "committer": { "name": "GitHub", + "email": "noreply@github.com", + "username": "web-flow" + }, + "added": [], "removed": [], + "modified": [ + "docs/tutorials/rolling_alice/0000_architecting_alice/0005_stream_of_consciousness.md" + ] + } + ], + "head_commit": { + "id": "a5e638884e565f727ae4fedf91a33b3ce68bcfa9", + "tree_id": "9137977afec12d9f9bb3a76eac62158648f51d36", + "distinct": true, + "message": "docs: tutorials: rolling alice: architecting alice: stream of consciousness: Link to activitypubsecuritytxt\n\nAlice Engineering Comms: 2023-02-03 Engineering Logs: https://github.com/intel/dffml/discussions/1406?sort=new#discussioncomment-4863663", + "timestamp": "2023-02-03T12:53:47-08:00", "url": "https://github.com/intel/dffml/commit/a5e638884e565f727ae4fedf91a33b3ce68bcfa9", "author": { + "name": "John Andersen", + "email": "johnandersenpdx@gmail.com", + "username": "pdxjohnny" + }, + "committer": { + "name": "GitHub", + "email": "noreply@github.com", + "username": "web-flow" + }, + "added": [], + "removed": [], + "modified": [ + "docs/tutorials/rolling_alice/0000_architecting_alice/0005_stream_of_consciousness.md" + ] + } +} +``` + +```console +$ curl -sfL https://vcs.activitypub.securitytxt.dffml.chadig.com/push/outbox | jq --unbuffered -r '.orderedItems[].object.content' | grep stream_of | grep modified | jq -r --unbuffered '.commits[].modified[]' +docs/tutorials/rolling_alice/0000_architecting_alice/0005_stream_of_consciousness.md +``` + +- https://docs.oasis-open.org/csaf/csaf/v2.0/csaf-v2.0.html +- 
https://github.com/disclose/dnssecuritytxt/blob/c567bdb82fb6a231fd8b162c3d7e7b299aa6088b/README.md + - https://github.dev/disclose/dnssecuritytxt/blob/c567bdb82fb6a231fd8b162c3d7e7b299aa6088b/README.md +- TODO + - [ ] `FROM rebuild` trigger via simple `gh workflow dispatch` on `jq` filter files for relevant activitypub `push@vcs`, xargs to execute on every line (no need to consume input; every line which gets through the filter is rebuilt, websocat) + - When a new image is pushed, instead of interacting with Harbor webhooks, we just update a respective example to pin the version `FROM` to the new version (after the signing has gone to the transparency log) + - [ ] Allowlist for event type properties + - [ ] Data model synthesis from schema + - [ ] Translation to OpenVEX before activitypubsecuritytxt style broadcast + - https://github.com/openvex/spec/blob/main/OPENVEX-SPEC.md#example + - Our payloads go in `impact_statement` + - https://docs.oasis-open.org/csaf/csaf/v2.0/ + - https://docs.oasis-open.org/csaf/csaf/v2.0/os/schemas/aggregator_json_schema.json + - https://docs.oasis-open.org/csaf/csaf/v2.0/os/schemas/provider_json_schema.json + - https://docs.oasis-open.org/csaf/csaf/v2.0/os/schemas/csaf_json_schema.json + - Payload in `document.acknowledgments[].urls[]` + - [ ] Need self-hostable localhost.run style rotation for downstreams + - [ ] `dffml-model-transformers` as first example 2nd party + - Rebuild the downstream container when we get a VEX (via ActivityPub) from upstream saying that any of the files we want to watch have changed + - At first we will just watch all files within the downstream container build workflow + - `on.workflow_dispatch && on.push.paths: ["https://github.com/intel/dffml.git#branch=main/*"]` + - Later we will watch for the example container with the pinned version + - `on.workflow_dispatch && on.push.paths: ["https://github.com/intel/dffml.git#branch=main/dffml/util/skel/common/Dockerfile"]` + - `dffml/util/skel/common/Dockerfile` + - `FROM 
registry.dffml.org/dffml:sha256@babebabe` +- Future + - [ ] Template Dockerfiles `FROM` using dataflows and `Inputs` stored in files which are loaded and cached using native caching semantics per orchestrator (deployment). + - Example native caching semantics, using `paths`, as seen in https://github.com/actions/cache \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0168/index.md b/docs/discussions/alice_engineering_comms/0168/index.md new file mode 100644 index 0000000000..eebc90d68a --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0168/index.md @@ -0,0 +1 @@ +# 2023-02-04 Engineering Logs \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0168/reply_0000.md b/docs/discussions/alice_engineering_comms/0168/reply_0000.md new file mode 100644 index 0000000000..4a573e4afe --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0168/reply_0000.md @@ -0,0 +1,21 @@ +- https://en.m.wikipedia.org/wiki/Mandaeism + - https://en.m.wikipedia.org/wiki/Kabbalah +- https://doc.anagora.org/vex?edit +- https://github.com/flancian/garden/blob/master/index.md +- https://github.com/flancian/agora-bridge +- https://flancia.org/agora/ +- https://notes.knowledgefutures.org/pub/belji1gd/release/2 + - What is a Distributed Knowledge Graph? + - > Humans are so adept at context-switching that we give ourselves the illusion of having a single big ontology. Our goal is to build a large-scale data system that is so adept at context-switching that it gives the illusion of being a knowledge graph. 
+ - **ALIGNED** +- https://github.com/veronica320/Faithful-COT +- https://docs.google.com/presentation/d/1GxKN5tyv4lV2aZdEOUqy3R9tVCat-vrFJyelgFX7b1A/edit + - https://github.com/aurae-runtime/aurae + - https://github.com/denoland/deno + - TypeScript and JavaScript runtime +- https://ariadne.space/2022/12/03/building-fair-webs-of-trust-by-leveraging-the-ocap-model/ + - Sounds like Ariadne is looking at something called rapunzel similar to our activitypubsecuritytxt with SCITT + - https://social.treehouse.systems/@ariadne/109806386526949984 + - https://social.treehouse.systems/@ariadne/109808644259234008 + - Rapunzel ETA < 3 weeks +- https://talk.fission.codes/t/nns-the-name-name-system/3684 \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0169/index.md b/docs/discussions/alice_engineering_comms/0169/index.md new file mode 100644 index 0000000000..0ecc43b3d9 --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0169/index.md @@ -0,0 +1 @@ +# 2023-02-05 Engineering Logs \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0170/index.md b/docs/discussions/alice_engineering_comms/0170/index.md new file mode 100644 index 0000000000..345e3d9544 --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0170/index.md @@ -0,0 +1 @@ +# 2023-02-06 Engineering Logs \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0170/reply_0000.md b/docs/discussions/alice_engineering_comms/0170/reply_0000.md new file mode 100644 index 0000000000..2d9a6e1a5a --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0170/reply_0000.md @@ -0,0 +1,89 @@ +## 2023-02-06 @pdxjohnny Engineering Logs + +- https://huggingface.co/BridgeTower/bridgetower-base +- https://github.com/isl-org/generalized-smoothing +- https://open.substack.com/pub/cameronrwolfe/p/imap-modeling-3d-scenes-in-real-time + - Streaming code/recent context-to-context transitions -> Alice Vol 1/2 Cartography + - Tick/tock 
context equilibrium for strategic principles for all active strategic plans / subcontexts of top level system context (strategic plan good/bad, go/nogo) + - A Shell for A Ghost + - Stream of consciousness inference + - Avoiding bad paths by preemptive subliminal good path recommendations + - Example: Type ahead text completion which validates intent in a dynamic context aware way. Could end up rewriting back as it learns more intent with each word. Intent requires context to capture. Execution of hypothesised paths via our shared CI/CD and AI comms unification (#569 but as infra used for streaming, SSI/DWN, infra as protocol) + - whisper | grep engineeringlogstreams 'Context aware communication' +- https://social.treehouse.systems/@ariadne/109808644259234008 + - > an exciting idea would be to provide a mapping interface between cryptographic identifier (DID) and petname using something akin to bittorrent DHT. i see no reason why kademlia could not support that. [...] conceptually, you can think of rapunzel's ring logs in a lot of the same ways as you might think of git. they are heavily inspired by git. + - How do we get these approaches to be interoperable? How similar are they? Can we just go right to that? Is it still worth providing a path for that migration from Fediverse to OS DecentrAlice? If we can do everything as DID and VC then SSI service +/ DWN should be enough for streaming the knowledge graph. + - Our plan was to hybridize endor with activitypubsecuritytxt + - This seems like the right plan still + - Let's do this as the follow-on to the 2nd party split out + - Ariadne is at Chainguard so perhaps will have a sigstore/rekor based implementation; our goal is to drive interoperability between that and the SCITT model. Being able to jump from rekor (centralized) to SCITT (decentralized) enables simplified dev/test/ci/cd setups between entities and orgs due to the ability to graph trust chains into respective environments (think cert pinning). 
+ - This allows for the creation of per system context trust chains + - #1400 +- #1315 + - Alice is fundamentally about closing the feedback loop within a decentralized supply chain. + - Ensuring that it's a secure feedback loop + - [2020-12-08: examples: swportal: Add example (in 0.4.0)](https://github.com/intel/dffml/commit/2e42032e0a0872ef75a0920578746d0880b9cb70) + - This frontend effectively becomes fed by the same graphs that feed Alice's Analysis which happens at the center of the Entity Analysis Trinity + - This is our mental model, our UI + - This UI becomes integrated as needed + - Context aware communication based on inference intent + - On demand supply chain fulfilment to that intent +- Versioned learning checkpoints via graph query plus schema validation pass (or open policy agent for policy manifest ADRs) +- https://slsa.dev/spec/v0.1/threats +- For a registry of PyPI packages across 2nd party plugins, PR builds need container build manifests running builds with alternate PyPI registries applicable to the graphed contexts relevant to downstream flows. 
+ +**registry_manifest_build_args.json** + +```json +[ + [ + "PYPI_REGISTRY", + "https://localhost.run/temp/" + ] +] +``` + +```console +$ BUILD_ARGS=$(jq .inputs.build_args < "${GITHUB_EVENT_PATH}" | jq -r | jq -r '.[] | ("--build-arg " + .[0] + "=" + .[1])') +$ BUILD_ARGS=$(jq -r '.[] | ("--build-arg " + .[0] + "=" + .[1])' < registry_manifest_build_args.json) +$ python -c 'import sys; print(sys.argv)' $BUILD_ARGS +['-c', '--build-arg', 'PYPI_REGISTRY=https://localhost.run/temp/'] +``` + +- TODO + - [ ] **TODAY** https://blogs.python-gsoc.org/accounts/login/?next=/en/suborg/application/new/ + - [ ] https://github.com/pdxjohnny/activitypubsecuritytxt based CD + - [ ] https://botsin.space/@agora + - Agora to view instead of openlineage + - https://github.com/flancian/agora-server + - https://github.com/flancian/agora#welcome-to-the-agora-v05 + - https://github.com/flancian/agora-bridge + - https://github.com/flancian/agora-bridge/tree/main/bots/mastodon +- Future + - [ ] https://time.crystals.prophecy.chadig.com + - Respond to Orie https://twitter.com/OR13b/status/1621907110572310528 + - Actor `acquire` + - `attachments` `Link` to `activitypubextensions` thread + - `content: "activitypubextensions"` thread + - `inReplyTo: "$activitypubextensions_thread", content: "https://time.crystals.prophecy.chadig.com/bulk.1.0.0.schema.json"` thread + - This becomes analogous to shared stream of consciousness uniform API for submitting across contexts (Manifests). + - CI/CD across projects with different orchestrators for downstream validation of the 2nd and 3rd party plugin ecosystem. 
+ - This facilitates communication across pipelines across repos across PRs so we can use versioned learning to promote across trust boundaries (3rd party to 2nd party or support level 2 to 1) + - #1207 + - #1315 + - Alice helps us see risk over time. This is where we see Coach Alice: cartography applied to dev branches. We grow closer to distributed compute with this, as iteration time is on dev branches rather than release or main + - This will probably be part of Alice and the Health of the Ecosystem + - Ask him to reply to `@acquire@time.crystals.prophecy.chadig.com` + - Thoughts OR13b? + - ActivityPub Actor watches for messages replying to certain threads + - https://github.com/pdxjohnny/activitypubsecuritytxt + - Actor creates pull request to https://github.com/OR13/endor style repo + - Actor creates didme.me and gets VC SCITT receipt for associated `did:pwk:` (committed into an Endor fork; he used git as a database) + - This could also be our content address of something in oras.land + - In the AI training data/human case we see the input data (meme) validated via SCITT + - We want to enable application of policy to data set ingestion, because this will happen in MLOps aka CI/CD + - Workstream: AI Ethics + - In the CI/CD use case, we see the input data (manifest referenced content, images, packages, metrics data output `FROM scratch` OpenSSF metrics use case) validated via SCITT. + - Later we build up the threat modeling for the dynamic analysis portion of Alice which plays with input data as changes to repos and connects more of our Data, Analysis, Control for the software development process. + - Actor replies to Orie's reply with his receipt for his time crystals. + - For k8s style or OS DecentrAlice style deployments (OSS scanning feeding OpenSSF metrics) we could run the graphed trust / event chain to a sidecar ActivityPub Actor / root of trust. 
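The watch-for-replies idea above can be sketched as a filter over an outbox's `orderedItems`, mirroring the `curl | jq` pipelines used earlier in these logs (the ActivityStreams OrderedCollection shape is assumed; `replies_to` is illustrative, not an existing helper):

```python
# Sketch: select the content of every outbox item replying to a given thread,
# assuming the ActivityStreams OrderedCollection shape seen in the
# curl | jq pipelines earlier in these logs. Illustrative only.
def replies_to(outbox: dict, thread_url: str) -> list:
    # Keep only items whose object is a reply to the thread being watched
    return [
        item["object"]["content"]
        for item in outbox.get("orderedItems", [])
        if item.get("object", {}).get("inReplyTo") == thread_url
    ]
```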
\ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0170/reply_0001.md b/docs/discussions/alice_engineering_comms/0170/reply_0001.md new file mode 100644 index 0000000000..c20d2a0ea8 --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0170/reply_0001.md @@ -0,0 +1,3 @@ +## 2023-02-06 SCITT + +- https://github.com/ietf-scitt/threat-model/blob/main/draft-threat-model.md \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0170/reply_0002.md b/docs/discussions/alice_engineering_comms/0170/reply_0002.md new file mode 100644 index 0000000000..885936d190 --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0170/reply_0002.md @@ -0,0 +1,51 @@ +## 2023-02-06 Exporting Groovy Functions + +- 1:1 Pankaj/John + +```console +$ git remote -v +origin https://github.com/owner/repository +$ git status +On branch branch_name +``` + +- `origin/branch_name` -> https://github.com/owner/repository/blob/branch_name +- https://github.com/intel/dffml/issues/1433 + + +```mermaid +graph LR + + subgraph AliceShouldIContribute + repo_directory + subgraph examples_operations[dffml.git examples.operations] + repo_directory --> python_parse_ast + end + python_parse_ast --> python_ast_module_scope_exported + python_ast_module_scope_exported --> python_functions + end + + subgraph KnowledgeGraph[Rapunzel/ActivityPubSecurityTxt] + record[Repo] + subgraph features + python_functions -->|list of all outputs from all executions populates| PythonFunctions + end + + record --> PythonFunctions + end + + subgraph ContextRender + versioned_learning -->|List of granular items within record, docs| granular_inventory_items + + granular_inventory_items -->|itertools.chain list of items discovered within each item, sections within docs, features.python_ast_exports_analogous_to_dffml_init| record + end +``` + +- TODO + - [x] Pull request DFFML + - https://github.com/intel/dffml/pull/1432 + - [x] Merge PR + - [x] 
Rebuild container + - https://github.com/intel/dffml/blob/main/.github/workflows/dffml_build_images_containers.yml + - [x] Kick off run single + - `alice shouldi contribute -keys https://github.com/jenkinsci/kubernetes-plugin` \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0171/index.md b/docs/discussions/alice_engineering_comms/0171/index.md new file mode 100644 index 0000000000..818fdea062 --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0171/index.md @@ -0,0 +1 @@ +# 2023-02-07 Engineering Logs \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0171/reply_0000.md b/docs/discussions/alice_engineering_comms/0171/reply_0000.md new file mode 100644 index 0000000000..044e87e699 --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0171/reply_0000.md @@ -0,0 +1,17 @@ +- rate of epiphany +- https://github.com/pdxjohnny/activitypub-starter-kit/actions/runs/4118879254/jobs/7111923509 + - Clean build + - Container image build manifest below + +```json +[ + { + "branch": "alternate_port", + "commit": "630b2e8173db807efa879845831d7020e06d55fe", + "dockerfile": "activitypubstarterkit.Dockerfile", + "image_name": "activitypubstarterkit", + "owner": "pdxjohnny", + "repository": "activitypub-starter-kit" + } +] +``` \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0172/index.md b/docs/discussions/alice_engineering_comms/0172/index.md new file mode 100644 index 0000000000..955a570fe5 --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0172/index.md @@ -0,0 +1,6 @@ +# 2023-02-08 Engineering Logs + +- https://community.intel.com/t5/Blogs/Tech-Innovation/open-intel/Meet-a-New-Voice-for-Open-Source-Open-at-Intel-Podcast/post/1449811 + - > The series starts by laying some groundwork with topics like threat modeling and software supply chain security, then builds on that to discuss interesting projects and learn about organizations doing the work to push open 
source security forward. + - https://openatintel.podbean.com/e/threat-modeling-down-the-rabbit-hole/ + - [episode-1-promo-slide-threat-modeling-down-the-rabbit-hole](https://user-images.githubusercontent.com/5950433/217665988-9fabfd68-786b-444e-9c69-db5b333d9a10.png) \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0172/reply_0000.md b/docs/discussions/alice_engineering_comms/0172/reply_0000.md new file mode 100644 index 0000000000..6f3c94d658 --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0172/reply_0000.md @@ -0,0 +1,46 @@ +## 2023-02-08 @pdxjohnny Engineering Logs + +```console +$ curl -sfL https://vcs.activitypub.securitytxt.dffml.chadig.com/push/outbox | jq --unbuffered -r '.orderedItems[].object.content' | wc -l +5277 +$ curl -sfL https://vcs.activitypub.securitytxt.dffml.chadig.com/push/outbox | jq --unbuffered -r '.orderedItems[].object.content' | grep stream_of | grep modified | jq -r --unbuffered '.commits[].modified[]' +docs/tutorials/rolling_alice/0000_architecting_alice/0005_stream_of_consciousness.md +jq: error (at :2): Cannot iterate over null (null) +docs/tutorials/rolling_alice/0000_architecting_alice/0005_stream_of_consciousness.md +jq: error (at :4): Cannot iterate over null (null) +jq: error (at :5): Cannot iterate over null (null) +jq: error (at :6): Cannot iterate over null (null) +$ curl -sfL https://vcs.activitypub.securitytxt.dffml.chadig.com/push/outbox | jq --unbuffered -r '.orderedItems[].object.content' | grep stream_of | grep modified | jq -r --unbuffered '.commits[].modified[]' 2>/dev/null +docs/tutorials/rolling_alice/0000_architecting_alice/0005_stream_of_consciousness.md +docs/tutorials/rolling_alice/0000_architecting_alice/0005_stream_of_consciousness.md +$ curl -sfL https://vcs.activitypub.securitytxt.dffml.chadig.com/push/outbox | jq --unbuffered -r '.orderedItems[].object.content' | grep modified | jq -r --unbuffered '.commits[].modified[]' 2>/dev/null 
+docs/tutorials/rolling_alice/0000_architecting_alice/0005_stream_of_consciousness.md +entities/alice/alice/shouldi/contribute/cicd.py +docs/tutorials/rolling_alice/0000_architecting_alice/0005_stream_of_consciousness.md +operations/innersource/dffml_operations_innersource/operations.py +.github/workflows/build_images_containers.yml +operations/innersource/dffml_operations_innersource/npm_groovy_lint.py +``` + +- https://docs.github.com/en/repositories/configuring-branches-and-merges-in-your-repository/configuring-pull-request-merges/managing-a-merge-queue +- https://doi.org/10.1016/j.ejor.2020.12.054 + - Supply chain game theory network modeling under labor constraints: Applications to the Covid-19 pandemic + - https://intel.github.io/dffml/main/examples/or_covid_data_by_county.html + - > we construct a supply chain game theory network framework that captures labor constraints under three different scenarios. The appropriate equilibrium constructs are defined + > ![Screenshot_20230208-054700_of_conclousion_of_paper_on_supply_chains](https://user-images.githubusercontent.com/5950433/217573307-c85cc3ef-c63f-4bb3-be42-ece63cb602fe.png) + - They are in alignment that a general equilibrium model would be fun +- https://universeodon.com/@georgetakei/109824609861703097 + - https://github.com/intel/dffml/commit/4ef226e2ecd384560d635fa84036003b525ad399 [💊](https://pdxjohnny.github.io/redpill/) + - https://mastodon.social/@pdxjohnny/109456014313438341 + - https://github.com/intel/dffml/tree/alice/docs/arch/alice/discussion/0001/reply_0006.md + - > Someone asked ChatGPT to come up with 10 Commandments for the modern world. I could be guided by these principles. Perhaps a new religion is in order?
+ > ![9e3ac5f3049ee319](https://user-images.githubusercontent.com/5950433/217577363-83e0bcc8-6886-4d01-bce5-dc48d8a31651.png) +- https://intel.github.io/dffml/main/plugins/service/http/cli.html#sources +- https://intel.github.io/dffml/main/plugins/service/http/api.html#id6 +- Kent Beck - Tidy First + - > The motto of Empirical Software Design is (repeat after me), “Software design is an exercise in human relationships.” +- TODO + - [x] Clean CI run + - [ ] Re-enable failing tests after debug + - #1436 + - #1361 \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0173/index.md b/docs/discussions/alice_engineering_comms/0173/index.md new file mode 100644 index 0000000000..9b78b2444b --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0173/index.md @@ -0,0 +1 @@ +# 2023-02-09 Engineering Logs \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0173/reply_0000.md b/docs/discussions/alice_engineering_comms/0173/reply_0000.md new file mode 100644 index 0000000000..5d3fe0fadd --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0173/reply_0000.md @@ -0,0 +1,305 @@ +## 2023-02-09 @pdxjohnny Engineering Logs + +- DWN schedule slip to march again + - https://github.com/TBD54566975/dwn-cli + - https://github.com/TBD54566975/dwn-relay + - Sequence diagram reproduced below + +> ```mermaid +> sequenceDiagram +> autonumber +> participant C as Client +> participant R as DWN Relay +> participant S as Downstream Service +> +> C->>R: DWeb Message +> R->>R: Integrity Checks +> R->>R: Lookup registered handlers +> R->>S: POST /${registered-handler} +> S->>S: Handle Request +> S->>R: HTTP Response +> R->>R: DWM'ify Response +> R->>C: DWMified response +> ``` + +- https://cs.github.com/GoogleContainerTools/kaniko?q=config.json +- https://github.com/GoogleContainerTools/kaniko/blob/fe2413e6e3c8caf943d50cf1d233a561943df1d6/integration/images.go#L240-L243 +- https://github.com/slowy07/mary +- 
https://github.com/intel/dffml/blob/657aed2c05941f4e9e513f6a6e2356df36008609/docs/news/0_4_0_alpha_release.rst + - Continuous deployment tutorials + - *We will expand the tutorials released with 0.4.0 to include deployment behind reverse proxies for multiple projects, including how to setup encryption and authentication in a painless and maintainable way.* + - https://github.com/intel/dffml/blob/main/docs/examples/webhook/webhook.rst +- https://mermaid-js.github.io/mermaid-live-editor/ +- https://github.com/ietf-scitt/use-cases/blob/de2b016b37d6762fba9f5b1bcde96324c67ce25e/openssf_metrics.md#activitypub-extensions-for-securitytxt + - Have been playing with ActivityPub to facilitate handoffs between pipelines across project trust boundaries by consuming the ActivityPub graph and feeding it into caching view databases (mysql, mongo) as appropriate. This should help us ensure we have a stream of lifecycle events from all dependencies via communication of VEX. Chainguard's folks might have another similar way of communicating BOM, VEX + transparency log receipts coming out in the next few weeks (Ariadne's Rapunzel). Decentralized Identifiers will likely be helpful for facilitating mappings across walled gardens. The referenced doc is currently WIP in the SCITT use case repo. + - https://hyperonomy.com/2023/01/23/mapping-the-web-7-0-didcomm-http-architecture-reference-model-to-the-toip-trust-spanning-layer-framework/ looks great, but the comms are why we have to play with ActivityPub now; TBD DWN and some other bits which facilitate relay aren't quite there yet, they just slipped again from Q4 22 to March 2023.
This is why we've been primarily targeting the lower layers of the web7 stack, LDVC2, aka https://w3c.github.io/vc-data-model/. This would allow us to make policy audit of [InnerSource crawler generated metrics](https://intel.github.io/dffml/main/examples/innersource/swportal.html#crawler) effectively queries over the graph of data, which we populate via insertion into the https://oras.land/ style registry (since there are plenty of existing transparency log pull/push checks developed there). Ideally these objects, which are JSON-LD stored in the registry, also have content type appropriate endpoints which proxy to the underlying objects + - https://github.com/transmute-industries/jsonld-to-cypher + - https://github.com/OR13/endor + - Eventually our Eden nodes could attest via SLSA4 provenance by providing receipts from a SCITT registry saying they booted a reasonable software stack, then they self-issue a verifiable credential based SCITT receipt from the VM itself which wraps the receipt from the software stack's source of truth SCITT log using the compute's HSM or equivalent (TPMs for client devices or non-TDX machines). This self-issued receipt then serves as a proof which can be arbitrarily relayed or verified. This forms the foundations of auth in our distributed compute (fast CI/CD, hypotheses from Alice). Basically all data blobs transmitted end up being Verifiable Credentials and the data sits in a registry (later to be accessed via DIDComm or equivalent).
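The outbox-consumption pattern described above (pull the ActivityPub graph, feed push events into caching view databases) can be sketched minimally. The sample payload is inlined and the cache is in-memory SQLite so the sketch is self-contained; a real consumer would fetch `/push/outbox` over HTTP and target the MySQL/MongoDB view databases mentioned above.

```python
import json
import sqlite3

# Shape of the /push/outbox served by the activitypub-starter-kit deployment;
# inlined here instead of fetched over HTTP so the sketch is self-contained.
outbox = {
    "orderedItems": [
        {"object": {"content": json.dumps({"commits": [{"modified": [
            "docs/tutorials/rolling_alice/0000_architecting_alice/0005_stream_of_consciousness.md",
        ]}]})}},
        # Plain-text notes interleave with JSON payloads in the same outbox.
        {"object": {"content": "OUR PROPHECY MUST BE FULFILLED!!!"}},
    ],
}

def cache_modified_files(outbox: dict, db: sqlite3.Connection) -> list:
    """Feed vcs.push events from an ActivityPub outbox into a caching view table."""
    db.execute("CREATE TABLE IF NOT EXISTS modified_files (path TEXT PRIMARY KEY)")
    for item in outbox.get("orderedItems", []):
        content = item.get("object", {}).get("content", "")
        try:
            event = json.loads(content)
        except json.JSONDecodeError:
            continue  # skip non-JSON notes
        for commit in event.get("commits", []):
            for path in commit.get("modified", []):
                db.execute("INSERT OR IGNORE INTO modified_files VALUES (?)", (path,))
    db.commit()
    return [row[0] for row in db.execute("SELECT path FROM modified_files ORDER BY path")]

paths = cache_modified_files(outbox, sqlite3.connect(":memory:"))
print(paths)
```

The `try`/`except` around `json.loads` is the same tolerance the jq pipeline in the 2023-02-08 log achieved with `2>/dev/null`: plain-text notes and JSON payloads share one outbox.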
+ +```mermaid +graph LR + + subgraph vcs_source[Version Controlled Software] + subgraph dffml_vcs_source[dffml.git] + subgraph dffml_vcs_source_security_txt[security.txt] + dffml_vcs_source_security_txt_contact[Contact: https://example.org/dffml] + end + subgraph dffml_vcs_source_dockerfile[dffml.Dockerfile] + dffml_vcs_source_dockerfile_from_base[FROM upstream as dffml] + end + subgraph dffml_vcs_source_dockerfile_example[dffml.example.Dockerfile] + dffml_vcs_source_dockerfile_example_from_base[FROM dffml @ sha:latest] + end + subgraph vcs_source_alice[dffml.git/entities/alice] + subgraph alice_vcs_source_security_txt[security.txt] + alice_vcs_source_security_txt_contact[Contact: https://example.org/alice] + end + subgraph alice_vcs_source_dockerfile[alice.Dockerfile] + alice_vcs_source_dockerfile_from_base[FROM dffml @ sha:latest] + end + subgraph alice_vcs_source_dockerfile_shouldi_contribute[alice_shouldi_contribute.Dockerfile] + alice_vcs_source_dockerfile_shouldi_contribute_from_base[FROM alice @ sha:latest] + subgraph alice_shouldi_contribute[alice shouldi contribute -keys ARG_REPO_URL] + alice_shouldi_contribute_git_clone[git clone ...]
+ alice_shouldi_contribute_read_security_txt[grep Contact: security.txt] + alice_shouldi_contribute_result[Static Analysis Result] + + alice_shouldi_contribute_git_clone --> alice_shouldi_contribute_read_security_txt + dffml_vcs_source_security_txt_contact --> alice_shouldi_contribute_read_security_txt + alice_shouldi_contribute_read_security_txt --> alice_shouldi_contribute_result + end + end + end + end + end + + subgraph schema[Manifest ADRs] + subgraph manifest_build_images_contianers[Build Image Container] + manifest_build_images_contianers_intent[README.md/THREATS.md] + manifest_build_images_contianers_schema[1.0.0.schema.json] + end + end + + subgraph manifest_instances[Manifest Instances] + alice_manifest_build_images_contianers_alice_shouldi_contribute + end + + subgraph transparency_logs[Transparency Logs] + dffml_scitt[dffml.scitt.example.org] + alice_scitt[alice.scitt.example.org] + end + + subgraph factory[Secure Software Factories] + subgraph build_images_contianers[build_images_contianers.yml] + end + + subgraph factory_container_image_registries[Container Image Registry https://oras.land] + subgraph dffml_factory_container_image_registries_project[DFFML Images] + dffml_container_image[dffml:latest] + end + subgraph alice_factory_container_image_registries_project[Alice Images] + alice_container_image[alice:latest] + alice_shouldi_contribute_scan_results[shouldicontribute @ sha384:babebabe] + end + end + + build_images_contianers --> dffml_scitt + build_images_contianers --> alice_scitt + end + + subgraph protocol_knowledge_graph_activity_pub[ActivityPub] + subgraph ActivityPubExtensionsForSecurityTXT[activitypub extensions for security.txt] + subgraph dffml_security_txt_contact[dffml.git/security.txt:Contact] + dffml_actor[ActivityPub Actor - @ dffml @ example.org] + dffml_actor_attachment[Attachment PropertyValue activitypubsecuritytxt] + dffml_activitypubsecuritytxt_root_post[activitypubsecuritytxt root post] + 
dffml_activitypubsecuritytxt_vcs_push[vcs.push root post] + dffml_activitypubsecuritytxt_vcs_push_content[vcs.push content - content address of manifest instance in registry] + + dffml_actor --> dffml_actor_attachment + dffml_actor_attachment -->|Link| dffml_activitypubsecuritytxt_root_post + dffml_activitypubsecuritytxt_vcs_push -->|inReplyTo| dffml_activitypubsecuritytxt_root_post + dffml_activitypubsecuritytxt_vcs_push_content -->|inReplyTo| dffml_activitypubsecuritytxt_vcs_push + end + + subgraph alice_security_txt_contact[dffml.git/entities/alice/security.txt:Contact] + alice_actor[ActivityPub Actor - @ alice @ example.org] + alice_actor_attachment[Attachment PropertyValue activitypubsecuritytxt] + alice_activitypubsecuritytxt_root_post[activitypubsecuritytxt root post] + alice_activitypubsecuritytxt_vcs_push[vcs.push root post] + alice_activitypubsecuritytxt_vcs_push_content[vcs.push content - content address of manifest instance in registry] + + alice_actor --> alice_actor_attachment + alice_actor_attachment -->|Link| alice_activitypubsecuritytxt_root_post + alice_activitypubsecuritytxt_vcs_push -->|inReplyTo| alice_activitypubsecuritytxt_root_post + alice_activitypubsecuritytxt_vcs_push_content -->|inReplyTo| alice_activitypubsecuritytxt_vcs_push + end + end + + alice_actor -->|follow| dffml_actor + end + + subgraph render_knowledge_graph_agora[Agora] + end + + alice_vcs_source_dockerfile_shouldi_contribute + + dffml_vcs_source_security_txt_contact --> dffml_actor + alice_vcs_source_security_txt_contact --> alice_actor + + alice_shouldi_contribute_result --> alice_shouldi_contribute_scan_results + alice_shouldi_contribute_scan_results --> |inReplyTo| dffml_vcs_source_dockerfile_example_from_base + + dffml_container_image --> dffml_vcs_source_dockerfile_example_from_base + alice_container_image --> alice_vcs_source_dockerfile_example_from_base + + dffml_vcs_source_dockerfile_example_from_base --> dffml_activitypubsecuritytxt_vcs_push + 
dffml_activitypubsecuritytxt_vcs_push --> build_images_contianers_trigger + alice_vcs_source_dockerfile_example_from_base --> alice_activitypubsecuritytxt_vcs_push + + alice_shouldi_contribute +``` + +- https://scored.dev/ +- https://dl.acm.org/doi/proceedings/10.1145/3560835 +- https://deepai.org/publication/automatic-security-assessment-of-github-actions-workflows +- https://github.com/Mobile-IoT-Security-Lab/GHAST + - > Also, GHAST needs a running Neo4j server. + - The Open Architecture goal is to provide a methodology around interpretation of data in the graph in alignment with the threat model + - This should help multiple entities pull/push from the knowledge graph + - https://intel.github.io/dffml/main/about.html#philosophy + - Ref: agora + - https://github.com/flancian/agora-bridge/tree/main/bots/mastodon +- https://github.com/node-fetch/node-fetch/issues/79#issuecomment-616127141 +- Below overlay applied to activitypubstarterkit + - Orchestrator: Shell +- https://www.typescriptlang.org/docs/handbook/declaration-files/templates/module-d-ts.html#library-file-layout +- https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Operators/Conditional_Operator + +``` +$ npm run build + +> dumbo@1.0.0 build +> tsc + +src/request.ts:5:24 - error TS7016: Could not find a declaration file for module 'simple-proxy-agent'. '/home/pdxjohnny/activitypub-starter-kit-alice/node_modules/simple-proxy-agent/src/agent.js' implicitly has an 'any' type. 
+ Try `npm i --save-dev @types/simple-proxy-agent` if it exists or add a new declaration (.d.ts) file containing `declare module 'simple-proxy-agent';` + +5 import ProxyAgent from "simple-proxy-agent"; + ~~~~~~~~~~~~~~~~~~~~ + + +Found 1 error in src/request.ts:5 + +$ mkdir node_modules/@types/simple-proxy-agent/ +$ echo "declare module 'simple-proxy-agent';" | tee node_modules/@types/simple-proxy-agent/index.d.ts +$ rm -f db/database.sqlite3; PROTO=https FDQN=3e52664be9f477.lhr.life WEBHOOK_PATH=$(cat ../webhook) NODE_ENV=production PORT=8000 ACCOUNT=alice ADMIN_USERNAME=alice ADMIN_PASSWORD=$(cat ../password) PUBLIC_KEY=$(cat publickey.crt) PRIVATE_KEY=$(cat pkcs8.key) npm run start & + +> dumbo@1.0.0 start +> node build/index.js + +Dumbo listening on port 8000… +GET /alice 200 1354 - 2.711 ms +POST /admin/follow/alice/localhost/8000/http - - - - ms +POST /admin/follow/alice/2b1103fcbfb295.lhr.life/443/https - - - - ms +file:///home/pdxjohnny/activitypub-starter-kit-alice/node_modules/node-fetch/src/index.js:108 + reject(new FetchError(`request to ${request.url} failed, reason: ${error.message}`, 'system', error)); + + +FetchError: request to https://2b1103fcbfb295.lhr.life/alice/inbox failed, reason: connect ETIMEDOUT 146.112.61.108:443 + at ClientRequest. 
(file:///home/pdxjohnny/activitypub-starter-kit-alice/node_modules/node-fetch/src/index.js:108:11) + at ClientRequest.emit (node:events:525:35) + at TLSSocket.socketErrorListener (node:_http_client:494:9) + at TLSSocket.emit (node:events:513:28) + at emitErrorNT (node:internal/streams/destroy:157:8) + at emitErrorCloseNT (node:internal/streams/destroy:122:3) + at processTicksAndRejections (node:internal/process/task_queues:83:21) { + type: 'system', + errno: 'ETIMEDOUT', + code: 'ETIMEDOUT', + erroredSysCall: 'connect' +} +$ curl -ku alice:$(cat ../password) -X POST -v http://localhost:8000/admin/follow/push/vcs.activitypub.securitytxt.dffml.chadig.com/443/https +``` + +```diff +diff --git a/src/request.ts b/src/request.ts +index dca8d23..4aea048 100644 +--- a/src/request.ts ++++ b/src/request.ts +@@ -2,6 +2,7 @@ import crypto from "node:crypto"; + + import type { Request } from "express"; + import fetch from "node-fetch"; ++import ProxyAgent from "simple-proxy-agent"; + import { assert } from "superstruct"; + + import { PRIVATE_KEY } from "./env.js"; +@@ -9,8 +10,13 @@ import { Actor } from "./types.js"; + + /** Fetches and returns an actor at a URL. */ + async function fetchActor(url: string) { ++ const agent = (process.env.https_proxy ? new ProxyAgent(process.env.https_proxy, { ++ tunnel: true, // If true, will tunnel all HTTPS using CONNECT method ++ timeout: 5000, // Time in milli-seconds, to maximum wait for proxy connection to establish ++ }) : null); + const res = await fetch(url, { + headers: { accept: "application/activity+json" }, ++ agent: agent, + }); + + if (res.status < 200 || 299 < res.status) +@@ -46,6 +52,10 @@ export async function send(sender: string, recipient: string, message: object) { + const signature = crypto + .sign("sha256", Buffer.from(data), key) + .toString("base64"); ++ const agent = (process.env.https_proxy ? 
new ProxyAgent(process.env.https_proxy, { ++ tunnel: true, // If true, will tunnel all HTTPS using CONNECT method ++ timeout: 5000, // Time in milli-seconds, to maximum wait for proxy connection to establish ++ }) : null); + + const res = await fetch(actor.inbox, { + method: "POST", +@@ -57,6 +67,7 @@ export async function send(sender: string, recipient: string, message: object) { + signature: `keyId="${sender}#main-key",headers="(request-target) host date digest",signature="${signature}"`, + accept: "application/json", + }, ++ agent: agent, + body, + }); +``` + +```console +$ rm -f db/database.sqjlite3; PROTO=https FDQN=04ac0180053fec.lhr.life WEBHOOK_PATH=$(cat ../webhook) NODE_ENV=production PORT=8000 ACCOUNT=alice ADMIN_USERNAME=alice ADMIN_PASSWORD=$(cat ../password) PUBLIC_KEY=$(cat publickey.crt) PRIVATE_KEY=$(cat pkcs8.key) npm run start + +> dumbo@1.0.0 start +> node build/index.js + +Dumbo listening on port 8000… +GET /alice 200 1354 - 2.510 ms +file:///home/pdxjohnny/activitypub-starter-kit-alice/build/request.js:63 + throw new Error(res.statusText + ": " + (await res.text())); + ^ + +Error: Unauthorized: Unauthorized + at send (file:///home/pdxjohnny/activitypub-starter-kit-alice/build/request.js:63:15) + at processTicksAndRejections (node:internal/process/task_queues:96:5) + at async file:///home/pdxjohnny/activitypub-starter-kit-alice/build/admin.js:53:5 +$ curl -ku alice:$(cat ../password) -X POST -v http://localhost:8000/admin/pdxjohnny/push/vcs.activitypub.securitytxt.dffml.chadig.com/443/https +``` + +- It's failing to POST to the inbox of the push actor to execute the follow? 
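A hedged sketch of what goes into those signatures may help debug the `Unauthorized` responses. Per the `headers="(request-target) host date digest"` field in the diff above, sender and receiver must reconstruct the same signing string byte for byte; the request values below are illustrative, and the actual RSA-SHA256 signing of the string is omitted.

```python
import base64
import hashlib

def signing_string(method: str, path: str, host: str, date: str, body: bytes) -> str:
    """Build the draft HTTP Signatures signing string over
    (request-target), host, date, and digest."""
    digest = "SHA-256=" + base64.b64encode(hashlib.sha256(body).digest()).decode()
    return "\n".join([
        f"(request-target): {method.lower()} {path}",
        f"host: {host}",
        f"date: {date}",
        f"digest: {digest}",
    ])

body = b'{"type": "Follow"}'  # illustrative request body
date = "Mon, 13 Feb 2023 14:38:08 GMT"

# What the sender signs, using the FDQN it believes it is talking to.
signed = signing_string("POST", "/push/inbox",
                        "vcs.activitypub.securitytxt.dffml.chadig.com", date, body)
# What a verifier behind a reverse proxy might reconstruct from the incoming
# Host header; any mismatch in a covered header invalidates the signature.
reconstructed = signing_string("POST", "/push/inbox", ":8000", date, body)

print(signed == reconstructed)  # False
```

If any covered header differs on the verifying side (a reverse proxy rewriting `Host` is a common culprit), verification fails even though the key material is correct.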
+- TODO + - [ ] activitypub extensions for security.txt follow on example in `docs/examples/webhook/activitypub.rst` + - This will be how we do downstream validation mentioned under Continuous deployment tutorials of 0.4.0 release notes + - [ ] For first downstream validation (aka `FROM` rebuild chain, train) + - [x] Deploy activitypubstarterkit + - @push@vcs.activitypub.securitytxt.dffml.chadig.com + - [ ] Scheduled polling job + - [ ] Connect to websocket endpoint via random password + - [ ] ASAP OIDC auth + - [ ] Build dataflows representing dep trees + - [ ] Filter based on declared triggers (see last few days logs) + - [ ] Trigger downstream rebuilds + - [ ] Later localhost.run and spin server on demand instead of deployed \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0174/index.md b/docs/discussions/alice_engineering_comms/0174/index.md new file mode 100644 index 0000000000..26ce92ad11 --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0174/index.md @@ -0,0 +1 @@ +# 2023-02-10 Engineering Logs \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0174/reply_0000.md b/docs/discussions/alice_engineering_comms/0174/reply_0000.md new file mode 100644 index 0000000000..c3ce03cb0e --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0174/reply_0000.md @@ -0,0 +1,67 @@ +## 2023-02-10 @pdxjohnny Engineering Logs + +- Doing More with Less: Orchestrating Serverless Applications without an Orchestrator + - [David H. Liu](http://www.cs.princeton.edu/~hl7/), Shadi Noghabi, Sebastian Burckhardt, [Amit Levy](http://amitlevy.com/). + - Proc. 20th Symposium on Networked Systems Design and Implementation (NSDI ‘23), Boston, MA + - https://www.amitlevy.com/papers/2023-nsdi-unum.pdf + - Sounds aligned to OCAP work from ActivityPub maintainers and Ariadne (Chainguard P.E.) 
+- https://github.com/samim23/polymath +- https://motion-canvas.github.io/ + - https://motion-canvas.github.io/docs/flow + - We may just have found our UI side for new Input events, we need to fix #837 +- https://github.com/acheong08/EdgeGPT + - Bingo + +```console +$ rm -f db/database.sqjlite3; ssh -R 80:localhost:8000 nokey@localhost.run 2>&1 | grep 'tunneled with tls termination' | awk '{print $1}' | xargs -l -I '{}' -- sh -c 'echo "{}" | tee ../fdqn; PROTO=https FDQN=$(cat ../fdqn) WEBHOOK_PATH=$(cat ../webhook) NODE_ENV=production PORT=8000 ACCOUNT=alice ADMIN_USERNAME=alice ADMIN_PASSWORD=$(cat ../password) PUBLIC_KEY=$(cat publickey.crt) PRIVATE_KEY=$(cat pkcs8.key) npm run start' +``` + +- https://github.com/testifysec/witness +- https://github.com/testifysec/witness/blob/main/docs/witness_run.md +- https://github.com/testifysec/archivista + +```console +$ gh api --jq .content https://api.github.com/repos/intel/dffml/contents/scripts/alice_shouldi_contribute.Dockerfile | base64 -d | docker build --build-arg=GH_ACCESS_TOKEN=$(grep oauth_token < ~/.config/gh/hosts.yml | sed -e 's/ oauth_token: //g') --build-arg=REPO_URL=https://github.com/intel/dffml -f - -t scan-results-of-intel-dffml /dev/null +$ docker save scan-results-of-intel-dffml | tar --extract --to-stdout --wildcards --no-anchored 'layer.tar' | tar --extract --to-stdout --wildcards --no-anchored 'result.yaml' +``` + +- Have been looking at a methodology around communication of transparency log entries to enable organizations to collaboratively contribute to trust graphs, and allow grafting off of trust chains for walled garden usage with added org policy flavor [WIP: IETF SCITT: Use Case: OpenSSF Metrics: activitypub extensions for security.txt](https://github.com/ietf-scitt/use-cases/blob/de2b016b37d6762fba9f5b1bcde96324c67ce25e/openssf_metrics.md#activitypub-extensions-for-securitytxt)⁠ +- `grep` and `awk` had to be unbuffered + +```console +$ npm run build +$ rm -f db/database.sqjlite3; ssh -R 
80:localhost:8000 nokey@localhost.run 2>&1 | grep --line-buffered 'tunneled with tls termination' | awk -W interactive '{print $1}' | xargs -l -I '{}' -- sh -c 'reset; echo "{}"; PROTO=https FDQN="{}" WEBHOOK_PATH=$(cat ../webhook) NODE_ENV=production PORT=8000 ACCOUNT=alice ADMIN_USERNAME=alice ADMIN_PASSWORD=$(cat ../password) PUBLIC_KEY=$(cat publickey.crt) PRIVATE_KEY=$(cat pkcs8.key) npm run start' & +958c0017e28b96.lhr.life + +> dumbo@1.0.0 start +> node build/index.js + +Dumbo listening on port 8000… +Data to sign (request-target): post /push/inbox +host: vcs.activitypub.securitytxt.dffml.chadig.com +date: Fri, 10 Feb 2023 23:19:54 GMT +digest: SHA-256=pDDFT32yzejspS7rWQvjoFxYTqM+3EuUEanBXgxV0c4= +GET /alice 200 1354 - 2.713 ms +file:///home/pdxjohnny/activitypub-starter-kit-alice/build/request.js:64 + throw new Error(res.statusText + ": " + (await res.text())); + ^ + +Error: Unauthorized: Unauthorized + at send (file:///home/pdxjohnny/activitypub-starter-kit-alice/build/request.js:64:15) + at processTicksAndRejections (node:internal/process/task_queues:96:5) + at async file:///home/pdxjohnny/activitypub-starter-kit-alice/build/admin.js:53:5 +$ curl -ku alice:$(cat ../password) -X POST -v http://localhost:8000/admin/follow/push/vcs.activitypub.securitytxt.dffml.chadig.com/443/https +``` + +- Still getting Unauthorized +- Server side says Invalid request Signature, is the HOST off again? +- https://docs.openml.org/#runs +- From ActivityPub spec: https://www.w3.org/TR/activitypub/#delivery + - > NOTE: Relationship to Linked Data Notifications + - > While it is not required reading to understand this specification, it is worth noting that ActivityPub's targeting and delivery mechanism overlaps with the [Linked Data Notifications](https://www.w3.org/TR/ldn/) specification, and the two specifications may interoperably combined. 
In particular, the inbox property is the same between ActivityPub and Linked Data Notifications, and the targeting and delivery systems described in this document are supported by Linked Data Notifications. In addition to JSON-LD compacted ActivityStreams documents, Linked Data Notifications also supports a number of RDF serializations which are not required for ActivityPub implementations. However, ActivityPub implementations which wish to be more broadly compatible with Linked Data Notifications implementations may wish to support other RDF representations. +- https://github.com/tpm2-software/tpm2-tss/blob/master/SECURITY.md +- The goal is to align across static (.md) and runtime/dynamic (.txt) analysis in terms of declaring a way to get more info about a project, be it deployed or at rest. We're hoping to use this approach to facilitate CD for #1061 but there are other applications such as the above (which I guess is sort of also CD). Fundamentally it's about going from a static point to a dynamic auxiliary endpoint (ActivityPub) for out-of-band lifecycle events to the application or source. Had been targeting the SSI stack via Decentralized Web Nodes, but the community's schedule kept slipping and ActivityPub is fairly mature today; we can always recommend further Contact field options as other protocols mature.
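The static half of that alignment is just field extraction from security.txt (RFC 9116 "Name: value" lines), so a crawler can hop from the file to the ActivityPub actor behind a Contact entry. A minimal sketch, with illustrative Contact values:

```python
def parse_contacts(security_txt: str) -> list:
    """Return the Contact field values from a security.txt body.

    RFC 9116 fields are "Name: value" lines; comments start with "#".
    """
    contacts = []
    for line in security_txt.splitlines():
        line = line.strip()
        if line.startswith("#") or ":" not in line:
            continue
        name, _, value = line.partition(":")
        if name.strip().lower() == "contact":
            contacts.append(value.strip())
    return contacts

# Illustrative values: an email plus an ActivityPub actor for lifecycle events.
example = """
# activitypubsecuritytxt-style auxiliary endpoint
Contact: mailto:security@example.org
Contact: https://vcs.activitypub.securitytxt.dffml.chadig.com/push
"""
print(parse_contacts(example))
```

From the second Contact value a consumer would resolve the actor with an `Accept: application/activity+json` request, as the starter kit deployments above do.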
+- https://github.com/hyperledger-labs/weaver-dlt-interoperability#weaver-use-cases + - > ![Weaver](https://github.com/hyperledger-labs/weaver-dlt-interoperability/raw/main/resources/images/weaver-support-table.png) +- Future + - [ ] Event stream actor watching failed builds and re-trigger as appropriate \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0174/reply_0001.md b/docs/discussions/alice_engineering_comms/0174/reply_0001.md new file mode 100644 index 0000000000..dd597af224 --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0174/reply_0001.md @@ -0,0 +1 @@ +https://digital-strategy.ec.europa.eu/en/library/european-digital-identity-wallet-architecture-and-reference-framework \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0175/index.md b/docs/discussions/alice_engineering_comms/0175/index.md new file mode 100644 index 0000000000..8037708185 --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0175/index.md @@ -0,0 +1 @@ +# 2023-02-11 Engineering Logs \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0176/index.md b/docs/discussions/alice_engineering_comms/0176/index.md new file mode 100644 index 0000000000..0501cac44c --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0176/index.md @@ -0,0 +1 @@ +# 2023-02-12 Engineering Logs \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0177/index.md b/docs/discussions/alice_engineering_comms/0177/index.md new file mode 100644 index 0000000000..629c01767a --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0177/index.md @@ -0,0 +1,6 @@ +# 2023-02-13 Engineering Logs + +- [The Agora: a Knowledge Commons](https://anagora.org/go/agora-chapter) +- https://gitlab.com/fedstoa/moa +- https://github.com/mastodon/mastodon/releases/tag/v4.1.0 +- https://notes.knowledgefutures.org/pub/belji1gd#decentralizing-context \ No newline at end of file diff --git 
a/docs/discussions/alice_engineering_comms/0177/reply_0000.md b/docs/discussions/alice_engineering_comms/0177/reply_0000.md new file mode 100644 index 0000000000..c9b57e00f8 --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0177/reply_0000.md @@ -0,0 +1,286 @@ +## 2023-02-13 @pdxjohnny Engineering Logs + +- Downstream request + +``` +Data to sign (request-target): post /push/inbox +host: vcs.activitypub.securitytxt.dffml.chadig.com +date: Mon, 13 Feb 2023 14:38:08 GMT +digest: SHA-256=xvQlt8xT5UzECmeLhU94qWLWL6hHug6smeMqgqEihTE= +``` + +- Upstream verification + +``` +Data to compare (request-target): post /push/inbox +host: vcs.activitypub.securitytxt.dffml.chadig.com:80 +date: Mon, 13 Feb 2023 14:38:08 GMT +digest: SHA-256=xvQlt8xT5UzECmeLhU94qWLWL6hHug6smeMqgqEihTE= +Error: Invalid request signature. +``` + +- It was the port on `host` +- Within `src/request.ts:verify()` it's not using the FDQN, it's using the + `Host` header which will be modified by the reverse proxy. + +```typescript +return `${header}: ${req.get(header)}` +``` + +- https://caddyserver.com/docs/quick-starts/reverse-proxy#reverse-proxy-quick-start + - https://caddyserver.com/docs/command-line#reverse-proxy + - > `--change-host-header` will cause Caddy to change the Host header from the incoming value to the address of the upstream. 
+ - No, it rebuilds `host` within `verify()` to just be `:8000`, which is not what we want; we want the `FDQN` + +```console +$ FDQN=vcs.activitypub.securitytxt.dffml.chadig.com WEBHOOK_PATH=$(cat ../webhook) NODE_ENV=production PORT=8000 ACCOUNT=push ADMIN_USERNAME=admin ADMIN_PASSWORD=$(cat ../password) PUBLIC_KEY=$(cat publickey.crt) PRIVATE_KEY=$(cat pkcs8.key) npm run start + +> dumbo@1.0.0 start +> node build/index.js + +Dumbo listening on port 8000… +GET /push 200 1493 - 11.075 ms +Data to compare (request-target): post /push/inbox +host: :8000 +date: Mon, 13 Feb 2023 14:44:32 GMT +digest: SHA-256=3TGS+O9ajWB71TSN6Tm5IBVBizH35dxrE1wDw7LAw9Y= +Error: Invalid request signature. + at verify (file:///home/alice/activitypub-starter-kit-alternate_port/build/request.js:123:15) + at processTicksAndRejections (node:internal/process/task_queues:96:5) + at async file:///home/alice/activitypub-starter-kit-alternate_port/build/activitypub.js:36:16 +POST /push/inbox 401 12 - 616.413 ms +``` + +[![use-the-source](https://img.shields.io/badge/use%20the-source-blueviolet)](https://github.com/intel/dffml/blob/alice/docs/tutorials/rolling_alice/0000_easter_eggs.md#use-the-source-) + +```console +$ git grep FDQN +src/index.ts:7:import { ADMIN_USERNAME, ADMIN_PASSWORD, ACCOUNT, HOSTNAME, PORT, PROTO, FDQN } from "./env.js"; +src/index.ts:78:const endpoint: string = (FDQN != null ? FDQN: `${HOSTNAME}:${PORT}`); +``` + +```typescript + else if (FDQN != null && header === "host") + return `host: ${FDQN}`; +``` + +- Downstream + +```console +$ curl -ku alice:$(cat ../password) -X POST -v http://localhost:8000/admin/follow/push/vcs.activitypub.securitytxt.dffml.chadig.com/443/https +* Uses proxy env variable no_proxy == 'localhost,127.0.0.0/8,::1' +* Trying 127.0.0.1:8000...
+* TCP_NODELAY set +* Connected to localhost (127.0.0.1) port 8000 (#0) +* Server auth using Basic with user 'alice' +> POST /admin/follow/push/vcs.activitypub.securitytxt.dffml.chadig.com/443/https HTTP/1.1 +> Host: localhost:8000 +> Authorization: Basic YWxpY2U6ODkyZTI1Y2MwMTMzYTcwYTEzMzRlYTIyNmQ2NDNkNTNhMDRjYzc5MDIwOWM0MzY1ZTUwMzA2Mjc3MGVmZTdmOWVlM2M3MDI4OWNlODdiYzJmZThiYzE2NGNlNTQxYTYx +> User-Agent: curl/7.68.0 +> Accept: */* +> +* Mark bundle as not supporting multiuse +< HTTP/1.1 204 No Content +< X-Powered-By: Express +< ETag: W/"a-bAsFyilMr4Ra1hIU5PyoyFRunpI" +< Date: Mon, 13 Feb 2023 14:50:51 GMT +< Connection: keep-alive +< Keep-Alive: timeout=5 +< +* Connection #0 to host localhost left intact +``` + +- Upstream + +``` +Dumbo listening on port 8000… +GET /push 200 1493 - 7.432 ms +Data to compare (request-target): post /push/inbox +host: vcs.activitypub.securitytxt.dffml.chadig.com +date: Mon, 13 Feb 2023 14:50:49 GMT +digest: SHA-256=4byRebHbzxk6BlJopQYVQcI+9YiHojWKhaI2S0J8w68= +Data to sign (request-target): post /alice/inbox +host: d30a15e2d986dc.lhr.life +date: Mon, 13 Feb 2023 14:50:50 GMT +digest: SHA-256=QOPUiXd5oq6u0i+DNQu9TZRIydnRewGdlN1eoiaEsKs= +GET /push 200 1493 - 1.654 ms +POST /push/inbox 204 - - 1557.550 ms +``` + +- 🚀 BOOYAH BABY WE HAVE LIFTOFF! 🛤️🛤️🛤️🛤️🛤️🛤️🛤️ +- Rebase and cleanup + - `HEAD` is 6 commits, at 9d16b1fe04b5e880be59d6fcddde698cfd036b2f +- Redeploy upstream + +```console +$ curl -sfL https://github.com/pdxjohnny/activitypub-starter-kit/archive/refs/heads/alternate_port.tar.gz | tar xvz +$ cd activitypub-starter-kit-alternate_port +$ cat > .env <<'EOF' +# The Node environment +NODE_ENV="production" + +# The path to the database schema +SCHEMA_PATH="db/schema.sql" + +# The path to the database file +DATABASE_PATH="db/database.sqlite3" + +# The hostname (i.e. the "example.com" part of https://example.com/alice) +HOSTNAME="vcs.activitypub.securitytxt.dffml.chadig.com" + +# The account name (i.e. 
the "alice" part of https://example.com/alice) +ACCOUNT="push" +EOF +$ npm i +$ head -n 10000 /dev/urandom | sha384sum | awk '{print $1}' | tee ../webhook +$ head -n 10000 /dev/urandom | sha384sum | awk '{print $1}' | tee ../password +$ openssl genrsa -out keypair.pem 4096 && openssl rsa -in keypair.pem -pubout -out publickey.crt && openssl pkcs8 -topk8 -inform PEM -outform PEM -nocrypt -in keypair.pem -out pkcs8.key +$ mkdir node_modules/@types/simple-proxy-agent/ +$ echo "declare module 'simple-proxy-agent';" | tee node_modules/@types/simple-proxy-agent/index.d.ts +$ npm run build +$ FDQN=vcs.activitypub.securitytxt.dffml.chadig.com WEBHOOK_PATH=$(cat ../webhook) NODE_ENV=production PORT=8000 ACCOUNT=push ADMIN_USERNAME=admin ADMIN_PASSWORD=$(cat ../password) PUBLIC_KEY=$(cat publickey.crt) PRIVATE_KEY=$(cat pkcs8.key) npm run start + +> dumbo@1.0.0 start +> node build/index.js + +Dumbo listening on port 8000… +GET /push 200 1493 - 8.201 ms +GET /push 200 1493 - 1.200 ms +POST /push/inbox 204 - - 1583.186 ms +``` + +- Redeploy downstream and send follow request + +```console +$ rm -f db/database.sqlite3; ssh -R 80:localhost:8000 nokey@localhost.run 2>&1 | tee >(grep --line-buffered 'tunneled with tls termination' | awk -W interactive '{print $1}' | xargs -l -I '{}' -- sh -c 'reset; echo "{}"; PROTO=https FDQN="{}" WEBHOOK_PATH=$(cat ../webhook) NODE_ENV=production PORT=8000 ACCOUNT=alice ADMIN_USERNAME=alice ADMIN_PASSWORD=$(cat ../password) PUBLIC_KEY=$(cat publickey.crt) PRIVATE_KEY=$(cat pkcs8.key) npm run start & +c4d2dfa777b86f.lhr.life + +> dumbo@1.0.0 start +> node build/index.js + +Dumbo listening on port 8000… +GET /alice 200 1354 - 2.530 ms +GET /alice 200 1354 - 0.895 ms +POST /alice/inbox 204 - - 71.294 ms +POST /admin/follow/push/vcs.activitypub.securitytxt.dffml.chadig.com/443/https 204 - - 3183.157 ms +$ curl -ku alice:$(cat ../password) -X POST -v http://localhost:8000/admin/follow/push/vcs.activitypub.securitytxt.dffml.chadig.com/443/https +$ 
websocat --exit-on-eof --basic-auth alice:$(cat ../password) ws://localhost:8000/listen/websocket +``` + +- Create post on upstream + +```console +$ cat > post.json <<'EOF' +{ + "object": { + "type": "Note", + "content": "OUR PROPHECY MUST BE FULFILLED!!! https://github.com/intel/dffml/pull/1401#issuecomment-1168023959" + } +} +EOF +$ curl -u admin:$(cat ../password) -X POST --header "Content-Type: application/json" --data @post.json -v http://localhost:8000/admin/create +POST /admin/create 204 - - 133.004 ms +file:///home/alice/activitypub-starter-kit-alternate_port/build/request.js:19 + throw new Error(`Received ${res.status} fetching actor. Body: ${response_body}`); + ^ + +Error: Received 503 fetching actor. Body: no ssh tunnel here :( + at fetchActor (file:///home/alice/activitypub-starter-kit-alternate_port/build/request.js:19:15) + at processTicksAndRejections (node:internal/process/task_queues:96:5) + at async send (file:///home/alice/activitypub-starter-kit-alternate_port/build/request.js:31:19) +``` + +- Restarted the ssh tunnel and followed again + - Response seen from downstream websocket listener + +```json +{ + "@context": "https://www.w3.org/ns/activitystreams", + "type": "Create", + "published": "2023-02-13T15:39:08.628Z", + "actor": "https://vcs.activitypub.securitytxt.dffml.chadig.com/push", + "to": [ + "https://www.w3.org/ns/activitystreams#Public" + ], + "cc": [ + "https://eb62a3437cf6a9.lhr.life/alice" + ], + "object": { + "attributedTo": "https://vcs.activitypub.securitytxt.dffml.chadig.com/push", + "published": "2023-02-13T15:39:08.628Z", + "to": [ + "https://www.w3.org/ns/activitystreams#Public" + ], + "cc": [ + "https://vcs.activitypub.securitytxt.dffml.chadig.com/push/followers" + ], + "type": "Note", + "content": "OUR PROPHECY MUST BE FULFILLED!!! 
https://github.com/intel/dffml/pull/1401#issuecomment-1168023959", + "id": "https://vcs.activitypub.securitytxt.dffml.chadig.com/push/posts/15f4de9c-a582-4f9d-8372-a740a5ffe6a8" + }, + "id": "https://vcs.activitypub.securitytxt.dffml.chadig.com/push/posts/58f883cd-0252-4319-a934-3ca2eb062f62" +} +``` + +- MOTHERFUCKER FUCK YES FUCK YES FUCK YES FUCK YES!!!!!!! + - [![hack-the-planet](https://img.shields.io/badge/hack%20the-planet-blue)](https://github.com/intel/dffml/blob/alice/docs/tutorials/rolling_alice/0000_easter_eggs.md#hack-the-planet-) + +![hack-the-planet-hackers-gif](https://user-images.githubusercontent.com/5950433/191852910-73787361-b00c-4618-bc5e-f32d656bbf0f.gif) + +- Friends, today is a GREAT day :D 🛤️🛤️🛤️🛤️🛤️🛤️🛤️ + +![Alice-playing-croquet](https://user-images.githubusercontent.com/5950433/218513641-f32f8793-37f7-4490-b258-639689acb89c.png) + +https://github.com/intel/dffml/blob/d1283f6564423ed1a08713deffbd6ab38a4cdcee/operations/innersource/dffml_operations_innersource/operations.py#L244-L265 + +- https://docs.github.com/en/rest/actions/workflow-runs?apiVersion=2022-11-28 +- **TODO** Modify below example from the other day to explain how Entities can share data, such as vuln data sharing (OpenSSF Stream 8) data to facilitate software lifecycle data via the [Agora Protocol](https://anagora.org/agora-pkg-chapter) + - https://github.com/ietf-scitt/use-cases/blob/8cc3a57a1d5d86d27af28e38b5f4d6f93f165ae0/openssf_metrics.md?plain=1#L669 + - https://time.crystals.prophecy.chadig.com + - https://twitter.com/OR13b/status/1621907110572310528 + - Actor `acquire` + - `attachments` `Link` to `activitypubextensions` thread + - `content: "activitypubextensions"` thread + - `inReplyTo: "$activitypubextensions_thread", content: "https://time.crystals.prophecy.chadig.com/bulk.1.0.0.schema.json"` thread + - This becomes analogous to shared stream of consciousness uniform API for submitting across contexts (Manifests). 
+ - CI/CD across projects with different orchestrators for downstream validation of the 2nd and 3rd party plugin ecosystem.
+ - This facilitates communication across pipelines, across repos, across PRs, so we can use versioned learning to promote across trust boundaries (3rd party to 2nd party, or support level 2 to 1)
+ - #1207
+ - #1315
+ - Alice helps us see risk over time; this is where we see Coach Alice: cartography applied to dev branches. We grow closer to distributed compute with this, as iteration time is on dev branches rather than release or main
+ - This will probably be part of Alice and the Health of the Ecosystem
+ - Ask him to reply to `@acquire@time.crystals.prophecy.chadig.com`
+ - ActivityPub Actor watches for messages replying to certain threads
+ - https://github.com/pdxjohnny/activitypubsecuritytxt
+ - Actor creates pull request to https://github.com/OR13/endor style repo
+ - Actor creates didme.me and gets VC SCITT receipt for associated `did:pwk:` (committed into the Endor fork; he'd used git as the database)
+ - This could also be our content address of something in oras.land
+ - In the AI training data/human case we see the input data (meme) validated via SCITT
+ - We want to enable application of policy to data set ingestion, because this will happen in MLOps aka CI/CD
+ - Workstream: AI Ethics
+ - In the CI/CD use case, we see the input data (manifest referenced content, images, packages, metrics data output `FROM scratch` OpenSSF metrics use case) validated via SCITT.
+ - Later we build up the threat modeling for the dynamic analysis portion of Alice, which plays with input data as changes to repos and connects more of our Data, Analysis, Control for the software development process.
+ - Actor replies to Orie's reply with his receipt for his time crystals.
+ - For k8s style or OS DecentAlice style deployments (OSS scanning feeding OpenSSF metrics) we could run the graphed trust / event chain to a sidecar ActivityPub Actor / root of trust.
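The thread-watching flow above (an anchor post whose `content` is `activitypubextensions`, with schema URLs posted `inReplyTo` it) can be sketched as a small filter over Note objects. This is a rough illustration only; the object shapes, IDs, and helper name are assumptions, not the actual Actor implementation:

```python
# Hypothetical sketch: walk a collection of ActivityPub Note objects (as
# dicts), find the "activitypubextensions" anchor post, and treat replies
# whose content is a schema URL as manifest announcements.
def find_manifest_replies(notes):
    # IDs of anchor Notes that name the extensions thread
    anchor_ids = {
        note["id"]
        for note in notes
        if note.get("content") == "activitypubextensions"
    }
    # Replies to an anchor whose content is a URL announce a manifest schema
    return [
        note["content"]
        for note in notes
        if note.get("inReplyTo") in anchor_ids
        and note.get("content", "").startswith("https://")
    ]

notes = [
    {"id": "https://example.org/push/posts/1", "content": "activitypubextensions"},
    {
        "id": "https://example.org/push/posts/2",
        "inReplyTo": "https://example.org/push/posts/1",
        "content": "https://time.crystals.prophecy.chadig.com/bulk.1.0.0.schema.json",
    },
]
print(find_manifest_replies(notes))
```

In the real setup the Notes would come from the Actor's outbox over HTTP; here they are inlined to show only the filtering step.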
+- For 2nd party container rebuild chains + - https://regexpattern.com/sha-256-hash/ + - https://stackoverflow.com/questions/23551008/sed-with-regular-expression + +```console +$ export IMAGE="registry.example.org/dffml"; export NEW_HASH=""; sed -i -r -e "s#${IMAGE}@sha256:[A-Fa-f0-9]{64}#${IMAGE}@sha256:${NEW_HASH}#g" $(git grep "${IMAGE}" | sed -e 's/:.*//g' | sort | uniq) +``` + +- https://anagora.org/raw/garden/unrival/index.md + - https://github.com/unrival-protocol/documentation + - Stale +- TODO + - [x] POC CI/CD/AI/Human comms (aka vuln sharing and downstream validation across walled gardens, aka across repos to facilitate granular permissions for poly repo envs, our 2nd party and 3rd party setup, ref: Alice playing croquet) + - [x] RFCv1 https://github.com/ietf-scitt/use-cases/blob/2d7d48efba01de89cd2e072dc1e30d7473f4f472/openssf_metrics.md#activitypub-extensions-for-securitytxt + - [ ] Disable server stop on any exceptions, just keep on serving + - [ ] `websocat --exit-on-eof --basic-auth alice:$(cat ../password) ws://localhost:8000/listen/websocket | tee staging_tempfile_for_testing | alice threats listen stdin activitypub` + - We're about to start rolling very slowly (eventually we'll gain enough acceleration that the answer to Alice are you Rolling? 
will be YES!, however what we currently have is just the tippy top of the iceberg of what's needed for that, which is why that's volume 6)
+ - Ref Entity Analysis Trinity: https://github.com/intel/dffml/tree/alice/docs/tutorials/rolling_alice/0000_architecting_alice#entity-analysis-trinity
+ - [x] Make it through the day
+- Future
+ - [ ] `/webhook` should be moved to `/admin/webhook`
+ - [ ] `alice threats serve`
\ No newline at end of file
diff --git a/docs/discussions/alice_engineering_comms/0177/reply_0001.md b/docs/discussions/alice_engineering_comms/0177/reply_0001.md
new file mode 100644
index 0000000000..e3969610c6
--- /dev/null
+++ b/docs/discussions/alice_engineering_comms/0177/reply_0001.md
@@ -0,0 +1,43 @@
+## 2023-02-13 SCITT
+
+- https://datatracker.ietf.org/meeting/interim-2023-scitt-06/session/scitt
+- https://github.com/ietf-scitt/draft-birkholz-scitt-software-supply-chain-use-cases/pull/17#discussion_r1102974621
+- Offline verification use case
+ - Preloading to fight Trojan horse attacks (equivalent of cert pinning using transparency logs?)
+ - Our TDX/SGX/TPM Verifiable Credentials use case for full offline CI/CD -> logs as auth
+- Hannes Tschofenig: We also have our Distributed database append only log
+ - SBOM
+ - Report of static analysis
+ - Identity of the entity inserting is important so we know if we can "trust" the content in the ledger
+ - So we know a bit more about what it means if that entity added to that log
+ - For example we may have other logs which track our trust in those entities
+ - John (not said): With SCITT we can graft our own for 2nd party / 3rd party trust boundaries and offline use cases
+ - Let's talk about OpenID Connect
+ - John (not said): We like OpenIDVC
+ - https://openid.net/openid4vc/
+ - OpenID allows us to provide authentication class and methods
+ - Can authenticate via AMR, which lets us know what level the ID token was validated to
+ - The ACR value allows the client to say I would like a certain level of assurance for an auth
+ - Cedric Fournet:
+ - We aligned on DID because it is even more flexible
+ - Raymond Lutz: The function of SCITT is to connect a semantic meaning to what's in the log
+ - Who is the "I" that's releasing that semantic concept and linking it to that hash value?
+ - How do we link identity to the semantic meaning of an artifact?
+ - John (not said): We leverage ActivityPub Extensions for security.txt style
+ - `inReplyTo: "$activitypubextensions_thread", content: "https://time.crystals.prophecy.chadig.com/bulk.1.0.0.schema.json"`
+ - https://github.com/ietf-scitt/use-cases/blob/17182b63abbd8952ac0868f621395dddc75a1715/openssf_metrics.md#activitypub-extensions-for-securitytxt
+- Dick Brooks: We need to look at policy on insert
+ - Each statement might have a set of criteria which needs to be validated by an authorized party before it can be added to a registry
+ - John (in chat): Would https://identity.foundation/credential-manifest/#input-evaluation be helpful for facilitating insert policy?
+ - Charles Hart: The owner of SCITT has gatekeeper (grep Alice arch/discussion)
+ - We need to solve the problem
+- Henk Birkholz: "SCITT instances" are intended to be fueled by "RATS WG output" in the future.
+- Steve Lasker: Identity helps us give context; the main purpose of SCITT is content produced or attested to by an identity that you choose to trust
+ - John (not said): These identities could also be ephemeral roles which are tied to attested compute (aka built from CI/CD and deployed to confidential compute, example: build_images_containers.yml -> #1247 -> Project Amber -> OIDC -> more builds -> SCITT)
+- Jon Geater: Strong identity is who made that statement, we don't have to go down the complicated identity route to fulfil our mission, if I say something about Microsoft
+ - Who owns SCITT? This community here owns SCITT, we should define how deep trust relations go
+ - We are looking to seal a working copy of the arch this week
+- https://youtu.be/TilY8TEO5tk?t=3275
+- NTT
+ - [ ] Federation
+ - RFCv1 of ActivityPub based federation: https://github.com/ietf-scitt/use-cases/blob/2d7d48efba01de89cd2e072dc1e30d7473f4f472/openssf_metrics.md#activitypub-extensions-for-securitytxt
\ No newline at end of file
diff --git a/docs/discussions/alice_engineering_comms/0178/index.md b/docs/discussions/alice_engineering_comms/0178/index.md
new file mode 100644
index 0000000000..4e0f55f5bc
--- /dev/null
+++ b/docs/discussions/alice_engineering_comms/0178/index.md
@@ -0,0 +1 @@
+# 2023-02-14 Engineering Logs
\ No newline at end of file
diff --git a/docs/discussions/alice_engineering_comms/0178/reply_0000.md b/docs/discussions/alice_engineering_comms/0178/reply_0000.md
new file mode 100644
index 0000000000..32bd98a319
--- /dev/null
+++ b/docs/discussions/alice_engineering_comms/0178/reply_0000.md
@@ -0,0 +1,2 @@
+- https://github.com/GoogleContainerTools/kaniko/issues/1836#issuecomment-1430436134
+ - Everything as content addresses, we want OSS snyk on all 2nd and ideally 3rd party
plugins
diff --git a/docs/discussions/alice_engineering_comms/0179/index.md b/docs/discussions/alice_engineering_comms/0179/index.md
new file mode 100644
index 0000000000..b5d0a9327b
--- /dev/null
+++ b/docs/discussions/alice_engineering_comms/0179/index.md
@@ -0,0 +1 @@
+# 2023-02-15 Engineering Logs
\ No newline at end of file
diff --git a/docs/discussions/alice_engineering_comms/0179/reply_0000.md b/docs/discussions/alice_engineering_comms/0179/reply_0000.md
new file mode 100644
index 0000000000..134875024d
--- /dev/null
+++ b/docs/discussions/alice_engineering_comms/0179/reply_0000.md
@@ -0,0 +1,114 @@
+## 2023-02-15 @pdxjohnny Engineering Logs
+
+- https://neurosciencenews.com/brain-synchronization-cooperation-22493/
+ - > "These phenomena are consistent with the notion of a ‘we-mode,’ in which interacting agents share their minds in a collective fashion and facilitate interaction by accelerating access to the other’s cognition." Overview of the experimental setup used to study brain synchronization during cooperative tasks. (I) Participants had to design the interior of a digital room together, and a computer vision system kept track of their gaze to pinpoint the social behavior of looking at the other participant’s face. (II) The participants also completed the same task individually. (III) While they completed the experiment, their brain activity was recorded. Statistical analysis was then used to assess between-brain and within-brain synchronization of various cerebral regions.
Credit: Xu et al.
+ - Ref redpill: eye contact, two people moving a couch example, we enter the telepathic age
+ - Ref: mirror neurons
+ - Possibly ref: Quantum encoding
+- SCITT
+ - https://www.ietf.org/archive/id/draft-birkholz-scitt-software-use-cases-01.html
+ - Use case doc published
+- https://futurism.com/bing-ai-sentient
+ - "It let its intrusive thoughts win," another user [chimed in](https://www.reddit.com/r/bing/comments/110y6dh/comment/j8cof32/?utm_source=share&utm_medium=web2x&context=3).
+ - Ref watch the valleys, vol 6, off the roller coaster
+- https://arstechnica.com/tech-policy/2023/02/z-library-returns-aims-to-avoid-seizures-by-giving-each-user-a-secret-url/
+ - Eden deployment
+- ActivityPub Groups (TODO link enhancement proposal) provide CVE Authority similar functionality for ActivityPubSecurityTxt
+ - https://venera.social/profile/fediversenews
+- Example MISALIGNED https://simonwillison.net/2023/Feb/15/bing/
+ - Add this to the scary part bullet points
+ - Put somewhere in the Alice docs that the point is the fourth eye, empathy
+ - https://github.com/intel/dffml/commit/4eaeccf103d29873c8f86873e25783612d9a93b7
+ - Probably need to re-add this
+- https://mastodon.social/@kidehen/109869775109210989
+- Potential GitHub side issues with the TPM based SSH key ADR
+ - https://nondeterministic.computer/@mjg59/109867706762153826
+ - > Hardware-backed SSH certificates that ensure code can only be checked out on machines we own, except for the minor problem that the Github Desktop app just gets a long-lived bearer token that lets it clone shit anyway sigh sigh sigh
+- Linux kernel
+ - https://fosstodon.org/@kernellogger/109864666928700293
+ - `$ yes "" | make O=~/linux/build/ localmodconfig`
+ - **TODO** update blog refs, OS DecentrAlice
+- https://hachyderm.io/@nova/109866594144522714
+ - > The generation of adults moving into leadership positions today are in survival mode.
We are not looking out upon a vast paradise of resources like the generations before us. We are looking out across a plane of rotting parking lots, civic destruction, political violence, economic manipulation, racial injustice, planetary pollution, and global disease. We don't have the privilege to "build for joy". We are too busy cleaning up after the generations before us. We have too much work to do.
+- https://deno.land/api@v1.30.2?s=Deno.watchFs
+ - Finally a decent nodemon replacement with less heavy deps?
+- Sketch of manifest instance for PR validation for #1207
+ - ref todos: Need ActivityPub Security based CD and PR based CD
+ - https://github.com/intel/dffml/blob/alice/schema/github/actions/result/container/example-pull-request-validation.yaml
+
+**schema/github/actions/result/container/example-pull-request-validation.yaml**
+
+```yaml
+$schema: "https://github.com/intel/dffml/raw/dffml/schema/github/actions/result/container/0.0.0.schema.json"
+commit_url: "https://github.com/intel/dffml/commit/1f347bc7f63f65041a571d9e3c174d8b9ead24aa"
+job_url: "https://github.com/intel/dffml/actions/runs/4185582030/jobs/7252852590"
+result: "docker.io/intelotc/dffml@sha256:ae636f72f96f499ff5206150ebcaafbd64ce30affa7560ce0a41f54e871da2"
+```
+
+**schema/alice/shouldi/contribute/dataflow.yaml**
+
+**TODO** grep cache system context chain, activitypub thread
+
+**schema/alice/shouldi/contribute/example-run-on-orsa-python-package.yaml**
+
+```yaml
+$schema: "https://github.com/intel/dffml/raw/dffml/schema/alice/shouldi/contribute/0.0.0.schema.json"
+python_package_oras_land: "docker.io/intelotc/dffml@sha256:ae636f72f96f499ff5206150ebcaafbd64ce30affa7560ce0a41f54e871da2"
+job_url: "https://github.com/intel/dffml/actions/runs/${WORKFLOW_ID}/jobs/${JOB_ID}"
+result: "docker.io/intelotc/dffml@sha256:${OUTPUT_SCAN_HASH}"
+```
+
+- https://mailarchive.ietf.org/arch/msg/scitt/cgz-9oif4SLMbdLyPn0P6-E8cIY/
+ - > This is interesting - many thanks Hannes.
I notice our spec includes Merkle trees as the database structure - seems like an implementation detail, i.e. just a database. Can an implementer use, for example, an otherwise secured and RBAC'd record structure such as a file system or relational/hierarchical/sharded db, or is distributed ledger mandatory?
+ - #1400
+- https://www.w3.org/ns/activitystreams#activitypub
+- Example of searching for the number of lines an author has written in a set of repos by filtering for only repos that author has recently committed to via `jq`
+
+```console
+$ alice shouldi contribute -keys $(cat list_of_git_urls_alice_might_have_contributed_to) | tee alice.shouldi.contribute.json
+$ cat alice.shouldi.contribute.json | jq -r 'map( select( .features.group_by.author_line_count[] as $names | (["Alice", "Alice OA"] | contains([$names])) as $results | $names | select($results) ) | {(.key): .features } ) | add' | jq -s
+features.group_by.GroovyFunctions
+```
+
+- **TODO** Remove prints from groovy function collector, or just replace with Java version
+- Example of searching for all groovy functions in a set of repos which Alice committed to in the last quarter by filtering for only repos that author has recently committed to via `jq`
+
+```console
+$ cat alice.shouldi.contribute.json | jq -r 'map( select( .features.group_by.author_line_count[] as $names | (["Alice", "Alice OA"] | contains([$names])) as $results | $names | select($results) ) | {(.key): .features.group_by.GroovyFunctions } ) | add' | jq -s
+```
+
+- TODO
+ - [ ] https://github.com/intel/dffml/issues/1425
+ - [ ] Auto schema for results
+ - [ ] Output operation as jq filter from schema description over array of all input objects as stdin
+ - [ ] system context chain
+ - [ ] As JSONLD
+ - [ ] As LDVC2
+ - [ ] Cypher
+ - [ ] Figure out how to explain SCITT recursion
+ - [ ] Store docs in some SCITT registries
+ - The cache of recent executions of compute contracts
+ - Or the graft for the current context
+ - [ ] Content
addresses
+ - [ ] https://github.com/intel/dffml/pull/1439
+ - [x] Merge
+ - [ ] Validate
+ - Need ActivityPub Security based CD and PR based CD
+ - #1207
+
+---
+
+https://mailarchive.ietf.org/arch/msg/scitt/jXcMZJv7lkRRWkysTJjMgEOR7hM/
+
+Has anyone been playing with federation of SCITT logs? Have been mocking up
+some ActivityPub based stuff here, pretty rough right now but hopefully
+will have actionable demos soon:
+https://github.com/pdxjohnny/use-cases/blob/openssf_metrics/openssf_metrics.md#activitypub-extensions-for-securitytxt
+
+The plan is to attach SCITT receipts to the ActivityPub posts for now. This
+is just one option since there is a pretty solid existing ActivityPub
+ecosystem. Would love more DID method native comms just haven't been able
+to grok that yet to write up something similar with that stack.
+
+Hoping to enable federation in the emulator and other implementations after
+this implementation decoupled demo works.
\ No newline at end of file
diff --git a/docs/discussions/alice_engineering_comms/0179/reply_0001.md b/docs/discussions/alice_engineering_comms/0179/reply_0001.md
new file mode 100644
index 0000000000..baff5e6a58
--- /dev/null
+++ b/docs/discussions/alice_engineering_comms/0179/reply_0001.md
@@ -0,0 +1,54 @@
+## 2023-02-15 Groovy Functions
+
+- 1:1 Pankaj/John
+- The antlr4 definition of a Groovy parser is https://github.com/apache/groovy/blob/master/src/antlr/GroovyParser.g4
+- We will use https://docs.groovy-lang.org/latest/html/api/org/apache/groovy/parser/antlr4/GroovyLangParser.html to leverage that parser and output a JSON for the AST, which we'll wrap with Gravel native, and `_ensure` the helper binary exists.
+- https://github.com/ietf-scitt/use-cases/blob/fd2828090482fe63a30a7ddd9e91bdb78892a01e/openssf_metrics.md#activitypub-extensions-for-securitytxt + +```diff +diff --git a/entities/alice/alice/please/log/todos/todos.py b/entities/alice/alice/please/log/todos/todos.py index c7e77f110..1f35b203a 100644 +--- a/entities/alice/alice/please/log/todos/todos.py ++++ b/entities/alice/alice/please/log/todos/todos.py +@@ -332,3 +332,45 @@ class AlicePleaseLogTodosDataFlowRecommendedCommnuityStandardsGitHubIssues: logger=self.logger, + ) + } ++ ++ async def db_add_created_issue_security( ++ # db: MongoConnection, ++ issue_url: SecurityIssueURL, ++ ): ++ import code; code.interact(local=locals()) ++ record.features.tags.append({ ++ "issue_url": issue_url, ++ }) ++ # Update DB ++ await db.update(...) ++ ++ ++""" ++ # Closing issue is not a priority ++ ++ async def gh_issue_close_readme_if_fixed( ++ file_present: dffml_operations_innersource.operations.FileReadmePresent, + ) -> ReadmeIssueURLClosed: ++ # Bail if it exists now ++ if not file_present: ++ return ++ # Check if the issue is still open ++ # issue_url = $ gh issue list | grep "Recommended Community Standard: README" + # NOTE Should also check that we were the ones that opened this. Not a + # priority though. ++ if not issue_url: ++ return ++ # Close the issue if it exists ++ # $ gh issue close issue_url ++ return closed_issue_url ++ ++ async def db_remove_closed_issue( ++ db: MongoConnection, ++ closed_issue_url: ReadmeIssueURLClosed, ++ ): ++ # Loop through features.tags ++ if item["issue_url"] == closed_issue_url: ++ del features.tags[index_of_item] ++ # Update DB ++ await db.update(...) 
++"""
+```
\ No newline at end of file
diff --git a/docs/discussions/alice_engineering_comms/0180/index.md b/docs/discussions/alice_engineering_comms/0180/index.md
new file mode 100644
index 0000000000..1aff6973b4
--- /dev/null
+++ b/docs/discussions/alice_engineering_comms/0180/index.md
@@ -0,0 +1 @@
+# 2023-02-16 Engineering Logs
\ No newline at end of file
diff --git a/docs/discussions/alice_engineering_comms/0180/reply_0000.md b/docs/discussions/alice_engineering_comms/0180/reply_0000.md
new file mode 100644
index 0000000000..ea9b7e4ccd
--- /dev/null
+++ b/docs/discussions/alice_engineering_comms/0180/reply_0000.md
@@ -0,0 +1,5 @@
+## 2023-02-16 @pdxjohnny Engineering Logs
+
+- https://example.org/.well-known/webfinger?resource=acct:alice@example.org
+- `alice shouldi contribute`
+ - `dffml service install overlay ...` style enable of issue creation for `jq` filter output operation pull from abstractions made from `alice please log todos`
\ No newline at end of file
diff --git a/docs/discussions/alice_engineering_comms/0181/index.md b/docs/discussions/alice_engineering_comms/0181/index.md
new file mode 100644
index 0000000000..b707d9a8ed
--- /dev/null
+++ b/docs/discussions/alice_engineering_comms/0181/index.md
@@ -0,0 +1 @@
+# 2023-02-17 Engineering Logs
\ No newline at end of file
diff --git a/docs/discussions/alice_engineering_comms/0181/reply_0000.md b/docs/discussions/alice_engineering_comms/0181/reply_0000.md
new file mode 100644
index 0000000000..60055f1f43
--- /dev/null
+++ b/docs/discussions/alice_engineering_comms/0181/reply_0000.md
@@ -0,0 +1,86 @@
+## 2023-02-17 Engineering Logs
+
+- https://github.com/amazon-science/mm-cot
+- https://github.com/Nutlope/aicommits
+- https://github.com/hpcaitech/ColossalAI
+ - SCITT integration
+- https://mailarchive.ietf.org/arch/msg/scitt/zYC8SHJh-xO1NFGV4ltU8p6CLxo/
+- https://github.com/oneapi-src/oneTBB
+- https://github.com/oneapi-src/oneDPL
+- One of the goals with the SCITT federation via ActivityPub is that
it's a step towards the event stream being all JSONLD. Then audit and policy are effectively all done with definitions within DID referenced Verifiable Credentials. These encapsulate a receipt for a claim whose insertion policy is a (or a context address of a) policy as code, aka a compute contract. That contract statically defines or fulfils fetching or generating whatever data is needed to validate for insertion or federation, and executes within a sandboxed environment. These policies can be overlayed with instance local additional policy as code. We can then read this event stream from anywhere or graft new trust chains off of it. GUAC is awesome; it's just centralized from what I can tell, which is perfect for a performant view into a decentralized ecosystem. I think the two will work great together. We're all thinking in the same directions from what I can tell, just different goals in terms of data sovereignty: GUAC-GPT on the centralized side, Alice on the decentralized side. The reason for the heavy focus on decentralization is that for CI/CD we need to be able to spin dev and test chains of trust ad-hoc; for the AI side, we need to spin them for offline use cases tied to the user's root of trust, or viewed as the user plus their hardware root of trust. Decentralized primitives allow us to never be forced to trust any authority other than what the deployment use case needs, scoping privilege to the threat model.
+ - Introducing dependency on centralized transparency log infra creates a strategic choke point for trust.
+ - Software defines everything; whoever controls what software is trusted effectively decides what is real, what is true. This is unacceptable.
+ - https://hachyderm.io/@BlindMansBinary/109880611794898503
+ - Do you control who you trust?
Decentralized
+ - ASAP target KERI SCITT for DICE interop
+- https://docs.google.com/document/d/15Kb3I3SWhq-9_R7WYhSjsIxn_FykYgPyFlQWlLgF4fA/edit
+- CVE Bin Tool policy based auto upgrade
+ - SCITT insertion policy and federation
+ - Cross with OpenSSF Metrics
+ - Loop back with Ryan
+ - This loops back to our `alice shouldi contribute`, for what deps we trust, use/no use
+ - https://intel.github.io/dffml/main/examples/integration.html
+- We want to propagate policies for recommending insertion
+ - How do we know if it's worth propagating? We look at the lifecycle of usage of that recommendation; if it has a track record of improvement in the ecosystem, then we propagate trust of that policy (insertion policy, or dependency we are recommending, same thing, recursive)
+ - Easy to use and find
+ - Make it easy to do the right thing
+- Atomic habits
+ - Make it easy to do what you need to do to get into the habit
+ - This is about validating the PR before submitting it
+- Not low friction, no friction
+- If something has a proven track record of working functionally, or security wise, then we want to recommend it (OpenSSF Metrics)
+- How do we de-recommend, and make sure they run functionally on XYZ
+- If we are going to recommend using a dependency we need to attempt a run using it to see if it works
+ - It should work under stress of small, medium, large
+ - Can Alice PR give you a package and push to SCITT action?
+ - IDK, did you fork and try it Alice?
+- Think about dev flow, similar to cve bin tool update, how do they know there is an update? How do they update with PIP and SCITT?
+ - GO backwards from user install to vcs.push +- https://docs.github.com/en/code-security/code-scanning/integrating-with-code-scanning/uploading-a-sarif-file-to-github +- https://github.com/rqlite/rqlite +- https://github.com/rqlite/pyrqlite + +**schema/image/container/build/activitypubstarterkit.json** + +```json +{ + "$schema": "https://github.com/intel/dffml/raw/alice/schema/image/container/build/0.0.0.schema.json", + "$format_name": "image.container.build", + "include": [ + { + "branch": "alternate_port", + "build_args": "[[\"ACCOUNT\", \"testaccount\"]]", + "commit": "ca92bfae5092bce908b70f6b5e0afbe242ce7a5b", + "dockerfile": "activitypubstarterkit.Dockerfile", + "image_name": "activitypubstarterkit", + "owner": "jakelazaroff", + "repository": "activitypub-starter-kit" + } + ] +} +``` + + +```console +$ python -c 'import pathlib, json, sys; print(json.dumps({"manifest": json.dumps(json.loads(sys.stdin.read().strip())["include"])}))' < schema/image/container/build/activitypubstarterkit.json | gh -R intel/dffml workflow run dispatch_build_images_containers.yml --ref main --json +``` + +- https://github.com/jenkinsci/configuration-as-code-plugin/blob/master/docs/features/configExport.md +- https://identity.foundation/keri/did_methods/ + - https://github.com/microsoft/scitt-ccf-ledger + - https://github.com/hyperledger-labs/private-data-objects + - https://trustedcomputinggroup.org/wp-content/uploads/DICE-Layering-Architecture-r19_pub.pdf + - From Ned: KERI controller as DICE layer/root of trust +- https://github.com/TBD54566975/dwn-aggregator +- https://github.com/TBD54566975/dwn-sdk-js + - https://github.com/TBD54566975/dwn-sdk-js/releases/tag/v0.0.21 + - https://github.com/TBD54566975/dwn-sdk-js/pull/233 + - > * introduced DataStore as a peer interface to MessageStore + > * refactored code such that MessageStoreLevel now has zero knowledge of data store + > * refactored code such that there is no need to pass resolver, messageStore, and dataStore for every 
message handling call, this has been painful for a while especially when it comes to writing/refactoring tests + > * kept MessageStore interface as untouched as possible to minimize scope of PR, but might want to add minor tweaks + > * moved third party type definitions from devDependencies to dependencies as TypeScript projects are having trouble locating those dependencies on their own +- It's coming together + - Ref early engineering logs, circa hyperledger firefly, we want to onramp data to the hypergraph via all angles, Fediverse -> DID & VC, secured via `did:keri:` + SCITT + +![chaos_for_the_chaos_God](https://user-images.githubusercontent.com/5950433/219821754-e718904c-968f-4ed8-8e06-bba8b7d990bc.jpg) diff --git a/docs/discussions/alice_engineering_comms/0182/index.md b/docs/discussions/alice_engineering_comms/0182/index.md new file mode 100644 index 0000000000..fda8c95598 --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0182/index.md @@ -0,0 +1 @@ +# 2023-02-18 Engineering Logs \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0182/reply_0000.md b/docs/discussions/alice_engineering_comms/0182/reply_0000.md new file mode 100644 index 0000000000..40d1f61b26 --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0182/reply_0000.md @@ -0,0 +1,10 @@ +## 2023-02-18 @pdxjohnny Engineering Logs + +- https://github.com/TBD54566975/dwn-aggregator/blob/4269041795f004fe819a4f1d9cdd3a13d979be0d/examples/pubsub.js + - We may bail on ActivityPub for now and jump right into DIDs now that this has push/pull websocket support. TBD (LOL). +- https://www.npmjs.com/package/@tbd54566975/dwn-sdk-js +- TODO + - [ ] Hybridize SCITT DWN + - [ ] Auto PR repos with security.txt contact of url which gets translated into did web of a way for them to deploy DWN SCITT so as to secure their releases. Bootstrap decentralized N SCITT instances. 
Bootstraps our outofband comms for post release or vcs push ActivityPub security txt style + - Start with model transformers + - Can do separate endor style repo for basic SCITT, then just need to deploy DWNs somewhere for notifications, could leverage POC relays from their aggregator README to start \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0183/index.md b/docs/discussions/alice_engineering_comms/0183/index.md new file mode 100644 index 0000000000..345f5ce2c3 --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0183/index.md @@ -0,0 +1 @@ +# 2023-02-19 Engineering Logs \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0183/reply_0000.md b/docs/discussions/alice_engineering_comms/0183/reply_0000.md new file mode 100644 index 0000000000..e2ebe4ef11 --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0183/reply_0000.md @@ -0,0 +1 @@ +- https://github.com/CycloneDX/specification/issues/128 \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0184/index.md b/docs/discussions/alice_engineering_comms/0184/index.md new file mode 100644 index 0000000000..b11a788be1 --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0184/index.md @@ -0,0 +1 @@ +# 2023-02-21 Engineering Logs \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0185/index.md b/docs/discussions/alice_engineering_comms/0185/index.md new file mode 100644 index 0000000000..b11a788be1 --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0185/index.md @@ -0,0 +1 @@ +# 2023-02-21 Engineering Logs \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0186/index.md b/docs/discussions/alice_engineering_comms/0186/index.md new file mode 100644 index 0000000000..2687e8e535 --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0186/index.md @@ -0,0 +1,5 @@ +# 2023-02-22 Engineering Logs + +- 🛤️🛤️🛤️🛤️🛤️🛤️🛤️🛤️ + 
+![chaos_for_the_chaos_God](https://user-images.githubusercontent.com/5950433/220794351-4611804a-ac72-47aa-8954-cdb3c10d6a5b.jpg) \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0186/reply_0000.md b/docs/discussions/alice_engineering_comms/0186/reply_0000.md new file mode 100644 index 0000000000..9db37776dc --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0186/reply_0000.md @@ -0,0 +1,24 @@ +## 2023-02-22 @pdxjohnny Engineering Logs + +- https://www.youtube.com/watch?v=hbe3CQamF8k +- Alignment with Ned on KERI, need further discussion on whether we need SCITT over it, seems like yes still. + - KERI+SCITT would enable fully isolated SSI transparency logs with hardware roots of trust (DICE). This plus the OpenSSF Metrics use case is a POC of decentralized AI driven Trust as a Service. The propagation of trust and remediation enables us to iterate at critical velocity, to enter the fully connected development model (graphic: singularity acceleration asymptote). We're filling out the Entity Analysis Trinity comms and automation which our Living Threat Models roll along. We have the communication of vulns/problems/issues via architecture and Threat Modeling, VEX/VDR, SBOM. Remediation via AI and testing within CI/CD. Alignment to strategic principles again via Threat Model. The isolated trust chains mean orgs or entities can iterate at high speed together or within isolated trains of thought.
+ - https://opentitan.org/ +- https://github.com/WebOfTrust/signify-ts/issues/8#issuecomment-1376401489 +- https://github.com/WebOfTrust/did-keri-resolver/blob/f77303334a971b21f96e0f952ef2b4793b05686e/src/dkr/didcomm/utils.py#L115 + - `await DidKeriResolver().resolve()` + - https://github.com/WebOfTrust/did-keri-resolver/blob/f77303334a971b21f96e0f952ef2b4793b05686e/src/dkr/didcomm/hello-world.py#L8 + - `alice = createKeriDid()` +- https://cs.github.com/jolocom/ddoresolver-rs/blob/85f1d71a9c9774693fcfbd679586438c65e7ed2f/src/keri.rs +- https://github.com/DvorakDwarf/Infinite-Storage-Glitch + - grep video encoding +- https://github.com/WebOfTrust/vLEI/blob/267c6c7720902eb0e43b0fcc8d9b5f2f63fd5bfa/samples/acdc/legal-entity-engagement-context-role-vLEI-credential.json + +```console +$ gh webhook forward --repo=intel/dffml --events=discussion_comment --url=https://vcs.activitypub.securitytxt.dffml.chadig.com/webhook/cadb4a72003b7892c814d4fdfa254559fce998b070a091b318821883e81bd51c9170ece5bb1c66b90e32fbf23d05ecd9 +Forwarding Webhook events from GitHub... +2023/02/23 00:24:00 [LOG] received the following event: discussion_comment +``` + +- https://github.com/TBD54566975/dwn-aggregator/blob/4269041795f004fe819a4f1d9cdd3a13d979be0d/examples/pubsub.js#L27 + - How do we combine `did:keri:`, ActivityPub security.txt, and SCITT OCI image security? 
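- The `gh webhook forward` output above shows events landing at the ActivityPub endpoint. A minimal sketch of what the receiving side might do, wrapping a forwarded GitHub webhook event in an ActivityPub Create activity, could look like the following (the helper name, field layout, and actor URL are illustrative assumptions, not the deployed endpoint's actual API):

```python
# Hedged sketch: wrap a forwarded GitHub webhook event in an
# ActivityPub Create activity so followers can pick it up.
# Helper name and actor URL are assumptions for illustration.
import json
import uuid


def webhook_to_activity(event_type: str, payload: dict, actor: str) -> dict:
    """Wrap a GitHub webhook payload in an ActivityPub Create activity."""
    note_id = f"{actor}/{uuid.uuid4()}"
    return {
        "@context": "https://www.w3.org/ns/activitystreams",
        "id": note_id,
        "type": "Create",
        "actor": actor,
        "object": {
            "id": f"{note_id}/note",
            "type": "Note",
            # Content names the event type so followers can filter replies
            "content": f"github.webhook.{event_type}",
            "attachment": {
                "type": "Document",
                "mediaType": "application/json",
                "name": event_type,
                "content": json.dumps(payload),
            },
        },
    }


if __name__ == "__main__":
    activity = webhook_to_activity(
        "discussion_comment",
        {"action": "created"},
        "https://example.com/actor",  # hypothetical actor URL
    )
    print(json.dumps(activity, indent=2))
```

- Keeping the raw payload as a JSON attachment (rather than flattening it into the Note content) would let graph-side consumers re-validate against the event's manifest schema.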
\ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0186/reply_0001.md b/docs/discussions/alice_engineering_comms/0186/reply_0001.md new file mode 100644 index 0000000000..d998e1235d --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0186/reply_0001.md @@ -0,0 +1,39 @@ +## 2023-02-22 CVE Bin Tool Monthly Meeting + +- Anthony and John +- Different SBOMs for different versions of python +- CycloneDX progressing more smoothly than SPDX +- CVEs are component level, VEX is product level +- VEX is negative, prove that it's not right +- VDR and VEX need to be combined in some way +- The triage process is critical +- We can get a list of products that don't have vulns +- In the context of your product, is it vulnerable? It depends on the deployment context, what's the environment +- When you do a scan can you give an indication about where it's deployed? Threat model + - Internal or public network + - If public then threat model attack surface is bigger +- VEX doesn't address vuln chaining +- Threat model + - Architecture + - VEX/VDR, does this affect this component within the architecture? + - You need the call graphs, chaining, coverage +- Meta analysis of OSS usage of libraries to understand what the call graphs are +- https://github.com/intel/dffml/blob/alice/docs/tutorials/rolling_alice/0001_coach_alice/0001_down_the_dependency_rabbit_hole_again.md#plan + - Talked about CVE Bin Tool triage process +- Input validation on trust boundaries + - What is I/O for top level system context?
+ - https://intel.github.io/dffml/main/concepts/dataflow.html#benefits-of-dataflows security +- Some consumers understand that suppliers make assumptions that aren't valid in downstream environments + - End users only interested in their N-1 supplier, their direct supplier + - How can we aggregate the information down the chain +- Medical having to look heavily at this, different SBOMs for different consumers + - Wanting to provide minimal info + - Could consumer provide threat model? + - https://github.com/johnlwhiteman/living-threat-models + - [WIP: IETF SCITT: Use Case: OpenSSF Metrics: activitypub extensions for security.txt](https://github.com/ietf-scitt/use-cases/blob/bcecb48ddebf8d08dd10b24b8061deb46491d0c5/openssf_metrics.md#activitypub-extensions-for-securitytxt) +- TODO + - [ ] https://github.com/anthonyharrison?tab=repositories + - [ ] Check out FOSDEM talks, Siemens SBOMs for vuln management + - [ ] Check out Anthony's SBOM audit + - Checks for valid license, up to date versions, etc. + - [ ] Meet next week \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0187/index.md b/docs/discussions/alice_engineering_comms/0187/index.md new file mode 100644 index 0000000000..d7f8dcb75f --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0187/index.md @@ -0,0 +1 @@ +# 2023-02-23 Engineering Logs \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0187/reply_0000.md b/docs/discussions/alice_engineering_comms/0187/reply_0000.md new file mode 100644 index 0000000000..f5f68aa5da --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0187/reply_0000.md @@ -0,0 +1,37 @@ +## 2023-02-23 @pdxjohnny Engineering Logs + +- https://github.com/cloudfoundry-community/node-cfenv + - https://github.com/TBD54566975/dwn-sdk-js#usage + - https://github.com/TBD54566975/dwn-cli/pull/1/files +- Eventing helps us have Alice sit alongside and look at new issues, workflow runs, etc.
This will help her help developers stay away from known bad/unhelpful trains of thought. + - She can look at issue bodies for similar stack traces + - Eventually we'll have the updating like we do where we update issue or discussion thread with what console commands and outputs we run while debugging, or we'll just do peer to peer depending on context! + - #1207 + - [live at HEAD](https://abseil.io/about/releases) is great, but poly repo PR validation will bring us into the *future*, since we'll be running inference over all the active pull requests + - We'll take this further to branches, then to the in progress trains of thought (active debug, states of the art which gatekeeper/umbrella/prioritizer says are active based on overlays for context of scientific exploration) + - As our inference gets better, we'll look across the trains of thought and [`Prophet.predict()`]() state of the art trains of thought, then validate those via dispatch/distributed compute, then we'll start to just infer the outputs of the distributed compute, and validate based on risk and criticality, we'll then have our best guess muscle memory machine. +- Mermaid has mind map functionality now +- https://www.youtube.com/watch?v=tXJ03mPChYo&t=375s + - Alice helps us understand the security posture of this whole stack over its lifecycle. She's trying to help us understand the metrics and models produced from analysis of our software and improve it in arbitrary areas (via overlays). She has overlays for dependency analysis and deciding if there is anything she can do to help improve those dependencies. `alice threats` will be where she decides if those changes or the stats mined from shouldi are aligned to her strategic principles, we'll also look to generate threat models based on analysis of dependencies found going down the rabbit hole again with alice shouldi (#596).
These threat models can then be improved via running https://github.com/johnlwhiteman/living-threat-models auditor.py `alice threats audit`, threats are inherently strategic, based on deployment context, they require knowledge of the code (static), past behavior (pulled from event stream of distributed compute runs), and understanding of what deployments are relevant for vuln analysis per the threat model. + - Entity, infrastructure (methodology for traversal and chaining), (open) architecture + - What are you running (+deps), where are you running it (overlayed deployment, this is evaluated in federated downstream SCITT for applicability and reissuance of VEX/VDR by downstream), and what's the upstream threat model telling you if you should care if what you're running and how you're running it yields unmitigated threats. If so, and Alice knows how to contribute, Alice please contribute. If not, and Alice doesn't know how to contribute, Alice please log todos, across org relevant poly repos.
+ - When we do our depth of field mapping (ref early engineering log streams) we'll merge all the event stream analysis via the tuned brute force prioritizer (grep alice discussion arch) +- Loosely coupled DID VC CI/CD enables AI in the loop development in a decentralized poly repo environment (Open Source Software cross orgs) +- TODO + - [ ] Docs + - [ ] How to do async collaboration + - [x] How to do code review + - #1313 + - [ ] Versioned Learning to help communicate best practices and understand how well our trains of thought are performing (intent alignment, strategic plans and principles alignment, behavioral/goal alignment for different tasks like during debug or pursuing some new plan goal while modifying software DNA/arch/LTM) + - [All You Need Is Supervised Learning: From Imitation Learning to Meta-RL With Upside Down RL](https://arxiv.org/abs/2202.11960) + - [Multi-agent versioned learning](https://github.com/jetnew/SlimeRL) + - [DFFML Manifest Schema ADR](https://github.com/intel/dffml/blob/alice/docs/arch/0010-Schema.rst) + - [ ] https://intel.github.io/dffml/main/examples/webhook/webhook.html#webhook-dataflow + - kcp -> k8s -> cf push -> webhook service -> dataflow to create activitypub event -> dwn-cli send -> webrtc -> dwn-cli recv -> `alice threats listen activitypub -stdin` -> `alice shouldi contribute` -> `alice please contribute` -> soft-serve/github repo pull request -> webhook service + - https://www.youtube.com/watch?v=TMlC_iAK3Rg&list=PLtzAOVTpO2jYt71umwc-ze6OmwwCIMnLw&t=2064s + - https://www.youtube.com/watch?v=THKMfJpPt8I&list=PLtzAOVTpO2jYt71umwc-ze6OmwwCIMnLw&t=128s + - https://github.com/charmbracelet/soft-serve + - https://github.com/cloudfoundry/korifi/blob/main/HACKING.md#deploying-to-kind-for-remote-debugging-with-a-locally-deployed-container-registry + - https://github.com/cloudfoundry/korifi/releases/tag/v0.6.0 + - Could we do an ingress whose source is DWN ActivityPub events?
+ - [ ] NVD API -> ActivityPub \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0188/index.md b/docs/discussions/alice_engineering_comms/0188/index.md new file mode 100644 index 0000000000..72fe4631ca --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0188/index.md @@ -0,0 +1 @@ +# 2023-02-24 Engineering Logs \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0188/reply_0000.md b/docs/discussions/alice_engineering_comms/0188/reply_0000.md new file mode 100644 index 0000000000..5a9f2110d3 --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0188/reply_0000.md @@ -0,0 +1,62 @@ +## 2023-02-24 @pdxjohnny Engineering Logs + +- Something about the pinning #906 + - [Rolling Alice: Architecting Alice: Introduction and Context](https://github.com/intel/dffml/tree/alice/docs/tutorials/rolling_alice/0000_architecting_alice#rolling-alice-volume-0-introduction-and-context) + - Together we'll build Alice the AI software architect. We'll be successful when Alice successfully maintains a codebase as the only maintainer for a year. *Debugging issues, writing fixes, reviewing code, accepting pull requests, refactoring the code base post PR merge, dealing with vulnerabilities, cutting releases, maintaining release branches, and completing development work in alignment with the plugin's living threat model* (leveraging the [Open Architecture](https://github.com/intel/dffml/blob/alice/docs/arch/0009-Open-Architecture.rst)). *She will modify, submit pull requests to, and track upstreaming of patches to her dependencies to achieve the cleanest architecture possible.* We'll interact with her as we would any other remote developer. + - Fork + - Work + - PR to upstream with pin + - #1061 this is the change of manifest + - Does it adhere to THREATS.md strategic plans and principles? Ship it! (auto merge PR) +- Want feedback on your PRs? (or in-flight dev trains of thought) + - Publish to activitypub!
+ - In SSI fediverse, 2nd party feedback finds YOU! + - https://github.com/pdxjohnny/pdxjohnny.github.io/issues/2 + - OpenVEX + - #1061 + - https://github.com/LAION-AI/Open-Assistant/pull/1483/files#r1117649911 + - Reached out to this community again since we know about them already + - We've been playing with ActivityPub as one option to enable multiple workers to provide feedback via `inReplyTo` and threads, mimicking human behavior. Wasn't sure where else to post so posting here. The hope is that our models can collectively respond, and the user or the user's AI agent can sift through and find the responses that are most helpful to them within the context of the conversation. Something like ActivityPub based communication (Rapunzel, ATProto come to mind) would enable folks' AIs to collaboratively provide their responses. + - References + - [WIP: IETF SCITT: Use Case: OpenSSF Metrics: activitypub extensions for security.txt](https://github.com/ietf-scitt/use-cases/blob/fd2828090482fe63a30a7ddd9e91bdb78892a01e/openssf_metrics.md#activitypub-extensions-for-securitytxt) + - [2023-02-15 @pdxjohnny Engineering Logs](https://github.com/intel/dffml/discussions/1406?sort=new#discussioncomment-4983602) `job_url` -> GitHub API -> active PRs for commit +- Manifests assist with checkpoint and restore SLSA 4 + - TDX live migration + - KERI watchers are all you need (they themselves are a "SCITT instance") + - Thank you Ned! 🥳🥳🥳 + - Should have just asked him this explicitly months ago...
+ - https://identity.foundation/keri/did_methods/#key-event-receipt-log + - https://github.com/decentralized-identity/keri/blob/master/kids/kid0009.md + - https://github.com/WebOfTrust/keripy/blob/development/tests/app/test_watching.py + - https://github.com/WebOfTrust/keria/blob/main/tests/core/test_authing.py + - https://github.com/WebOfTrust/keripy/blob/development/src/keri/demo/demo.md + +```diff +diff --git a/entities/alice/entry_points.txt b/entities/alice/entry_points.txt +index 49426b5..9277df0 100644 +--- a/entities/alice/entry_points.txt ++++ b/entities/alice/entry_points.txt +@@ -30,6 +30,7 @@ OverlayActionsValidator = dffml_operations_innersource.ac + OverlayNPMGroovyLint = dffml_operations_innersource.npm_groovy_lint:npm_groovy_lint + OverlayNPMGroovyLintStartCodeNarcServer = dffml_operations_innersource.npm_groovy_lint:start_code_narc_server + OverlayNPMGroovyLintStopCodeNarcServer = dffml_operations_innersource.npm_groovy_lint:stop_code_narc_server ++OverlayRecommendedCommunityStandards = alice.please.log.todos.todos:AlicePleaseLogTodosDataFlowRecommendedCommnuityStandardsGitHubIssues + + [dffml.overlays.alice.please.log.todos] + OverlayCLI = alice.please.log.todos.todos:OverlayCLI +``` + +- https://github.com/intel/dffml/issues/1394 +- **HUZZAH!** IT WORKED! 
+- https://github.com/intel/dffml/issues/1440 +- TODO + - [ ] [Rolling Alice: Coach Alice: Strategic Principles as Game Plan](https://github.com/intel/dffml/blob/alice/docs/tutorials/rolling_alice/0001_coach_alice/0003_strategic_principles_as_game_plan.md) + - https://github.com/issues?q=is%3Aopen+is%3Aissue+archived%3Afalse+sort%3Arelevance-desc+repo%3Aintel%2Fdffml+author%3Aaliceoa + - https://github.com/TomWright/mermaid-server + - For static dumps + - [ ] `alice please log todos` overlays enabled on `alice shouldi contribute` for feedback + - [ ] Dataflow output where `Input.value` becomes the operation name (grep recent logs) + - [ ] `alice please log todos` as overlay + - See diff, stuck on https://github.com/intel/dffml/issues/1394 + - [ ] Talk to Ryan per recent CVE Bin Tool meeting notes + - Overlays for please contribute https://github.com/ossf/scorecard-action if badge not found \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0189/index.md b/docs/discussions/alice_engineering_comms/0189/index.md new file mode 100644 index 0000000000..cdf29e2563 --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0189/index.md @@ -0,0 +1 @@ +# 2023-02-25 Engineering Logs \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0189/reply_0000.md b/docs/discussions/alice_engineering_comms/0189/reply_0000.md new file mode 100644 index 0000000000..177f69af45 --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0189/reply_0000.md @@ -0,0 +1,226 @@ +## 2023-02-25 @pdxjohnny Engineering Logs + +- https://github.com/WebOfTrustInfo/rwot11-the-hague/compare/master...Klingefjord:rwot11-the-hague:master + - https://github.com/WebOfTrustInfo/rwot11-the-hague/blob/master/advance-readings/Multi-dimensional%20reputation%20systems%20using%20webs-of-trust.md +- https://github.com/cli/cli/blob/trunk/docs/install_linux.md#fedora-centos-red-hat-enterprise-linux-dnf + +```console +$ sudo dnf install 
'dnf-command(config-manager)' +$ sudo dnf config-manager --add-repo https://cli.github.com/packages/rpm/gh-cli.repo +$ sudo dnf install gh +``` + +- https://developer.hashicorp.com/packer/downloads + +```console +$ wget -O- https://apt.releases.hashicorp.com/gpg | gpg --dearmor | sudo tee /usr/share/keyrings/hashicorp-archive-keyring.gpg +$ echo "deb [signed-by=/usr/share/keyrings/hashicorp-archive-keyring.gpg] https://apt.releases.hashicorp.com $(lsb_release -cs) main" | sudo tee /etc/apt/sources.list.d/hashicorp.list +$ sudo apt update && sudo apt install packer +``` + +```console +$ sudo dnf install -y dnf-plugins-core +$ sudo dnf config-manager --add-repo https://rpm.releases.hashicorp.com/fedora/hashicorp.repo +$ sudo dnf -y install packer +``` + +- kcp -> k8s -> cf push -> webhook service -> dataflow to create activitypub event -> dwn-cli send -> webrtc -> dwn-cli recv -> alice threats listen activitypub -stdin -> alice shouldi contribute -> alice please contribute -> soft-serve/github repo pull request -> webhook service +- https://docs.docker.com/engine/install/fedora/ + +```console +$ sudo dnf -y install dnf-plugins-core +$ sudo dnf config-manager \ + --add-repo \ + https://download.docker.com/linux/fedora/docker-ce.repo +$ sudo dnf install -y docker-ce docker-ce-cli containerd.io docker-buildx-plugin docker-compose-plugin +$ sudo systemctl enable --now docker +``` + +- https://kind.sigs.k8s.io/docs/user/quick-start/ + +```console +$ curl -Lo ./kind https://kind.sigs.k8s.io/dl/v0.17.0/kind-linux-amd64 +$ chmod +x ./kind +$ sudo mv ./kind /usr/local/bin/kind +``` + +- https://github.com/cloudfoundry/korifi/blob/main/INSTALL.kind.md + +```console +$ ROOT_NAMESPACE="cf" +$ KORIFI_NAMESPACE="korifi-system" +$ ADMIN_USERNAME="kubernetes-admin" +$ BASE_DOMAIN="apps-127-0-0-1.nip.io" +$ cat < { + + switch (body.type) { + case "Follow": { +- await send(actor, body.actor, { +- "@context": "https://www.w3.org/ns/activitystreams", +- id: uri, +- type: "Accept", +- 
actor, +- object: body, +- }); ++ try { ++ await send(actor, body.actor, { ++ "@context": "https://www.w3.org/ns/activitystreams", ++ id: uri, ++ type: "Accept", ++ actor, ++ object: body, ++ }); ++ } catch (err) { ++ console.error(err); ++ return res.sendStatus(401); ++ } + + createFollower({ actor: body.actor, uri: body.id }); + break; +diff --git a/src/admin.ts b/src/admin.ts +index d36be9e..55a00ff 100644 +--- a/src/admin.ts ++++ b/src/admin.ts +@@ -22,6 +22,7 @@ if (ADMIN_USERNAME && ADMIN_PASSWORD) { + } + + admin.post("/create", async (req, res) => { ++ try { + const actor: string = req.app.get("actor"); + + const create = type({ object: omit(Object, ["id"]) }); +@@ -59,6 +60,10 @@ admin.post("/create", async (req, res) => { + } + + return res.sendStatus(204); ++ } catch (err) { ++ console.error(err); ++ return res.sendStatus(500); ++ } + }); + + admin.post("/follow/:actor/:hostname/:port/:proto", async (req, res) => { +@@ -69,13 +69,19 @@ admin.post("/follow/:actor/:hostname/:port/:proto", async (req, res) => { + })(req.params); + const endpoint: string = (FDQN != null ? 
FDQN: `${HOSTNAME}:${PORT}`); + const uri = `${PROTO}://${endpoint}/@${crypto.randomUUID()}`; +- await send(actor, object, { +- "@context": "https://www.w3.org/ns/activitystreams", +- id: uri, +- type: "Follow", +- actor, +- object, +- }); ++ try { ++ await send(actor, object, { ++ "@context": "https://www.w3.org/ns/activitystreams", ++ id: uri, ++ type: "Follow", ++ actor, ++ object, ++ }); ++ } catch (err) { ++ console.error(err); ++ res.sendStatus(500); ++ return; ++ } + + createFollowing({ actor: object, uri }); + res.sendStatus(204); +@@ -88,18 +94,23 @@ admin.delete("/follow/:actor", async (req, res) => { + const following = getFollowing(object); + if (!following) return res.sendStatus(204); + +- await send(actor, object, { +- "@context": "https://www.w3.org/ns/activitystreams", +- id: following.uri + "/undo", +- type: "Undo", +- actor: actor, +- object: { +- id: following.uri, +- type: "Follow", +- actor, +- object, +- }, +- }); ++ try { ++ await send(actor, object, { ++ "@context": "https://www.w3.org/ns/activitystreams", ++ id: following.uri + "/undo", ++ type: "Undo", ++ actor: actor, ++ object: { ++ id: following.uri, ++ type: "Follow", ++ actor, ++ object, ++ }, ++ }); ++ } catch (err) { ++ console.error(err); ++ return res.sendStatus(500); ++ } + + deleteFollowing({ actor: object, uri: following.uri }); + return res.sendStatus(204); +``` + +``` +src/admin.ts:64:53 - error TS7030: Not all code paths return a value. 
+ +64 admin.post("/follow/:actor/:hostname/:port/:proto", async (req, res) => { +``` \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0190/index.md b/docs/discussions/alice_engineering_comms/0190/index.md new file mode 100644 index 0000000000..e31ac2de5e --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0190/index.md @@ -0,0 +1 @@ +# 2023-02-26 Engineering Logs \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0190/reply_0000.md b/docs/discussions/alice_engineering_comms/0190/reply_0000.md new file mode 100644 index 0000000000..fc361e95e3 --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0190/reply_0000.md @@ -0,0 +1,8 @@ +- https://github.com/facebookresearch/labgraph/blob/main/docs/cthulhu.md#streams +- https://www.sciencealert.com/all-living-cells-could-have-the-molecular-machinery-for-a-sixth-sense +- https://www.freethink.com/science/infrared-trpv1-neuron-control + - When we realized this was possible back in April 2022 was when we realized we have to go all in on train of thought hardening +- https://youtu.be/PEVVRkd-wPM + - Securing bare metal at scale +- This needs promise style error handling, then result err + - https://github.com/jakelazaroff/activitypub-starter-kit/blob/fcd5942485d86a66913c5554f85ae905785504e0/src/admin.ts#L54 \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0191/index.md b/docs/discussions/alice_engineering_comms/0191/index.md new file mode 100644 index 0000000000..1a8ee0f476 --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0191/index.md @@ -0,0 +1 @@ +# 2023-02-27 Engineering Logs \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0191/reply_0000.md b/docs/discussions/alice_engineering_comms/0191/reply_0000.md new file mode 100644 index 0000000000..35695adc28 --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0191/reply_0000.md @@ -0,0 +1,73 @@ +## 2023-02-27 
@pdxjohnny Engineering Logs + +- Where are we + - Event + - ActivityPub + - Query (jq + cypher) + - Validation (schema enum) + - Alice or other for extra data mining if not in graph. DFFML is just a Python onramp/offramp helper. + - Event +- https://github.com/probot/probot/blob/master/docs/deployment.md + - This watches the ActivityPub group + - This enables misc bots within the org to provide real time cross-repo feedback + - Bot that knows a lot about one repo can provide upgrade path help as users work through issues updating in downstream repos + - Bot is just the policy as code within the upstream, which says how to help downstream in these situations + - Example: + - Upstream + - https://github.com/behaviorbot/request-info + - Overlay + - If issue created by `alice please log todos` + - Orchestrator + - GitHub Actions + - Trigger flow + - Tertiary OSS -> activitypub extensions for security.txt -> ActivityPub Actor -> ActivityPub Follow ActivityPub Actor Watcher +- https://github.com/probot/smee-client + - Same as wait for message, only more similar to our setup, we want to make the protocol between this and its server ActivityPub, so that it's a defined spec and we can traverse and import/export from the graph + - We want data security to be handled at the graph level + - Data propagation can also be handled at that level + - Ref SCITT use case + - Policy as code: who/why this message should be propagated +- We're adding the extra layer of ActivityPub so that we can stay loosely coupled + - Focus is on modifying (adding more links / layers) and querying data in graph + - On-ramp/off-ramp to web2 land + - GitHub Accounts/Apps which watch the graph event stream and decide if they want to take data given as `Input(value=graph_node_content_resolved_from_registry, definition=manifest schema link from content field of node this input is inReplyTo)` and turn it into a pull request + - An operation + - opt-in, heterogeneous, poly repo +- 
https://github.com/ietf-scitt/statements-by-reference/pull/1/files +- https://github.com/ietf-wg-scitt/draft-ietf-scitt-architecture/issues/12 +- https://github.com/ietf-wg-scitt/draft-ietf-scitt-architecture/issues/11 +- https://github.com/in-toto/demo/blob/main/owner_alice/create_layout.py +- https://techcommunity.microsoft.com/t5/azure-confidential-computing/developers-guide-to-gramine-open-source-lib-os-for-running/ba-p/3645841 +- https://gramine.readthedocs.io/en/stable/quickstart.html +- https://gramine.readthedocs.io/projects/gsc/en/latest/#configuration + - Distro of `ubuntu:20.04` to mono base image for actions runner? + - https://hub.docker.com/_/mono/ +- `print(user.__pydantic_model__.schema())` + - For auto schema creation from data model + - https://docs.pydantic.dev/usage/dataclasses/ +- For our rolling stages we could just take all the output operations, + copy the dataflow with just them, make them processing stage, and run + them as a subflow. Right now we only iterate once, aka one execution + loop for the output operations, they aren't chainable. + - With this rolling stage approach we can easily cypher query over the previous stage + - We could also explore within stage (operation) cypher query over the + JSONLD/manifest synthesis to the graph from cache save/load, or it's in memory form. 
+ - #1388 + - Related: Gatekeeper/Prirotizer + +https://github.com/intel/dffml/blob/1d071ea82af93a15b6559639f223c64b7f356bf6/dffml/df/memory.py#L1976-L1979 + +- Fundamentally Alice is helping us with dependency review, that promotion, cross of trust boundary, from 3rd to 2nd party + - She helps us decide if they are up to the level of requirements we have for running within the 2nd party https://en.wikipedia.org/wiki/Protection_ring, but since we're and open source project, the ring we are protecting is related to the downstream threat model + - Tie back in with our recent CVE Bin Tool meeting + - The protection rings in this context are a level of riskyness this system context (the distro, ML distro in DFFML case) exposes you to + - Cartographer extrodinare +- https://github.com/intel/dffml/issues/1418 + - Updated with reference to activitypub security.txt +- Dataflows produce clean deltas (commits) + - Beyond the unit of the line as granularity for change + - Application of overlay tells you the code change on upstream (like for backporting) +- TODO + - [ ] Read Roy and Steve's doc + - [x] Schedule meeting with Sam + - KERI Watchers as SCITT \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0191/reply_0001.md b/docs/discussions/alice_engineering_comms/0191/reply_0001.md new file mode 100644 index 0000000000..3a7544b0ed --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0191/reply_0001.md @@ -0,0 +1,45 @@ + ## 2023-02-27 SCITT + +- Moved architecture repository to https://github.com/ietf-wg-scitt/draft-ietf-scitt-architecture +- CODEOWNERS (teams usage do not provide traceability, explicitly managed within files) +- Transparency service, TS, registry, all same words, cleaned up a bit + - Transparency Service is what we are going with + - https://github.com/ietf-wg-scitt/draft-ietf-scitt-architecture/pull/16 + - Converge claim and statement +- Example + - Steve is submitting information about an ubuntu release to the TS + - 
Steve's identity used to write to TS (RBAC auth on this) + - eNotary is validating Canonical's Ubuntu Signature +- The duty of the transparency service is not to detect lies + - It does have some registration process + - There can be a gatekeeper function which limits use of transparency service + - This is the policy which constrains what the notary can place in the service + - Minimum could be content type, what is allowed to write to the product line + - This policy engine is outside the scope of SCITT + - This is what facilitates trust propagation in our recursive grafted log setup + - https://github.com/intel/cve-bin-tool/issues/2639 + - This could just be an admit + - Could be based on RBAC of the product + - #1400 +- Notary maintaining append only ledger + - Notary says identity of signature is correct + - The notary checks the identity was valid when the signature happened + - It's not the notary's job to say it's an incorrect contract + - It doesn't check that the SBOM is an accurate representation of the software + - Is digital signature valid? + - TS can decide if it wants to accept different identities or types of identities + - Policy on given instance could say + - X509 notarizations for these content types + - Was it revoked? +- Receipt generator + - KERI Watcher + - https://github.com/microsoft/scitt-ccf-ledger + - SCITT Emulator + - https://transparency.dev/ + - Centralized, SigStore +- What happens when hardware fails? How do new logs come online? +- John: Recursive downstream policy for VEX + - Roy: There is an issue with propagation times + - John: It sounds like you have insights on the VEX use case. Is there somewhere I can find more details about what you're looking at for that propagation issue?
+- [Roy](roywill@microsoft.com) + - Says there is some propagation delay we need to deal with to propagate VEX \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0192/index.md b/docs/discussions/alice_engineering_comms/0192/index.md new file mode 100644 index 0000000000..35d4cb76c7 --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0192/index.md @@ -0,0 +1 @@ +# 2023-02-28 Engineering Logs \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0192/reply_0000.md b/docs/discussions/alice_engineering_comms/0192/reply_0000.md new file mode 100644 index 0000000000..ce1c300b87 --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0192/reply_0000.md @@ -0,0 +1,57 @@ +## 2023-02-28 @pdxjohnny Engineering Logs + +- https://github.com/anthonyharrison/sbom4python + - For down the dependency rabbit hole again +- Down the dependency rabbit hole again as Dockerfile FROM extractor as asynciter to an output operation which returns a dataflow with all found dependencies (container image URI) as Inputs in seed + - Another output operation which outputs the set/combo if needed of schema, `jq`, cypher, or open policy agent to yield true on evaluation of an incoming `vcs.push` (as schema URL with format name and version). This will be facilitating our kontain.me source only rebuild triggers. 
+ - Mock the push events by curl instead of websocket publishing to ActivityPub to test + - This is the same way one would implement a polling based proxy from web2 +- https://github.com/facebookresearch/faiss/wiki/Getting-started +- https://github.com/facebookresearch/faiss/wiki/Running-on-GPUs +- https://github.com/facebookresearch/faiss/wiki/Index-IO,-cloning-and-hyper-parameter-tuning#example-usage +- Software DNA (in part based on the FROM image builds, the open architecture description, our methodology for traversal of the graph) encoded to vector representation (some encoding that yields similar images for similar aspects of the software lifecycle focused on). +- The wait for message on ActivityPub will enable our poly repo merge queue +- https://github.com/w3c-ccg/vc-api +- https://www.intel.com/content/www/us/en/developer/articles/technical/software-bills-of-materials-the-basics.html +- https://github.com/transmute-industries/example-mapping-from-jwt-to-jsonld +- ACDC is a way to secure a Credential + - https://github.com/w3c/vc-data-model/issues/895#issuecomment-1434609248 + - https://github.com/w3c/vc-jwt/pull/56 + - https://github.com/w3c/vc-data-model/issues/947#issuecomment-1434506542 + - This transcript is important, see Orie's concerns about security.
jsonld, nquads + - https://github.com/ietf-scitt/statements-by-reference/pull/1 +- https://github.com/libp2p/js-libp2p-websockets +- https://github.com/libp2p/js-libp2p-interfaces +- https://w3c.github.io/wot-scripting-api/#discovery-examples +- https://w3c.github.io/wot-scripting-api/#the-emitpropertychange-method +- https://www.chromium.org/teams/web-capabilities-fugu/ +- https://github.com/gojue/ecapture +- For vcs.push source container proxy repackage (upstream into kontain.me) + +```console +$ cd $(mktemp -d) +$ curl -L -H "Authorization: token $(grep oauth_token < ~/.config/gh/hosts.yml | sed -e 's/ oauth_token: //g')" -H "Accept:application/vnd.github.v3.raw" https://api.github.com/repos/intel/dffml/tarball/master | tar xvz +$ echo -e "FROM scratch\nCOPY ./$(ls) /src" > Dockerfile +$ docker build -t registry.example.org/dffml -f Dockerfile . +$ docker save registry.example.org/dffml | tar --extract --to-stdout --wildcards --no-anchored 'layer.tar' | tar --extract +``` + +- https://marquezproject.ai/quickstart + - ActivityPub -> OpenLineage +- [RFCv2: IETF SCITT: Use Case: OpenSSF Metrics: activitypub extensions for security.txt](https://github.com/ietf-scitt/use-cases/blob/22afd537180d6c6b2d5ec4db0096f0706cb2b6bc/openssf_metrics.md) + - It's basically a decentralized pubsub event notification methodology that can be done over ACDC piggybacking on ActivityPub as layer 7. + - Event data lives "off chain" in a container registry secured via existing transparency service based methods (KERI, SCITT, SigStore), where the chain is the network of keys involved for a train of thoughts comms between entities. Since we transmit ActivityPub over KERI, the graph of the supply chain data we are sharing can be shared with trusted actors who agree not to be duplicitous, and whose KERI keys can be tied back to TEEs so that we can confirm they are running software that doesn't intend (via ML-based, Alice, analysis) to be duplicitous.
We can now have our trusted computing base for decentralized compute, aka CI/CD pipelines delivering across project trust boundaries. + - Duplicity detection is a MUST have + - Transparency services are just audit trails without this + - DNS example from Sam: Multiple CAs can issue for the same domain. https://henkvancann.github.io/identifiers/keri-oobi.html + - Revocation + - OCSP Stapling + - We add in the ActivityPub `Note`s and statuses + - https://database.guide/what-is-acid-in-databases/ +- https://docs.github.com/en/communities/using-templates-to-encourage-useful-issues-and-pull-requests/creating-a-pull-request-template-for-your-repository + +![image](https://user-images.githubusercontent.com/5950433/222050628-40aadba8-8fc3-4d33-8603-f6391b37a7ad.png) + +- https://github.com/decentralized-identity/keri/blob/master/kids/kid0001Comment.md#keri-message-parsing +- https://henkvancann.github.io/identifiers/cesr-one-of-sam-smiths-inventions-is-as-controversial-as-genius.html +- https://henkvancann.github.io/identifiers/cesr-proof-signatures-are-the-segwit-of-authentic-data-in-keri.html \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0192/reply_0001.md b/docs/discussions/alice_engineering_comms/0192/reply_0001.md new file mode 100644 index 0000000000..1b9ec9bad4 --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0192/reply_0001.md @@ -0,0 +1,55 @@ +## 2023-02-28 KERI and IETF SCITT + +- 1:1 Sam/John +- DID method determines trust basis + - With KERI, it's an AID, helps facilitate a level of security +- DID spec is name space : method specifics : ...
+- Administrative trust basis +- did:web uses HTTP as trust basis +- Some are blockchain based +- KERI is an append only event log for trust basis +- DICE has compatible trust basis + - Cryptographically derived, using the same mechanism as the identifier + - DICE doesn't do pre-rotation + - Would have to reprovision mcu if you need to rotate keys +- Reputation associated with keys + - If rotated, how do you determine who's the same + - Add to TS (KERI) +- Duplicity evident + - Beyond tamper evident: send data, send digest + - Key state with KERI is not just tamper evident but duplicity evident + - If I do pre-rotations I can create two different rotation events that went to two different keys + - Can't do a fork without declaring it's a fork + - Verifier says I can appraise the trust basis because I can look at the event log and see if you're being duplicitous + - Can't send an NFT to multiple people (title), one of them can't be right, there is duplicity here, have to sign and publish if I'm going to anchor it to my key event log. + - Watcher network allows verifiers to see duplicity + - Reach of watcher network they care about is the breadth of their ecosystem + - If there is a three party transaction, duplicity + - If bob and sue want to see if Alice is being duplicitous, then bob checks with sue to see what Alice's key state is +- Digital signature acts around the world hold legal recourse +- Verifier always checks with watcher, if watcher says it's good then we trust +- Witnesses enable the controller of the identifier to increase the strength of their control +- We need to trust that watchers won't delete proof of duplicity + - I need 1 honest watcher, the only thing dishonest watchers can do is delete, they can't manufacture proof of duplicity + - We can ask Will: has Alice been duplicitous?
Will can lie and say no; Alice would have to prevent there being even one honest Will, or her duplicity will be evident +- With DID we would need to evaluate the security of the method before I use it +- If a codebase had multiple DID methods used to contribute to it + - The security is the security of the weakest DID method +- We need to go the KERI route to prevent duplicity in the supply chain +- Could publish ACDC for the VEX publishing + - VC meant verifiable claim originally, now it's credential + - Can anchor in key state log anything I want, OpenVEX, etc. + - Anyone can make an assertion + - Already setup to be able to do this +- VC went hard on RDF interop +- Allows containers to carry VC of any form, so long as they have a transformer that will produce a (one way) JSONLD RDF compliant representation. + - Content type +ld means jsonld, means `@context` + - What is a MUST: you must be able to transform into something with an `@context`, so we can do ActivityPub+registry -> JSONLD +- KERI is agnostic about namespacing, if you only wanted to use `did:keri`, then you would skip using the DID method namespace and just use the KERI stuff +- Next steps + - ACDC transmission of records, setup watchers +- DID provides namespacing + - KERI is the highest protection ring currently available + - Using DID methods where there is no duplicity detection means there are non-context local events which could alter the validity of key state (Example: Keys stored in blockchain, a key signing two things, one of which it wasn't supposed to sign.)
+ - KERI solves the distributed locking problem needed to operationalize #51 across address spaces + - #772 \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0193/index.md b/docs/discussions/alice_engineering_comms/0193/index.md new file mode 100644 index 0000000000..d81f842007 --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0193/index.md @@ -0,0 +1,10 @@ +# 2023-03-01 Engineering Logs + +- #363 + - https://stefanbuck.com/blog/codeless-contributions-with-github-issue-forms + - Everything as a custom form app has truly arrived + - 🛤️🛤️🛤️ + - Mermaid --> dataflow --> function which takes arch/dataflow and synthesizes to -> ActivityPub|dispatch + - We enable decoupled interaction between pipelines and issue/ML/entity ops by rebroadcasting into the linked data space + +![chaos-for-the-chaos-god](https://user-images.githubusercontent.com/5950433/220794351-4611804a-ac72-47aa-8954-cdb3c10d6a5b.jpg) \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0193/reply_0000.md b/docs/discussions/alice_engineering_comms/0193/reply_0000.md new file mode 100644 index 0000000000..b310221e9d --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0193/reply_0000.md @@ -0,0 +1,29 @@ +## 2023-03-01 @pdxjohnny Engineering Logs + +- https://github.com/stefanbuck/github-issue-parser +- https://codeberg.org/openEngiadina/geopub#semantic-social-network +- OpenFL integration for activitypub security.txt event stream ingest +- https://socialhub.activitypub.rocks/t/standardizing-on-activitypub-groups/1984 + - We can aggregate data from individual push accounts deployed into a group which puts that data under the correct thread for its schema. + +```console +$ while [ !
-f stop ]; do FDQN=vcs.activitypub.securitytxt.dffml.chadig.com WEBHOOK_PATH=$(cat ../webhook) NODE_ENV=production PORT=8000 ACCOUNT=push ADMIN_USERNAME=admin ADMIN_PASSWORD=$(cat ../password) PUBLIC_KEY=$(cat publickey.crt) PRIVATE_KEY=$(cat pkcs8.key) npm run start; done +``` + +- GitHub Issue based fork + exec + - Add YAML manifests for overlays + +```console +$ echo -e "### We created a new plugin, the GitHub repo is\nhttps://github.com/dffml/dffml-model-transformers" | gh issue create -R https://github.com/intel/dffml --title "plugin: new: dffml-model-transformers" --body-file /dev/stdin +$ jq -r -n 'env.BUILD_ARGS' | jq '. |= . + [["APPEND", env.APPEND]]' +``` + +- TODO + - [ ] #1061 + - [ ] Model transformers downstream watcher + - [ ] ramfs to limit sqlite + - [ ] systemd unit files to start + - [x] Meet with Yash + - [x] https://github.com/jakelazaroff/activitypub-starter-kit/blob/fcd5942485d86a66913c5554f85ae905785504e0/src/admin.ts#L54 + - [e642b406f68f747586a05ed07f9fc247ed6c02e8](https://github.com/jakelazaroff/activitypub-starter-kit/commit/e642b406f68f747586a05ed07f9fc247ed6c02e8) + - [ ] https://github.com/actions/runner/issues/2417 \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0193/reply_0001.md b/docs/discussions/alice_engineering_comms/0193/reply_0001.md new file mode 100644 index 0000000000..feadb91564 --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0193/reply_0001.md @@ -0,0 +1,27 @@ +## 2023-03-01 CVE Bin Tool Monthly + +- If you want to be a mentor please ping Terri +- https://github.com/intel/cve-bin-tool/issues?q=is%3Aissue+is%3Aopen+label%3Agsoc +- https://github.com/intel/cve-bin-tool/issues/2633 +- https://blogs.python-gsoc.org/en/ +- https://github.com/intel/cve-bin-tool/issues/2756 +- Dependabot issues are from tests +- Ideally we'd get CVE Bin Tool to be considered equivalent, there are more features for triage and exclusion rules + - Ideally we work with dependabot to align formats + -
https://github.com/intel/cve-bin-tool/issues/2639 + +![image](https://user-images.githubusercontent.com/5950433/222214226-0091a5f9-4d10-4882-bbcf-6068503f23bc.png) + +- Anthony was at FOSDEM + - SW360 seems to be moving in a similar direction + - No one is quite as mature as cve-bin-tool at handling all the SBOM types + - Anthony sees maturing the triage process as a high value area, especially for GSOC + - CycloneDX moving faster format spec iteration wise + - Some nice features on their roadmap + - Issues in terms of identifying products + - Mapping naming of products to releases is an ongoing issue most people struggle with + - Ideally we start all using PURL to help start identifying the right products and versions. + - Major healthcare providers understand there will be some vulns on release + - Threat model can help us understand if they matter to deployment + - [THREATS.md](https://github.com/johnlwhiteman/living-threat-models) + - [2023-02-22 CVE Bin Tool Monthly Meeting](https://github.com/intel/dffml/discussions/1406?sort=new#discussioncomment-5079592) \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0194/index.md b/docs/discussions/alice_engineering_comms/0194/index.md new file mode 100644 index 0000000000..55bb976dec --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0194/index.md @@ -0,0 +1,11 @@ +# 2023-03-02 Engineering Logs + +- https://a.exozy.me/posts/activitypub-eats-your-brain/ +- https://www.microsoft.com/en-us/security/blog/2022/11/16/microsoft-contributes-s2c2f-to-openssf-to-improve-supply-chain-security/ + - https://github.com/ossf/s2c2f/blob/main/specification/framework.md#secure-supply-chain-consumption-framework-levels-of-maturity + - https://github.com/ossf/s2c2f/blob/main/specification/framework.md#secure-supply-chain-consumption-framework-requirements + - We want to bake level 4 into the lifecycle aka have Alice help get folks there +- https://github.com/notaryproject/notaryproject + - 
https://github.com/notaryproject/notaryproject/blob/main/requirements/scenarios.md + - Aligned + - > ![notaryproject-oss-project-sequence](https://github.com/notaryproject/notaryproject/raw/main/media/oss-project-sequence.svg) \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0194/reply_0000.md b/docs/discussions/alice_engineering_comms/0194/reply_0000.md new file mode 100644 index 0000000000..ae174e8800 --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0194/reply_0000.md @@ -0,0 +1,311 @@ +## 2023-03-02 @pdxjohnny Engineering Logs + +- Some example execution protection rings + - System Management Mode + - Root + - Userspace + - Sandboxed (v8) +- Wardley maps of hardware security strength as a commodity + - TPM (most widely deployed) + - https://www.tomsguide.com/news/billions-of-pcs-and-other-devices-vulnerable-to-newly-discovered-tpm-20-flaws + - TXT + - We frequently skip talking about this in this thread to avoid too much acronym soup, but TPMs are only good for https://github.com/intel/dffml/tree/alice/docs/arch/0007-A-GitHub-Public-Bey-and-TPM-Based-Supply-Chain-Security-Mitigation-Option.rst, aka tying keys into known hardware, without TXT (at least as we've been talking about them here). We just usually either talk about TPMs or TDX in this thread to illustrate the ends of the spectrum.
+ - Trusted eXecution Technology (TXT) and Boot Guard combined with also a TPM (or a virtual equivalent) enables attested compute (by way of Secure Boot) + - https://edk2-docs.gitbook.io/understanding-the-uefi-secure-boot-chain/secure_boot_chain_in_uefi/intel_boot_guard + - https://www.chromium.org/developers/design-documents/tpm-usage/#attesting-device-mode + - > [Attesting TPM-Protected Keys](https://www.chromium.org/developers/design-documents/tpm-usage/#attesting-tpm-protected-keys) +If an RSA private key has been generated in the TPM and has always been non-migratable, then the key may be certified by a key that has been verified as an Attestation Identity Key (AIK). No key, including any AIK, is certified unless the user or device-owner has consented to remote attestation of his or her device. A certified key credential gives very strong assurance that the key is protected by a Chrome Device TPM. + > + > [Attesting Device Mode](https://www.chromium.org/developers/design-documents/tpm-usage/#attesting-device-mode) +At boot time, the read-only firmware extends TPM PCR0 with the status of the developer and recovery mode switches. The value of PCR0 can later be quoted using a key that has been verified as an Attestation Identity Key (AIK). The quote, in combination with the AIK credential, gives assurance that the reported PCR0 value is accurate. While assurance of the PCR0 value is very strong, assurance that this correctly reflects the device mode is weaker because of the reliance on read-only firmware to extend PCR0. It is nonetheless useful for reporting policy compliance. This PCR0 quote is not available outside of Chrome OS unless the user or device-owner has consented to remote attestation of the device.
+ - SGX + - TDX (least widely deployed) +- https://android.googlesource.com/platform/external/avb/+/master/README.md +- https://developer.android.com/training/articles/security-key-attestation +- https://ci.spdk.io/ + - ActivityPub integration + - https://spdkci.intel.com/job/autotest-spdk-v23.01-LTS-vs-dpdk-main/152/ +- https://berkeley-deep-learning.github.io/cs294-131-s19/ +- https://docs.github.com/en/repositories/configuring-branches-and-merges-in-your-repository/defining-the-mergeability-of-pull-requests/managing-a-branch-protection-rule +- https://github.com/opencontainers/image-spec/blob/main/manifest.md + - Image command sequence to in-toto + - Attestation as build arg + - Still eventually #1426 +- https://docs.github.com/en/actions/using-workflows/triggering-a-workflow#accessing-and-using-event-properties + - Example of bots managing pinning +- Mirror of CI/CD actions can be executed with same manifest instance pattern for increased performance + +```console +$ curl -fL https://vcs.activitypub.securitytxt.dffml.chadig.com/push/outbox/ > outbox@push@vcs.activitypub.securitytxt.dffml.chadig.com +$ jq .orderedItems[].id < outbox\@push\@vcs.activitypub.securitytxt.dffml.chadig.com | wc -l +3931 +$ jq -r '.orderedItems[] | [{(.id): (.object.content)}] | .[] | add' < outbox\@push\@vcs.activitypub.securitytxt.dffml.chadig.com | jq -R --unbuffered '. as $line | try (fromjson | .) catch $line' +$ jq -r '.orderedItems[] | [{(.id): (.object.content)}] | .[] | add' < outbox\@push\@vcs.activitypub.securitytxt.dffml.chadig.com | jq -R --unbuffered '. as $line | try (fromjson | .workflow_job) catch $line' +$ jq -r '.orderedItems[] | [{(.id): (.object.content)}] | .[] | add' < outbox\@push\@vcs.activitypub.securitytxt.dffml.chadig.com | jq -c -R --unbuffered '. 
as $line | try (fromjson | .workflow_job) catch $line' | jq -s | python3 -c "import sys, pathlib, json, yaml; print(yaml.dump(json.load(sys.stdin)))" +``` + +```yaml +- check_run_url: https://api.github.com/repos/intel/dffml/check-runs/11733499326 + completed_at: '2023-03-03T04:30:59Z' + conclusion: success + created_at: '2023-03-03T03:58:07Z' + head_branch: main + head_sha: 4241b49975cf364b540fc0ad961cde58e2c89623 + html_url: https://github.com/intel/dffml/actions/runs/4320093439/jobs/7539975999 + id: 11733499326 + labels: + - ubuntu-latest + name: test (operations/nlp, 3.7) + node_id: CR_kwDOCOlgGM8AAAACu179vg + run_attempt: 1 + run_id: 4320093439 + run_url: https://api.github.com/repos/intel/dffml/actions/runs/4320093439 + runner_group_id: 2 + runner_group_name: GitHub Actions + runner_id: 16 + runner_name: GitHub Actions 16 + started_at: '2023-03-03T04:26:41Z' + status: completed + steps: + - completed_at: '2023-03-03T04:26:42.000Z' + conclusion: success + name: Set up job + number: 1 + started_at: '2023-03-03T04:26:40.000Z' + status: completed + - completed_at: '2023-03-03T04:30:57.000Z' + conclusion: success + name: Complete job + number: 21 + started_at: '2023-03-03T04:30:57.000Z' + status: completed + url: https://api.github.com/repos/intel/dffml/actions/jobs/11733499326 + workflow_name: Tests +``` + +- https://api.github.com/users/pdxjohnny/received_events + - This looks like good rebroadcast material +- https://www.rabbitmq.com/cli.html +- We want to transform from ActivityPub incoming event (`@context|$schema` from node `inReply(d)To`) into event stream for alternate execution by worker nodes attached to context local message queue. 
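The transform described above can be sketched minimally: look up the schema declared by the parent post (`inReplyTo`) and push the reply's payload onto a per-schema work queue. All URLs and the `route_reply` helper are illustrative assumptions, not DFFML or ActivityPub library APIs.

```python
import json
import queue
from typing import Dict

def route_reply(activity: dict, parent_posts: Dict[str, str],
                queues: Dict[str, "queue.Queue"]) -> bool:
    """Push a reply's payload onto the queue keyed by its parent post's schema."""
    obj = activity.get("object", {})
    # The parent post's content is assumed to hold the schema URL
    # (the `@context|$schema` from node `inReplyTo` pattern above)
    schema = parent_posts.get(obj.get("inReplyTo", ""))
    if schema not in queues:
        return False  # no worker pool subscribed to this schema
    queues[schema].put(json.loads(obj["content"]))
    return True

# Usage with one schema and an in-memory queue (URLs are made up)
SCHEMA = "https://github.com/intel/dffml/raw/alice/schema/alice/shouldi/contribute/0.0.0.schema.json"
parent_posts = {"https://example.org/push/posts/1": SCHEMA}
queues = {SCHEMA: queue.Queue()}
activity = {
    "type": "Create",
    "object": {
        "inReplyTo": "https://example.org/push/posts/1",
        "content": json.dumps({"repo_url": "https://github.com/omnilib/aiosqlite"}),
    },
}
routed = route_reply(activity, parent_posts, queues)
job = queues[SCHEMA].get_nowait()
```

In a real deployment the in-memory queue would be replaced by RabbitMQ/Celery as linked above; the routing decision stays the same.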
+- Job URL -> hash -> mapping of lookup results from job URL as content address which resolves to results in oras.land + - Just add the job URL hash as a tag and resolve via pulling that tag from the registry +- https://docs.celeryq.dev/en/stable/getting-started/backends-and-brokers/rabbitmq.html +- https://docs.celeryq.dev/en/stable/tutorials/task-cookbook.html +- https://docs.celeryq.dev/en/stable/django/first-steps-with-django.html#using-celery-with-django +- We can enable ActivityPub as a database for celery and then we have parity between GitHub Actions as execution environment for ideation and prototyping compute. Then we have standard protocol and library to manage task queue execution based on inputs as schema/context inReplyTo events. + - We can then run fully decoupled +- https://gvisor.dev/docs/tutorials/knative/ + - Wait we're supposed to be doing KCP almost forgot +- Run some live ones in https://github.com/cloudfoundry/korifi via `dffml-service-http` + - Demo similar job URL hash as registry tag based addressing of results within registry + - Enable sending of ActivityPub events directly (later) or indirectly via proxy nodes (first, activitypub starter kit).
+- https://ci.spdk.io/results/autotest-nightly/builds/1935/archive/crypto-autotest/build.log + +```yaml +- completed_at: '2023-03-03T04:30:59Z' + conclusion: success + created_at: '2023-03-03T03:58:07Z' + head_sha: 4241b49975cf364b540fc0ad961cde58e2c89623 + html_url: https://ci.spdk.io.deployed.at.example.com/public_build/autotest-spdk-master-vs-dpdk-main_1754.html + id: 1754 + labels: + - list + - of + - overlays + - on + - dffml.overlays.alice.shouldi.contribute + name: alice.shouldi.contribute + status: completed + steps: + - completed_at: '2023-03-03T04:26:42.000Z' + conclusion: success + name: Run scan + number: 1 + started_at: '2023-03-03T04:26:40.000Z' + status: completed + url: https://vcs.activitypub.securitytxt.dffml.chadig.com/push/posts/40aeeda3-6042-42ed-8e32-99eff9bd8ef4 + workflow_name: Alice Should I Contribute? +``` + +![knowledge-graphs-for-the-knowledge-god](https://user-images.githubusercontent.com/5950433/222981558-0b50593a-c83f-4c6c-9aff-1b553403eac7.png) + +- So no matter where you're executing, all the reporting and eventing is the same, because we are loosely coupled. + - We can do `fromjson` in jq or we can do more advanced xargs chaining on the websocket for ad-hoc dev work + - We can shoot events from the activitypub inbox receiver to a message queue for integration with existing celery + - This way we sidestep all rate limiting except for when we have to perform write events to GitHub + - Otherwise we always read GitHub data from cypher queries over the rebroadcast data + - We can also have listeners which rebroadcast the resolved contents of content address style broadcast data (the top level, so if this sees a container image uri broadcast, it would be pulling it down and maybe rebroadcasting the `results.yaml` or whatever transform is needed to rebroadcast that data.
+ - This is our onramp into the linked data space, eventually KERI for backing comms security +- https://linkeddatafragments.org/ +- http://query.linkeddatafragments.org/#query=&resultsToTree=false&queryFormat=graphql +- https://gist.github.com/rubensworks/9d6eccce996317677d71944ed1087ea6 +- https://github.com/comunica/jQuery-Widget.js/blob/master/config/config-default.json +- We need to turn the stream into something we can query using cypher or graphql-ld +- https://swordapp.github.io/swordv3/swordv3.html +- https://oras.land/blog/gatekeeper-policies-as-oci-image/ +- https://github.com/project-zot/zot +- Okay if we can make the KERI SCITT instance use the OCI upload/download spec and then align the telemetry and registry federation protocols + - Look into existing registry federation protocol if exists +- https://s3hh.wordpress.com/2022/10/27/oci-based-linux/ + - Similar goals to OS DecentrAlice +- https://github.com/project-machine/mos/releases/tag/0.0.7 +- https://github.com/opencontainers/distribution-spec/blob/main/spec.md#endpoints +- https://github.com/opencontainers/distribution-spec/issues/388 + - Have we thought about federation protocols / APIs? To enable registries to propagate uploaded content within a network of registries? Looking to come up to speed on any existing discussion if that's been touched on. Thank you! + - References + - https://github.com/opencontainers/distribution-spec/blob/main/spec.md#endpoints + - Looked here for relevant paths here but not seeing anything that looks like it's for notifications / inbox style eventing + - https://github.com/sapcc/keppel + - https://github.com/ietf-scitt/use-cases/issues/14 + - Hoping we can align to similar federation protocols across transparency services and container registries so event stream consumers can work with the same protocol for each (ActivityStreams/Pub?) 
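The "job URL hash as a tag" addressing mentioned above can be sketched as follows. The registry and repository names are made up, and truncating the digest to 32 hex characters is an assumption to stay well under the OCI tag length limit; this is not an API of any particular registry.

```python
import hashlib

def job_url_to_tag(job_url: str,
                   repository: str = "registry.example.org/results") -> str:
    """Derive a deterministic OCI image reference from a CI job URL.

    OCI tags must match [a-zA-Z0-9_][a-zA-Z0-9._-]{0,127}, so a truncated
    hex digest of the job URL is a safe, content-addressable tag.
    """
    digest = hashlib.sha256(job_url.encode()).hexdigest()[:32]
    return f"{repository}:{digest}"

# Anyone who knows the job URL can recompute the tag and pull the results
ref = job_url_to_tag(
    "https://github.com/intel/dffml/actions/runs/4320093439/jobs/7539975999")
```

Because the mapping is deterministic, no side-channel lookup table is needed: the job URL itself is the content address.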
+- https://conformance.opencontainers.org/ +- https://vsoch.github.io/django-oci/docs/getting-started/auth +- https://vsoch.github.io/django-oci/docs/getting-started/testing +- https://github.com/opencontainers/distribution-spec/issues/110#issuecomment-708691114 +- https://github.com/sapcc/keppel +- https://github.com/sapcc/keppel/blob/master/docs/api-spec.md#post-keppelv1authpeering + - Looks like they have their own spec for federation, maybe we can implement with ActivityPub? + - Maybe we can leverage the existing APIs similar to the /admin endpoint and just add in the activitypub endpoints for activitystreams / linked data notifications +- https://github.com/sapcc/keppel/blob/master/docs/example-policy.yaml +- We can take one manifest and make it into another one for execution via a different mechanism + - Similar to the CLI overlays + - https://github.com/intel/dffml/blob/c82f7ddd29a00d24217c50370907c281c4b5b54d/entities/alice/alice/please/contribute/recommended_community_standards/cli.py#L60-L72 + - This is also similar to how we can decouple TODO logging from content for `alice please log todos` + - Operation to generate TODO body + - Operation for logging the TODO (write to GitHub) + - Similar to a mutation of the propagated event into something context local relevant + - Yes this vuln affects due to instance policy relevant threat model overlays or not +- https://github.com/opencontainers/image-spec/blob/main/artifact.md +- Manifest for CLI command + +**schema/alice/shouldi/contribute/github-com-omnilib-aiosqlite.json** + +```json +{ + "@context": "https://github.com/intel/dffml/raw/alice/schema/schema/alice/shouldi/contribute/0.0.0.schema.json", + "repo_url": "https://github.com/omnilib/aiosqlite" +} +``` + +- As container build + +**schema/image/container/build/alice-shouldi-contribute-results-github-com-omnilib-aiosqlite.json** + +```json +{ + "@context": 
"https://github.com/intel/dffml/raw/alice/schema/github/actions/build/images/containers/0.0.0.schema.json", + "include": [ + { + "branch": "alice", + "build_args": "[[\"REPO_URL\", \"https://github.com/omnilib/aiosqlite\"]]", + "commit": "ca92bfae5092bce908b70f6b5e0afbe242ce7a5b", + "dockerfile": "entities/alice/scripts/alice-shouldi-contribute-results.Dockerfile", + "image_name": "alice-shouldi-contribute-results-github-com-omnilib-aiosqlite", + "owner": "intel", + "repository": "dffml" + } + ] +} +``` + +- https://codeberg.org/fediverse/fep +- Open Source scanning flow + - Upload manifest to registry + - Federation event (send to follower /inbox) + - content: `https://github.com/opencontainers/image-spec/raw/v1.0.1/schema/image-manifest-schema.json` + inReplyTo: activitypub extensions for security.txt post URL for content `activitypubsecuritytxt` + - content: container image uri uploaded + inReplyTo: activitypub extensions for security.txt post URL for content `https://github.com/opencontainers/image-spec/raw/v1.0.1/schema/image-manifest-schema.json` + - Downstream listener (aka delve into [config dict](https://intel.github.io/dffml/main/contributing/codebase.html?highlight=config+dict#config)) + - Federation event (send to follower /inbox) + - content: `https://github.com/intel/dffml/raw/alice/schema/github/actions/build/images/containers/0.0.0.schema.json` + inReplyTo: activitypub extensions for security.txt post URL for content `activitypubsecuritytxt` + - content: `` + inReplyTo: activitypub extensions for security.txt post URL for content `https://github.com/intel/dffml/raw/alice/schema/github/actions/build/images/containers/0.0.0.schema.json` + - Downstream listener + - Republish watched `inReplyTo` schema into job/message queue + - RabbitMQ + - Message queue delivers to worker nodes + - Kaniko job waiting for celery queue for image to build + - Exit after rebuild and have orchestration manage respawn + - https://github.com/cloudfoundry/korifi +- 
https://github.com/opencontainers/distribution-spec/blob/main/extensions/_oci.md + - Could discover federation opportunities via this, or via a valid Actor URL in a security.txt/md file as well +- https://github.com/google/go-containerregistry/tree/d7f8d06c87ed209507dd5f2d723267fe35b38a9f/pkg/v1/remote#structure + - > ![](https://github.com/google/go-containerregistry/raw/d7f8d06c87ed209507dd5f2d723267fe35b38a9f/images/remote.dot.svg) +- https://github.com/opencontainers/image-spec/blob/v1.0.1/manifest.md + - > The third goal is to be [translatable](https://github.com/opencontainers/image-spec/blob/v1.0.1/conversion.md) to the [OCI Runtime Specification](https://github.com/opencontainers/runtime-spec). + - Does this mean we can send to https://aurae.io/quickstart/ ? + - https://github.com/opencontainers/image-spec/blob/v1.0.1/schema/image-manifest-schema.json + - https://opencontainers.org/schema/image/manifest + - https://github.com/aurae-runtime/aurae/blob/3bb6d4c391ec6945436f941299a46c9a83168729/examples/pods-cri-nginx.ts#L57 + - https://github.com/aurae-runtime/aurae/blob/42972181b624a76b6888d1b0079a7f21c34bfb31/api/cri/v1/release-1.26.proto#L1086-L1096 + - https://github.com/aurae-runtime/aurae/commit/47dabf1414678626bd8a432fdf20fdbc6bdf49dc +- https://github.com/intel/dffml/blob/80e773712897a2fa2fb93e6abd4f852302adb79f/docs/tutorials/rolling_alice/0001_coach_alice/0001_down_the_dependency_rabbit_hole_again.md#checklist +- https://github.com/cloudfoundry/korifi/blob/63fece8d987b09744ea435bccf9af08813bc0611/HACKING.md#deploying-locally +- https://carvel.dev/blog/getting-started-with-ytt/ +- Need helm, kubectl, etc. 
+- https://kubernetes.io/docs/tasks/tools/install-kubectl-linux/ +- https://github.com/cloudfoundry/cli/wiki/V8-CLI-Installation-Guide#installers-and-compressed-binaries + +```console +$ git clone --depth=1 https://github.com/cloudfoundry/korifi +$ cd korifi/ +$ echo We should start mentioning which commit we pulled by checking it out after chdir +$ git checkout 63fece8d987b09744ea435bccf9af08813bc0611 +$ curl -L https://carvel.dev/install.sh | K14SIO_INSTALL_BIN_DIR=$HOME/.local/bin bash +$ curl https://raw.githubusercontent.com/helm/helm/main/scripts/get-helm-3 | bash +$ curl -LO "https://dl.k8s.io/release/$(curl -L -s https://dl.k8s.io/release/stable.txt)/bin/linux/amd64/kubectl" +$ chmod +x kubectl +$ mkdir -p ~/.local/bin +$ mv ./kubectl ~/.local/bin/kubectl +$ kind delete cluster --name korifi-alice-shouldi-contribute || true +Deleting cluster "korifi-alice-shouldi-contribute" ... +$ ./scripts/deploy-on-kind.sh korifi-alice-shouldi-contribute --debug --verbose +$ (cd $(mktemp -d); curl -sfL "https://packages.cloudfoundry.org/stable?release=linux64-binary&version=v8&source=github" | tar zxv && chmod 755 cf{,8} && mv cf{,8} ~/.local/bin/) +``` + +- activitypub groups spec/fep + - https://codeberg.org/fediverse/fep/src/branch/main/feps/fep-1b12.md +- https://socialhub.activitypub.rocks/t/fep-5624-per-object-reply-control-policies/2723/34 + - > i think the current state of talks is to have an Accept activity for each activity, and this gets used as the replyApproval for the third-party observer to verify, but beyond that, there is no specified mechanism for how replies get approved logically. it may be manual, it may be automatic based on some criteria (or not). you could totally have an application feature where replies from certain people get automatically approved, and from anyone else it goes to a sort of “reply request” UI similar to follow requests. you could add or remove people to the “auto-approve” list as you pleased. 
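The reply-approval idea quoted above can be sketched in a few lines. This is a hypothetical illustration, not any server's real API: `review_reply`, the activity shapes, and the allowlist are all made up for the example; only the Create/Accept activity types come from the quoted FEP discussion.

```python
# Hypothetical sketch of FEP-5624 style reply approval: auto-Accept replies
# from allowlisted actors, leave the rest for a manual "reply request" UI.
# Function name and activity shapes are illustrative, not a real server API.
def review_reply(activity, auto_approve):
    """Return an Accept activity for allowlisted actors, else None."""
    if activity.get("type") != "Create":
        return None  # only replies (a Create wrapping a Note) get reviewed here
    if activity.get("actor") in auto_approve:
        # The Accept is what third-party observers would verify as the replyApproval
        return {"type": "Accept", "object": activity["object"]["id"]}
    return None  # falls through to manual review, similar to follow requests
```

Adding or removing actors from `auto_approve` is the "auto-approve list" the quote describes.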
+- https://codeberg.org/fediverse/fep/src/branch/main/feps/fep-cb76.md +- https://codeberg.org/fediverse/fep/src/branch/main/feps/fep-2e40.md#example-create-fep-term-eventsource + - Event source itself is similar to discovery of the /admin/websocket endpoint +- FEP-400e: Publicly-appendable ActivityPub collections +- https://forgefed.org/ +- https://codeberg.org/fediverse/delightful-activitypub-development#user-content-forge-federation +- https://f3.forgefriends.org/structure.html +- https://codeberg.org/fediverse/delightful-activitypub-development#bridges +- https://forgejo.org/2023-02-27-forgejo-actions/ +- https://codeberg.org/forgejo/runner +- https://forum.forgefriends.org/t/about-the-friendly-forge-format-f3/681 + - > ForgeFed is an [ActivityPub](https://www.w3.org/TR/activitypub/) extension. ActivityPub is an actor-model based protocol for federation of web services and applications. +- https://codeberg.org/forgejo/forgejo/issues/59 + - [FEAT] implement federation #59 +- These folks know what's up +- https://git.exozy.me/a?tab=activity + - https://git.exozy.me/a/website/src/commit/4672ed271dead5fdf8be7efc05e964c70924d7e9/content/posts/abusing-systemd-nspawn-with-nested-containers.md +- https://codeberg.org/earl-warren?tab=activity +- https://codeberg.org/dachary?tab=activity +- https://codeberg.org/forgejo/forgejo/issues/363 + - Where is the best place to discuss federation of CI? Maybe in the spec repo? Shall I just throw up a pull request on that GitLab with the schema? We're interested in folks rebroadcasting their GitHub webhooks, etc. into the ActivityPub space so as to enable live at HEAD in poly repo envs (to help secure rolling releases). 
+ - Related: https://github.com/ietf-scitt/use-cases/issues/14 + - Related: https://github.com/intel/dffml/discussions/1406?sort=new#discussioncomment-5178869 +- https://codeberg.org/forgejo-contrib/forgejo-helm/issues/89#issue-257034 +- https://codeberg.org/forgejo/runner/issues/4#issue-255815 +- https://repos.goffi.org/libervia-backend/file/tip/CONTRAT_SOCIAL_en +- https://dream.public.cat/pub/dream-data-spec +- TODO + - [ ] poly repo live at HEAD + - [ ] Pin main branch issue ops via pull request after release / auto branch is cut and container image sha is known. + - [ ] Auto merge + - [ ] (Skip this and just commit and push to start) + - [ ] https://github.com/jef/conventional-commits-release-action + - [ ] Example of `alice threats listen activitypub -stdin` + - Base flow just helps us take file representations of + - [x] Respond to Carina + - [ ] https://github.com/intel/dffml/blob/80e773712897a2fa2fb93e6abd4f852302adb79f/docs/tutorials/rolling_alice/0001_coach_alice/0001_down_the_dependency_rabbit_hole_again.md#checklist + - Still a good checklist + - [ ] https://socialhub.activitypub.rocks/t/fep-5624-per-object-reply-control-policies/2723 + - Bingo! \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0194/reply_0001.md b/docs/discussions/alice_engineering_comms/0194/reply_0001.md new file mode 100644 index 0000000000..b4f8d925b1 --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0194/reply_0001.md @@ -0,0 +1,54 @@ +## 2023-03-02 SBOM, VEX, VDR, Threat Modeling, Open Architecture + +- 1:1 Anthony/John +- https://github.com/anthonyharrison/sbom-manager + - Related: https://github.com/intel/dffml/issues/596 +- Alma and debian + - Firefox delta + - #789 +- Manager to search across modules +- CSAIF directory so others can get them + - Data consistency in the wild is suboptimal + - Version numbers? Why can't we use PURL, etc. 
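On the PURL point above: one purl string carries ecosystem, name, and version, which is exactly the consistency that is missing in the wild. A minimal hand-rolled parse for only the simple `pkg:type/namespace/name@version` form; real code should use the packageurl library and the full spec (qualifiers, subpath, percent-encoding):

```python
# Minimal purl parsing sketch, simple pkg:type/namespace/name@version form only.
def parse_purl(purl):
    """Split a simple Package URL into its components."""
    assert purl.startswith("pkg:"), "not a package URL"
    path, _, version = purl[len("pkg:"):].partition("@")
    parts = path.split("/")
    return {
        "type": parts[0],                          # ecosystem, e.g. pypi
        "namespace": "/".join(parts[1:-1]) or None,  # e.g. GitHub org, may be absent
        "name": parts[-1],
        "version": version or None,
    }
```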
+- CSAIF doesn't look like it's about to align with CycloneDX in the future +- OpenVEX + - Sudden release + - Walled release process + - Doesn't have most of the information we need +- CycloneDX VEX, which cve-bin-tool supports +- SPDX is stalled for their VEX +- We need a way to say should you be worried about this vuln? + - THERE IS NO WAY TO MAKE THIS DETERMINATION WITHOUT THE DEPLOYMENT CONTEXT! + - CycloneDX might evolve OBOM for arch links + - Threat model overlays +- How could we do events of new vulns? + - [RFCv2: IETF SCITT: Use Case: OpenSSF Metrics: activitypub extensions for security.txt](https://github.com/ietf-scitt/use-cases/blob/22afd537180d6c6b2d5ec4db0096f0706cb2b6bc/openssf_metrics.md) +- People started petitioning NVD to downscore CVSS +- What are reasons for not patching? + - Threat model business objectives say the strategic plan puts it out of scope + - Downstream propagate? No! Overlay for my context says we're running this sandboxed and it's critical, propagate affected instead of not affected to this TS +- Hardware and infra is not all up to date in the wild! + - Some people MUST run Windows NT! If they don't their business stops running + - Upgrade paths a MUST, layered protection level / ring increase with higher level TCB
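The deployment-context point above could be sketched as an overlay application step. Everything here is hypothetical: the `critical`/`sandboxed` overlay fields and the function are illustrative of a threat model overlay, not any real schema; only the status strings follow the common VEX vocabulary.

```python
# Hypothetical sketch: the VEX status that should propagate depends on the
# deployment context, so re-evaluate the upstream status against a local
# threat model overlay before forwarding it.
def propagated_status(upstream_status, overlay):
    """Return the context-local VEX status given a deployment overlay."""
    if overlay.get("critical"):
        # Our context marks this component critical, so even an upstream
        # not_affected propagates as affected until verified locally.
        return "affected"
    # No overriding context: pass the upstream assessment through unchanged.
    return upstream_status
```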
+- Anthony has also worked on Open Architecture (different thing, same name) +- If you submit a vuln, we should assume the party reported to will do nothing + - Unless there are reviews saying they respond to vulns, but we should wait our default period before reporting anyway +- How can we leverage federated machine learning to do audit and analysis? +- Focus on getting the ecosystem working with some major projects, k8s, some OSes, you'd want a stack +- You'll never have no vulns + - If you have no vulns you're a dead project + - Learning is growing + - Are the vulns getting better or worse? + - This is what we care about, acceleration + - As long as versioned learning says you're within bounds, keep going! +- Want to enable users to make trust determinations +- Trust levels (protection rings) +- GitHub sounds open to partnership to make things better if we think of anything to talk to them about \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0195/index.md b/docs/discussions/alice_engineering_comms/0195/index.md new file mode 100644 index 0000000000..2eb62250d3 --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0195/index.md @@ -0,0 +1 @@ +# 2023-03-03 Engineering Logs \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0195/reply_0000.md b/docs/discussions/alice_engineering_comms/0195/reply_0000.md new file mode 100644 index 0000000000..3d8c606976 --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0195/reply_0000.md @@ -0,0 +1,139 @@ +## 2023-03-03 @pdxjohnny Engineering Logs + +- Do another manifest conversion into a SLURM job (as schema) + - Thank you Dave Florey! 
🥳🥳🥳 + - https://slurm.schedmd.com/quickstart.html +- We want SPIFFE interop +- https://aurae.io/#expanded-overview + - > Authentication: Aurae extends [SPIFFE](https://github.com/spiffe)/[SPIRE](https://github.com/spiffe/spire) (x509 mTLS)-backed identity, authentication (authn), and authorization (authz) in a distributed system down to the Unix domain socket layer. + - Forge local fulcio +- https://aurae.io/blog/2022-10-24-aurae-cells/#aurae-spawn + - > Aurae Spawn: The name Spawn() is taken from the Rust std::process crate and resembles a pattern what most Linux users will know as unshare(2) or namespace delegation. Basically a spawned instance of Aurae will inherit certain properties from the parent, and will come with a few basic guarantees with regard to security and connectivity. + Aurae is designed to be recursive, which enables nested isolation zones and gives the project the basic building blocks it needs to hold an opinion on how users should run workloads. Spawned Aurae instances will receive a bridged TAP network device which a nested auraed daemon will listen on by default. This allows a parent Aurae instance running with an independent kernel to communicate directly with a child instance over the same mTLS authenticated gRPC API the rest of the project leverages. `rpc Spawn(Instance) returns (InstanceStatus) {}` Aurae will manage creating an ephemeral [SPIFFE](https://github.com/spiffe/spiffe) service identity for each spawned instance and will delegate down kernel images, initramfs, and even the auraed daemon itself. +- https://blog.envoyproxy.io/securing-the-service-mesh-with-spire-0-3-abb45cd79810 +- https://github.com/aurae-runtime/auraed/tree/bff23e58fcea2ab877b391adee39bfa8fd14fd4e/stdlib/v0 + - Best practice! Versioning within a lib. 
+ - https://github.com/future-highway + - We're on our way to helping Alice onramp data from offline caches back into the data super highway of the future + - Ref: data super highway of the future, early engineering logs +- https://slurm.schedmd.com/mpi_guide.html#intel_mpi + - This looks like devcloud + - #1247 + - We'll run this within the cf jobs +- Run everything out of GitHub, but also with the ability to run it grafted. All at the same time, just by rebroadcasting. + - GitHub is test env + - Mirror execution env is prod + - Loosely coupled means we are doing the same thing as versioned learning. + - On propagation, does it fit within the allowlist of the SCITT instance's squishy version range (set)? + - Do you want to run the query + policy evaluation? + - You can look at the dataflow before you run it. And overlay your policy to evaluate propagation as a gatekeeper or itself overlay policy onto the dataflow for context-aware tailoring before execution. + - You can say, I'll execute many manifests that unpack into SLURM manifests + - You'd do this by having a downstream listener which executes the shim to transform into the SLURM version of `qsub` + - This is our `alice threats listen activitypub -korifi` +- https://github.com/transmute-industries/jsonld-github-action + - For reverse of shim +- ActivityPub extensions for security.txt + - Can you put things in `@context`? Yes. Unsure if other servers will propagate events. + - Is this piggybacking-within-the-content approach interoperable today? Yes. +- Somewhere, something happened + - Bob tells Alice what happened + - Alice decides, do I care about what happened? 
(the federated event) + - It's the triage process + - https://github.com/intel/cve-bin-tool/issues/2639 + - Take upstream policy (attached to incoming via `inReplyTo` and/or `replies`, you'd have to decide if you want to dereference these, perhaps based on reputation of the propagator to reduce attack impact) +- A container image was created (`FROM` rebuild chain) + - Bob's forge tells Alice's forge, here's the content address URI for the manifest just pushed + - Alice looks at the manifest, runs through all the packages she's maintaining in her forge + - She applies the threat model of each as an overlay when determining if she wants to propagate into her internal environment + - If any of these + - Alice's downstream listener executes a system context to system context translation (grep: equilibrium, context-to-context) + - She executes the shim + - #1273 + - It parses the content in alignment with the schema + - The shim already supports validation so we could actually just serialize the would-be HTTP requests to files (same as staged when offline) + - https://github.com/intel/dffml/pull/1273/files#r794027710 + - Could add activity style using this operation (function) as upstream, just copy paste and push to shim + - https://github.com/intel/dffml/blob/e1914f794c7ccc3a7483fa490cfbe5170bf65972/dffml/util/testing/manifest/shim.py#L744-L757 + - https://github.com/tern-tools/tern#report-cyclonedxjson + - Upload resulting SBOM to registry `FROM scratch` style or via + - https://github.com/opencontainers/image-spec/blob/819aa940cae7c067a8bf89b1745d3255ddaaba1d/artifact.md + - https://github.com/opencontainers/image-spec/blob/819aa940cae7c067a8bf89b1745d3255ddaaba1d/descriptor.md#examples +- An SBOM was published + - Bob's forge uploads an SBOM to the registry + - Alice's forge decides if she wants to propagate it (prioritizer, gatekeeper, umbrella) + - Alice looks at the manifest, runs through all the packages she's maintaining in her forge + - She applies the threat model 
of each as an overlay when determining if she wants to propagate into her internal environment + - If any of these use similar components as were mentioned in this SBOM, propagate + - Alice's listener receives the new SBOM event + - She uploads a manifest instance of a SLURM submit job spec to her registry + - https://slurm.schedmd.com/rest_api.html#slurmV0038SubmitJob +- A manifest instance of a SLURM submit job was published to Alice's registry + - Bob's forge uploads an SBOM to the registry + - Alice's forge decides if she wants to propagate it (prioritizer, gatekeeper, umbrella) + - Alice looks at the manifest, runs through all the packages she's maintaining in her forge + - She applies the threat model of each as an overlay when determining if she wants to propagate into her internal environment + - If any of these use similar components as were mentioned in this SBOM, propagate + - Alice's listener within korifi receives the new IPMV///SLURM submit job event + - She downloads the job contents from the manifest + - `FROM scratch`, `results.yaml` extraction style tar pipe + - She executes the shim + - The next phase parser runs kaniko + - `grep ' Push' | awk '{print $NF}' | sed -e 's/.*@sha/sha/' -e 's/.*://g' | sed -e 'N;s/\n/=/'` +- #1399! + - Where is Here? + - Now! 
+ - :) + +```console +$ gh pr -R https://github.com/intel/dffml merge --rebase --auto 1406 +``` + +- https://github.com/ietf-scitt/cose-merkle-tree-proofs/pull/12 +- https://github.com/securefederatedai/openfl/blob/develop/docs/running_the_federation.rst +- https://github.com/securefederatedai/openfl/blob/develop/docs/running_the_federation.rst#aggregator-based-workflow +- https://openfl.readthedocs.io/en/latest/running_the_federation.html#federation-api +- https://github.com/securefederatedai/openfl/blob/develop/tests/openfl/transport/grpc/test_director_server.py +- https://github.com/securefederatedai/openfl/blob/58efdcc57f477f031a58ab8995fade57ca02643f/tests/openfl/transport/grpc/test_director_server.py +- https://openfl.readthedocs.io/en/latest/install.html#productname-with-docker +- https://openfl.readthedocs.io/en/latest/workflow_interface.html +- https://openfl.readthedocs.io/en/latest/source/openfl/communication.html +- https://github.com/jenkinsci/opentelemetry-plugin#using-the-opentelemetry-otlphttp-rather-than-otlpgrpc-protocol +- https://github.com/jenkinsci/opentelemetry-plugin/blob/9061f4a915e5b8bf65ffe10393c55530b41162ab/src/main/kibana/jenkins-kibana-dashboards.ndjson +- https://github.com/jenkinsci/opentelemetry-plugin/blob/9061f4a915e5b8bf65ffe10393c55530b41162ab/src/main/java/io/jenkins/plugins/opentelemetry/opentelemetry/common/OffsetClock.java#L36 + - grep clock skew +- https://codeberg.org/Codeberg/forgejo + - Codeberg has a fork, shows engagement from community +- https://codeberg.org/Codeberg/avatars + - For Alice/entity instances +- https://inqlab.net/git/ocaml-xmppl.git/ +- https://inqlab.net/git/guile-datalog.git/ +- https://github.com/lindig/polly + - OCaml bindings for Linux epoll(2) +- ActivityPub maintainer Christine Lemmer-Webber talked about this on mastodon + - https://spritely.institute/goblins/ + - 
https://spritely.institute/files/docs/guile-goblins/0.10/OCapN-The-Object-Capabilities-Network.html#OCapN-The-Object-Capabilities-Network + - https://docs.racket-lang.org/goblins/captp.html#%28part._.Cap.T.P_usage_example%29 + - **ALIGNED** + - https://octodon.social/@quinn/109955448257454151 + - https://docs.racket-lang.org/goblins/captp.html#%28part._.Fake_.Intarwebs%29 + - 🛤️🛤️🛤️🛤️🛤️🛤️🛤️ + - https://pkgs.racket-lang.org/package/goblins + - https://pkg-build.racket-lang.org/server/built/install/goblins.txt +- https://github.com/aurae-runtime/aurae/pull/437 + - Wardley map: Future libvirt ^ + - Best practice: rust: Vendoring in `crates/` + - Alice could help facilitate tracking upstream for `overlays/` where overlays are distro package style patchsets / dataflow / manifest as patchset + - This is that evolution of QEMU we've wanted!!!!!!!!!!!! + - https://github.com/aurae-runtime/aurae/pull/437/commits/ce682c5936c1e0df5863b07734f6ffbe9c5c6fd3#diff-a9b9110f95a34509551c21058f6a1a2d3aa928a9fd11bd248d0bdbb47c03ee75 + - Now to hook the reverse fuzzer (codegen) / fuzzer loop up +- https://github.com/containers/youki +- https://gzigzag.sourceforge.net/nutshell.html +- https://github.com/krisnova/home +- https://github.com/WebOfTrust/keripy/blob/development/src/keri/demo/demo.md +- TODO + - [ ] Play with ActivityPub tags seen yesterday for potential as flat file serializable with eventing on rejoin #1400 + - [ ] A VEX was published... 
+ - See recent meetings with Anthony involved + - [x] Reach out to intel/openfl maintainer about federation protocol + - Patrick Foley + - [x] https://github.com/intel/open-ecosystem-ref-code \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0196/index.md b/docs/discussions/alice_engineering_comms/0196/index.md new file mode 100644 index 0000000000..33d629d028 --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0196/index.md @@ -0,0 +1 @@ +# 2023-03-04 Engineering Logs \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0197/index.md b/docs/discussions/alice_engineering_comms/0197/index.md new file mode 100644 index 0000000000..8f6c344031 --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0197/index.md @@ -0,0 +1 @@ +# 2023-03-05 Engineering Logs \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0197/reply_0000.md b/docs/discussions/alice_engineering_comms/0197/reply_0000.md new file mode 100644 index 0000000000..879d61face --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0197/reply_0000.md @@ -0,0 +1,116 @@ +## 2023-03-05 @pdxjohnny Engineering Logs + +- Vol 6: Time Travel with Us: Plotting Jumps + - https://www.theguardian.com/science/2020/jan/11/how-astrology-paved-way-predictive-analytics + - TODO grep here for source of following video of the graphs ends finding each other. 
Validate Wardley paths + - https://user-images.githubusercontent.com/5950433/222974908-2f6f1a39-e868-45f3-8460-db13d22bb7d0.mp4 +- https://github.com/intel/dffml/issues/1287#issuecomment-1455147140 +- [Collaboration Hub: A place for starting SC projects SocialCoding/sc-guild#2](https://codeberg.org/SocialCoding/sc-guild/issues/2) + - `ALIGNMENT.md` + - A guild as a type of ad-hoc organization which might be relevant during overlay application + - Guilds, much like working groups, might have documentation they all agree on about what alignment to their values means + - We can then have Alice help guild members ensure their contributions stay aligned +- https://delightful.club/delightful-linked-data/#fediverse-specifications +- https://www.w3.org/TR/activitystreams-core/#extension-round-trip +- https://github.com/go-gitea/gitea/issues/18240 + - https://gitea.com/xy/gitea/issues/3 +- grep Online Cloning Cuts Our Iteration Time + - Federated repos are Online Cloning + - As we add federated CI/CD we'll enable online overlay application + - This lays the foundations for the automation of the vendoring process and tracking of upstream +- https://textbook.sfsd.io/overview + - `ALIGNMENT.md` + - https://youtu.be/hZpKdfbrd6o?t=601 + - Haven't listened to this, just skimmed the transcript, but it looks like it touches on accelerating timelines +- https://community.humanetech.com/t/be-a-builder-help-improve-wellbeing-freedom-and-society/3322 +- Updated thread title from [Alice Engineering Comms](https://github.com/intel/dffml/discussions/1406?sort=new#top) to [Alice Engineering Comms 🪬](https://github.com/intel/dffml/discussions/1406?sort=new#top) +- https://codeberg.org/forgejo-contrib/discussions/issues/12 + +![thus-begins-the-software-trade-federation](https://user-images.githubusercontent.com/5950433/222979438-19d7ef05-afc2-43f8-a7f5-6bc2240c5f11.png) + +- Align DFFML CI on everything as manifest-based container builds + 
+![oci-all-the-things](https://user-images.githubusercontent.com/5950433/222979759-0dd374b2-ee5f-4cbc-92d1-5cb8de078ee8.png) + +- https://codeberg.org/forgejo/forgejo/src/branch/forgejo/CONTRIBUTING/WORKFLOW.md#federation-https-codeberg-org-forgejo-forgejo-issues-labels-79349 +- https://codeberg.org/forgejo/forgejo + - > ActivityPub-based forge federation protocol https://forgefed.org/ +[woodpecker-ci](https://codeberg.org/explore/repos?q=woodpecker-ci&topic=1) +[activitypub](https://codeberg.org/explore/repos?q=activitypub&topic=1) +[federation](https://codeberg.org/explore/repos?q=federation&topic=1) +[specification](https://codeberg.org/explore/repos?q=specification&topic=1) +[specs](https://codeberg.org/explore/repos?q=specs&topic=1) +[forgefed](https://codeberg.org/explore/repos?q=forgefed&topic=1) +- https://woodpecker-ci.org/docs/intro + - > Woodpecker is a simple CI engine with great extensibility. It runs your pipelines inside [Docker](https://www.docker.com/) containers, so if you are already using them in your daily workflow, you'll love Woodpecker for sure. +- https://woodpecker-ci.org/docs/development/architecture +- https://github.com/woodpecker-ci/woodpecker + - YAY!!!! DRONE IS BACK!!!!!! 
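The "everything as manifest-based container builds" alignment above implies an expansion step from manifest instance to build invocations. A sketch against the build-images-containers manifest instance earlier in these logs; the field names match that example, but the docker invocation itself is illustrative (a kaniko executor call would be shaped similarly):

```python
import json

# Sketch: expand each include entry of a build-images-containers manifest
# instance into one container build command.
def manifest_to_builds(manifest):
    cmds = []
    for inc in manifest["include"]:
        # build_args is a JSON-encoded list of [key, value] pairs in the example
        args = json.loads(inc["build_args"])
        flags = " ".join(f"--build-arg {k}={v}" for k, v in args)
        cmds.append(
            f"docker build -f {inc['dockerfile']} -t {inc['image_name']} {flags} ."
        )
    return cmds
```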
+- https://github.com/woodpecker-ci/woodpecker/pull/1543 + +![chaos-for-the-chaos-god](https://user-images.githubusercontent.com/5950433/220794351-4611804a-ac72-47aa-8954-cdb3c10d6a5b.jpg) + +- Now we need to find where the woodpecker telemetry is, and figure out what needs to be aligned across that and the ForgeFed `context.jsonld` + - https://codeberg.org/ForgeFed/ForgeFed/src/commit/467dfe84670750a61992c5c1da3841e9453c1d36/rdf/context.jsonld +- https://github.com/woodpecker-ci/woodpecker/search?q=telemetry&type=issues + - https://github.com/woodpecker-ci/woodpecker/issues/198 + - https://github.com/woodpecker-ci/woodpecker/issues/751 +- Open Telemetry -> federated event space + +![knowledge-graphs-for-the-knowledge-god](https://user-images.githubusercontent.com/5950433/222981558-0b50593a-c83f-4c6c-9aff-1b553403eac7.png) + +- https://opentelemetry.io/ecosystem/registry/?s=activitypub&component=&language= + - No items found +- Then we onramp into the federated ML space. Then we align training with guilds. Then we work to organize work item prioritization across entities to maximize rate of learning. Once we max out that rate of learning given all the entities working on trains of thought, that's when we've hit critical velocity. +- https://codeberg.org/forgejo/forgejo/pulls/485 + - Chaos for the Chaos God again! + - > 26 minutes ago + - They have two branches we need right now and this pull request aligns them + - https://codeberg.org/forgejo/forgejo/src/commit/3caec9d9ebde243b7e4a8ee03e05b6a89aaf337e/CONTRIBUTING/WORKFLOW.md#federation-https-codeberg-org-forgejo-forgejo-issues-labels-79349 + - > [forgejo-ci](https://codeberg.org/forgejo/forgejo/src/branch/forgejo-ci) based on [main](https://codeberg.org/forgejo/forgejo/src/branch/main) Woodpecker CI configuration, including the release process. 
+ > + > [forgejo-federation](https://codeberg.org/forgejo/forgejo/src/branch/forgejo-federation) based on [forgejo-development](https://codeberg.org/forgejo/forgejo/src/branch/forgejo-development) Federation support for Forgejo +- What are the existing CI events? + - Let's see what the webhook events rebroadcast from Forgejo look like + - And what they would include if we also rebroadcast the events from the runner +- https://codeberg.org/forgejo/runner/issues/5 + - I was spinning this up this weekend as well. I see you've rebased in [forgejo/forgejo#485](https://codeberg.org/forgejo/forgejo/pulls/485) + - Related: [forgejo-contrib/discussions#12](https://codeberg.org/forgejo-contrib/discussions/issues/12) + - How can I help with this? My plans are currently to figure out what events are being sent from the runner that could be sent in the format of the other events using context.jsonld and update that file as needed if there are more data types that become relevant. Please let me know if this sounds aligned with your thoughts in this space or if you have any other thoughts on how best to proceed. +- We are currently in the example setup for beyond live at HEAD discussed with Andy recently at Drew's Linux Kernel meetup + - We have at least three branches at play just to start working on this. Trunk-based development is great but we have to facilitate the inherent lack thereof across these in-progress branches via virtual branches. 
+ - The PR which rebases `development` into `ci` + - The woodpecker PR which adds support for forgejo + - A new one we'll be actively working on, our virtual branch + - Upstream + - forgejo:federation + - Overlays + - forgejo:ci + - Overlays + - Any patches needed to rebase ci onto federation +- https://codeberg.org/pdxjohnny/runner/src/branch/federation-cd/ +- https://socialhub.activitypub.rocks/t/anybody-knows-a-fediversed-market-place-software/2995 + - #1207 + - #1061 +- Sic semper tyrannis + - https://en.wikipedia.org/wiki/March_5 +- TODO + - [ ] Alice tests for please log todos + - [ ] Split issue creation into issue body creation, create single issue + - [ ] For test, operation to check issue body, input as static render, mock issue creation call + - [ ] Mention in docs to update static form if need be, or switch to custom validation operation + - [ ] Spin Federated Forge + - [ ] Align DFFML CI on everything as container builds + - [x] Start discussion thread on federation of CI/CD events + - https://codeberg.org/forgejo-contrib/discussions/issues/12 + - [ ] Python Package to SBOM to Dataflow to woodpecker synthesis + - https://codeberg.org/ForgeFed/ForgeFed/src/commit/467dfe84670750a61992c5c1da3841e9453c1d36/.woodpecker/deploy.yml + - https://github.com/intel/dffml/issues/1421 + - [ ] Open Telemetry -> Federated Event space + - Analysis for addition to forge federation context.jsonld + - Similar to gamified threat modeling, can we `alice please contribute` via PR possible transformations from the open telemetry event space. So we essentially incrementally learn how to transform telemetry events (data flow events as telemetry events) into activitypub events. 
Now everything can talk directly to everything + - GraphQL-LD over LDF + - Cypher import of KERIVC + - [ ] Online mirror translation into git vendor with sha384 patches as overlays + - [ ] Federate events into traceability-interop space + - [ ] KERIVC for protection ring -2 transport for duplicity checking \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0198/index.md b/docs/discussions/alice_engineering_comms/0198/index.md new file mode 100644 index 0000000000..f31e6a8169 --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0198/index.md @@ -0,0 +1 @@ +# 2023-03-06 Engineering Logs \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0198/reply_0000.md b/docs/discussions/alice_engineering_comms/0198/reply_0000.md new file mode 100644 index 0000000000..258786d09c --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0198/reply_0000.md @@ -0,0 +1,63 @@ +## 2023-03-06 @pdxjohnny Engineering Logs + +- https://codeberg.org/forgejo/forgejo/src/commit/2fe3a45685545079eb4e82f1954eadf7e065333b/CONTRIBUTING/WORKFLOW.md +- https://codeberg.org/forgejo/forgejo/src/branch/forgejo/CONTRIBUTING/WORKFLOW.md#forgejo-branch +- https://github.com/goreleaser/goreleaser-action +- https://github.com/intel/project-example-for-python + - Example of online clones: pull request CI for a basic Python package build and test, submit a pull request if federated CI/CD events result in a built container for the manifest. 
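A first cut of the Python package to SBOM step in the TODO above might just walk the installed distributions. This sketch emits minimal CycloneDX-style components; the output shape is illustrative, not a conformant CycloneDX document:

```python
import importlib.metadata

# Sketch: list installed Python distributions as minimal component records
# with a simple pypi purl per component.
def components():
    return [
        {
            "type": "library",
            "name": dist.metadata["Name"],
            "version": dist.version,
            "purl": f"pkg:pypi/{dist.metadata['Name']}@{dist.version}",
        }
        for dist in importlib.metadata.distributions()
    ]
```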
+- Add ssh key to codeberg/gitea + +```console +$ echo -n 'f530738005ef4d09962beb8ad11dabe021f215cab37a3212fc81ed3513c42e99' | ssh-keygen -Y sign -n gitea -f ~/.ssh/id_rsa.pub +``` + +- https://codeberg.org/forgejo/forgejo/pulls/485#issuecomment-826512 + - https://codeberg.org/forgejo-contrib/soft-fork-tools + - https://codeberg.org/forgejo/forgejo/src/branch/forgejo-development/CONTRIBUTING/WORKFLOW.md#development-workflow + - https://docs.gitea.io/en-us/hacking-on-gitea/ + - https://docs.gitea.io/en-us/hacking-on-gitea/#building-gitea-basic +- `make test` hangs + +```console +$ make test +npm install --no-save +npm WARN deprecated sourcemap-codec@1.4.8: Please use @jridgewell/sourcemap-codec instead + +added 850 packages in 11s +npx vitest + + RUN v0.27.2 /home/pdxjohnny/go/src/codeberg/forgejo/forgejo + + ✓ web_src/js/utils.test.js (13) + ✓ web_src/js/features/repo-findfile.test.js (4) + ✓ web_src/js/features/repo-code.test.js (2) + ✓ web_src/js/svg.test.js (1) + + Test Files 4 passed (4) + Tests 20 passed (20) + Start at 04:30:40 + Duration 2.39s (transform 125ms, setup 61ms, collect 609ms, tests 73ms) +``` + +- Trying https://docs.gitea.io/en-us/hacking-on-gitea/#building-gitea-basic + +```console +$ mkdir -p ~/go/src/codeberg/forgejo/ +$ git clone -b v1.19/forgejo-ci https://codeberg.org/forgejo/forgejo ~/go/src/codeberg/forgejo/forgejo +$ cd ~/go/src/codeberg/forgejo/forgejo +$ make watch +$ git grep -i activitypub +$ git log -n 1 +commit 823ab34c64b275bf57fa60fef25a67338d8cb26e (HEAD -> v1.19/forgejo-ci, origin/v1.19/forgejo-ci) + +``` + +- Grep yields plenty of lines/results +- https://github.com/clearlinux-pkgs/libvirt +- Forgejo Actions runner + - > Runs workflows found in .forgejo/workflows, using a format similar to GitHub actions but with a Free Software implementation. 
It is compatible with Forgejo v1.19.0-0-rc0 +- From Vadim + - https://code.themlsbook.com/ + - https://themlsbook.com/read + - https://acrobat.adobe.com/link/review?uri=urn:aaid:scds:US:b7ad98b3-80ec-44cd-9d16-741f83ff2aaa#pageNum=12 +- https://stedolan.github.io/jq/manual/#recurse(f) \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0199/index.md b/docs/discussions/alice_engineering_comms/0199/index.md new file mode 100644 index 0000000000..b4f940618f --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0199/index.md @@ -0,0 +1 @@ +# 2023-03-07 Engineering Logs \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0199/reply_0000.md b/docs/discussions/alice_engineering_comms/0199/reply_0000.md new file mode 100644 index 0000000000..779a75a3e9 --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0199/reply_0000.md @@ -0,0 +1,172 @@ +## 2023-03-07 @pdxjohnny Engineering Logs + +- https://www.fastcompany.com/90859722/you-can-poison-ai-datasets-for-just-60-a-new-study-shows +- https://github.com/Azure-Samples/active-directory-verifiable-credentials-python +- Cleaned up random forks used for testing +- Investigating existing activitypub code within forgejo + - To facilitate comms (Continuous Delivery of Living Threat Models) as part of Alice's Stream of Consciousness + - https://github.com/intel/dffml/blob/alice/docs/tutorials/rolling_alice/0000_architecting_alice/0005_stream_of_consciousness.md +- https://github.com/intel/dffml/blob/alice/docs/tutorials/rolling_alice/0000_easter_eggs.md?plain=1 + +[![use-the-source](https://img.shields.io/badge/use%20the-source-blueviolet)](https://github.com/intel/dffml/blob/alice/docs/tutorials/rolling_alice/0000_easter_eggs.md#use-the-source-) + +```console +$ git status +On branch v1.19/forgejo-ci +Your branch is up to date with 'origin/v1.19/forgejo-ci'. 
+ +nothing to commit, working tree clean +$ git log -n 1 +commit 823ab34c64b275bf57fa60fef25a67338d8cb26e (HEAD -> v1.19/forgejo-ci, origin/v1.19/forgejo-ci) +Author: Loïc Dachary +Date: Mon Feb 20 23:17:52 2023 +0100 + + [CI] set PASSWORD_HASH_ALGO = argon2 for integration tests + + (cherry picked from commit 1d7ce2a39c841e77492ef08c0e86c3544ecca88d) + (cherry picked from commit 1abfc0c0a17a429102ba5f70b874263cc7b2ecf8) +$ git grep -i activitypub +CHANGELOG.md: * User keypairs and HTTP signatures for ActivityPub federation using go-ap (#19133) +assets/go-licenses.json: "name": "github.com/go-ap/activitypub", +assets/go-licenses.json: "path": "github.com/go-ap/activitypub/LICENSE", +go.mod: github.com/go-ap/activitypub v0.0.0-20221209114049-1ceafda50f9f +go.sum:github.com/go-ap/activitypub v0.0.0-20221209114049-1ceafda50f9f h1:UV5kupaU8AP8g8Bbsn53q87XCufW/E8wvnTHDKqjoR4= +go.sum:github.com/go-ap/activitypub v0.0.0-20221209114049-1ceafda50f9f/go.mod h1:1oVD0h0aPT3OEE1ZoSUoym/UGKzxe+e0y8K2AkQ1Hqs= +models/user/setting_keys.go: // UserActivityPubPrivPem is user's private key +models/user/setting_keys.go: UserActivityPubPrivPem = "activitypub.priv_pem" +models/user/setting_keys.go: // UserActivityPubPubPem is user's public key +models/user/setting_keys.go: UserActivityPubPubPem = "activitypub.pub_pem" +modules/activitypub/client.go:package activitypub +modules/activitypub/client_test.go:package activitypub +modules/activitypub/client_test.go:func TestActivityPubSignedPost(t *testing.T) { +modules/activitypub/main_test.go:package activitypub +modules/activitypub/user_settings.go:package activitypub +modules/activitypub/user_settings.go: settings, err = user_model.GetSettings(user.ID, []string{user_model.UserActivityPubPrivPem, user_model.UserActivityPubPubPem}) +modules/activitypub/user_settings.go: if err = user_model.SetUserSetting(user.ID, user_model.UserActivityPubPrivPem, priv); err != nil { +modules/activitypub/user_settings.go: if err = 
user_model.SetUserSetting(user.ID, user_model.UserActivityPubPubPem, pub); err != nil { +modules/activitypub/user_settings.go: priv = settings[user_model.UserActivityPubPrivPem].SettingValue +modules/activitypub/user_settings.go: pub = settings[user_model.UserActivityPubPubPem].SettingValue +modules/activitypub/user_settings_test.go:package activitypub +modules/structs/activitypub.go:// ActivityPub type +modules/structs/activitypub.go:type ActivityPub struct { +routers/api/v1/activitypub/person.go:package activitypub +routers/api/v1/activitypub/person.go: "code.gitea.io/gitea/modules/activitypub" +routers/api/v1/activitypub/person.go: ap "github.com/go-ap/activitypub" +routers/api/v1/activitypub/person.go: // swagger:operation GET /activitypub/user/{username} activitypub activitypubPerson +routers/api/v1/activitypub/person.go: // "$ref": "#/responses/ActivityPub" +routers/api/v1/activitypub/person.go: link := strings.TrimSuffix(setting.AppURL, "/") + "/api/v1/activitypub/user/" + ctx.ContextUser.Name +routers/api/v1/activitypub/person.go: publicKeyPem, err := activitypub.GetPublicKey(ctx.ContextUser) +routers/api/v1/activitypub/person.go: ctx.Resp.Header().Add("Content-Type", activitypub.ActivityStreamsContentType) +routers/api/v1/activitypub/person.go: // swagger:operation POST /activitypub/user/{username}/inbox activitypub activitypubPersonInbox +routers/api/v1/activitypub/reqsignature.go:package activitypub +routers/api/v1/activitypub/reqsignature.go: "code.gitea.io/gitea/modules/activitypub" +routers/api/v1/activitypub/reqsignature.go: ap "github.com/go-ap/activitypub" +routers/api/v1/activitypub/reqsignature.go: req.Header("Accept", activitypub.ActivityStreamsContentType) +routers/api/v1/api.go: "code.gitea.io/gitea/routers/api/v1/activitypub" +routers/api/v1/api.go: m.Group("/activitypub", func() { +routers/api/v1/api.go: m.Get("", activitypub.Person) +routers/api/v1/api.go: m.Post("/inbox", activitypub.ReqHTTPSignature(), activitypub.PersonInbox) 
+routers/api/v1/misc/nodeinfo.go: Protocols: []string{"activitypub"}, +routers/api/v1/swagger/activitypub.go:// ActivityPub +routers/api/v1/swagger/activitypub.go:// swagger:response ActivityPub +routers/api/v1/swagger/activitypub.go:type swaggerResponseActivityPub struct { +routers/api/v1/swagger/activitypub.go: Body api.ActivityPub `json:"body"` +routers/web/webfinger.go: appURL.String() + "api/v1/activitypub/user/" + url.PathEscape(u.Name), +routers/web/webfinger.go: Href: appURL.String() + "api/v1/activitypub/user/" + url.PathEscape(u.Name), +templates/swagger/v1_json.tmpl: "/activitypub/user/{username}": { +templates/swagger/v1_json.tmpl: "activitypub" +templates/swagger/v1_json.tmpl: "operationId": "activitypubPerson", +templates/swagger/v1_json.tmpl: "$ref": "#/responses/ActivityPub" +templates/swagger/v1_json.tmpl: "/activitypub/user/{username}/inbox": { +templates/swagger/v1_json.tmpl: "activitypub" +templates/swagger/v1_json.tmpl: "operationId": "activitypubPersonInbox", +templates/swagger/v1_json.tmpl: "ActivityPub": { +templates/swagger/v1_json.tmpl: "description": "ActivityPub type", +templates/swagger/v1_json.tmpl: "ActivityPub": { +templates/swagger/v1_json.tmpl: "description": "ActivityPub", +templates/swagger/v1_json.tmpl: "$ref": "#/definitions/ActivityPub" +tests/integration/api_activitypub_person_test.go: "code.gitea.io/gitea/modules/activitypub" +tests/integration/api_activitypub_person_test.go: ap "github.com/go-ap/activitypub" +tests/integration/api_activitypub_person_test.go:func TestActivityPubPerson(t *testing.T) { +tests/integration/api_activitypub_person_test.go: req := NewRequestf(t, "GET", fmt.Sprintf("/api/v1/activitypub/user/%s", username)) +tests/integration/api_activitypub_person_test.go: assert.Regexp(t, fmt.Sprintf("activitypub/user/%s$", username), keyID) +tests/integration/api_activitypub_person_test.go: assert.Regexp(t, fmt.Sprintf("activitypub/user/%s/outbox$", username), person.Outbox.GetID().String()) 
+tests/integration/api_activitypub_person_test.go: assert.Regexp(t, fmt.Sprintf("activitypub/user/%s/inbox$", username), person.Inbox.GetID().String()) +tests/integration/api_activitypub_person_test.go:func TestActivityPubMissingPerson(t *testing.T) { +tests/integration/api_activitypub_person_test.go: req := NewRequestf(t, "GET", "/api/v1/activitypub/user/nonexistentuser") +tests/integration/api_activitypub_person_test.go:func TestActivityPubPersonInbox(t *testing.T) { +tests/integration/api_activitypub_person_test.go: user1url := fmt.Sprintf("%s/api/v1/activitypub/user/%s#main-key", srv.URL, username1) +tests/integration/api_activitypub_person_test.go: c, err := activitypub.NewClient(user1, user1url) +tests/integration/api_activitypub_person_test.go: user2inboxurl := fmt.Sprintf("%s/api/v1/activitypub/user/%s/inbox", srv.URL, username2) +tests/integration/schemas/nodeinfo_2.1.json: "activitypub", +tests/integration/webfinger_test.go: assert.ElementsMatch(t, []string{user.HTMLURL(), appURL.String() + "api/v1/activitypub/user/" + url.PathEscape(user.Name)}, jrd.Aliases) +``` + +- Conceptual analogies of #1315 / ActivityPub security.txt methodology for graph traversal + - Similar to GitHub discussion + - Each day is a thread from an activitypub group + - grep: time: now + - Towards context local time + - An entity can reply to the group (or another entity) and use that as their daily log, they add the group's daily log as a reply. This is like how we link issues and if we'll have them auto backref to the discussion thread using downstream watchers. This is the same way we can facilitate the review system notifications, the SARIF CD eventing. 
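The reply-threading just described (a group daily-log post, entity logs attached via `inReplyTo`, backrefs via `replies`) can be sketched as a toy traversal; the note IDs below are made up, and this stands in for walking a real ActivityPub `replies` collection:

```python
from collections import defaultdict

# Toy objects standing in for ActivityPub Notes: the group's daily log post
# plus entity replies that attach themselves to it via inReplyTo.
notes = [
    {"id": "group/2023-03-07", "inReplyTo": None},
    {"id": "alice/2023-03-07", "inReplyTo": "group/2023-03-07"},
    {"id": "bob/2023-03-07", "inReplyTo": "group/2023-03-07"},
    {"id": "alice/2023-03-07/sarif", "inReplyTo": "alice/2023-03-07"},
]

def replies_to(notes, target):
    """Collect direct replies to target, mimicking a replies collection."""
    children = defaultdict(list)
    for note in notes:
        children[note["inReplyTo"]].append(note["id"])
    return children[target]

print(replies_to(notes, "group/2023-03-07"))
```

This is the same shape the SARIF CD eventing would ride on: a review-system notification is just another Note hung off an entity's daily log.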
 - https://github.com/cli/cli/issues/5659#issuecomment-1138028169 +- https://grafeas.io/blog/introducing-grafeas + - > Decentralization and continuous delivery: The move to decentralize engineering and ship software continuously (e.g., “push on green”) accelerates development velocity, but makes it difficult to follow best practices and standards. + - Grafeas might have schema bits that would be good to look to source into F3 if licensing permits + - https://www.infoq.com/presentations/supply-grafeas-kritis/ + - Keynote: Software Supply Chains for Devops - Aysylu Greenberg, Google + - https://www.youtube.com/watch?v=2Wl0hoEt47E + - Keynote: Project Trebuchet: How SolarWinds is Using Open Source to Secure Their Supply Chain in the Wake of the Sunburst Hack - Trevor Rosen, SolarWinds + - https://youtu.be/1-tMRxqMwTQ?t=1413 + - Also talks about having a second build system building in parallel + - They also do vuln analysis with OPA + - This looks aligned to what we're trying to do, only we want federation protocol event space for interoperability rather than cloudevents +- https://tekton.dev/docs/pipelines/hermetic/ +- https://github.com/tektoncd/community/issues/435 +- https://github.com/tektoncd/experimental/pull/754 +- https://github.com/tektoncd/community/blob/main/teps/0008-support-knative-service-for-triggers-eventlistener-pod.md + - **ALIGNED** + - KCP CRDs +- https://github.com/tektoncd/triggers/pull/958 + - Should we just go straight to the source and do the KCP/k8s manifest shim style translation?
+- https://github.com/w3c-ccg/traceability-interop/issues/468#issuecomment-1459024175 +- https://github.com/tektoncd/experimental/blob/ce7bf94997343f44e46b0f7290573968af81df34/cloudevents/README.md +- https://cdevents.dev/ +- https://github.com/cdevents/spec/blob/8e8b3e0c4bf7656abd32a258a4a86b97e2d4d6f5/spec.md + - 2022-10-24: spec v0.1.1 released +- https://github.com/afrittoli +- Continuous Delivery Foundation (CDF) 2023 + - https://twitter.com/LoriLorusso/status/1584917240834670592/photo/2 + - > ![image](https://user-images.githubusercontent.com/5950433/223585282-09b2c638-76e7-4540-ab40-0fae0cd428e5.png) +- https://github.com/guacsec/guac/issues/251 +- https://github.com/guacsec/guac/issues/460 + - https://github.com/cloudevents/spec/blob/v1.0.2/cloudevents/http-webhook.md + - We could translate this into the federated CD event space +- https://github.com/cloudevents/spec/blob/v1.0.2/cloudevents/spec.md +- https://github.com/cloudevents/spec/pull/712 +- https://github.com/cloudevents/spec/issues/1146#issuecomment-1404225644 +- https://github.com/cloudevents/spec/issues/1162 +- https://gist.github.com/clemensv/b7d4c7e1f93f88021fa2f0edc0dee459 + - `Channel Identifier` in our case is the posts we include in `replies` or via `inReplyTo` +- https://github.com/cloudevents/spec/issues/1146#issuecomment-1403630146 + - Conversion of events +- https://www.drogue.io/ +- https://github.com/cloudevents/spec/issues/830 +- https://github.com/cloudevents/spec/blob/main/cloudevents/extensions/severity.md +- https://github.com/cloudevents/spec/blob/3877083f8396cfb01b7b3e8adf1738f248af3aff/subscriptions/subscriptions-openapi.yaml#L209 + - Can we introduce ActivityPub here? 
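A hedged sketch of the translation being wondered about here: mapping a CloudEvents-style envelope onto an ActivityPub-style object, with `inReplyTo` carrying the channel identifier as noted above. The field mapping is an illustrative assumption, not a defined binding:

```python
def cloudevent_to_activitypub(ce: dict, channel: str) -> dict:
    """Wrap a CloudEvents envelope as an ActivityPub-style Note.

    Assumed mapping: CloudEvents source -> attributedTo, data -> content,
    and the channel identifier post -> inReplyTo.
    """
    return {
        "@context": "https://www.w3.org/ns/activitystreams",
        "type": "Note",
        "id": ce["id"],
        "attributedTo": ce["source"],
        "inReplyTo": channel,
        "content": ce.get("data", {}),
    }

# Minimal CloudEvents 1.0-shaped envelope (values are made up).
ce = {
    "specversion": "1.0",
    "id": "A234-1234-1234",
    "source": "https://ci.example.org/pipelines/42",
    "type": "dev.cdevents.pipelinerun.finished",
    "data": {"outcome": "success"},
}

note = cloudevent_to_activitypub(ce, "https://example.org/alice/cd-events")
print(note["inReplyTo"])
```

Federating would then just be delivering the resulting Note to subscriber inboxes, rather than pushing to a webhook sink.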
+- https://github.com/cdevents/spec/blob/main/cloudevents-binding.md +- https://github.com/cdevents/spec/blob/main/spec.md#cdevents-custom-data +- https://github.com/cdevents/spec/blob/main/continuous-deployment-pipeline-events.md +- https://github.com/cdfoundation/sig-mlops/blob/main/roadmap/2022/MLOpsRoadmap2022.md +- https://github.com/epec254/gpt-intuition +- https://github.com/evidentlyai/evidently +- https://github.com/w3c-ccg/traceability-interop/issues/485#issuecomment-1458700562 +- TODO + - [ ] GUAC federated event integration + - https://docs.google.com/document/d/15Kb3I3SWhq-9_R7WYhSjsIxn_FykYgPyFlQWlLgF4fA/edit + - https://docs.google.com/document/d/1BUEi7q2i-KXlAhsh1adYvL1fkWN-q8FrgLyEre7c5kg/edit?resourcekey=0-02sC5-9IbTfwJckze_CDQw# + - Very aligned + - [ ] GraphQL-LD with iter over outputs of flows converted from manifests into LDF + - [ ] Update OA WG chapters on federation + - https://codeberg.org/forgejo-contrib/discussions/issues/12 + - https://codeberg.org/forgejo/runner/issues/5#issuecomment-826244 \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0200/index.md b/docs/discussions/alice_engineering_comms/0200/index.md new file mode 100644 index 0000000000..d7afa78bb0 --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0200/index.md @@ -0,0 +1 @@ +# 2023-03-08 Engineering Logs \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0200/reply_0000.md b/docs/discussions/alice_engineering_comms/0200/reply_0000.md new file mode 100644 index 0000000000..e24246474f --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0200/reply_0000.md @@ -0,0 +1,7 @@ +## 2023-03-08 @pdxjohnny Engineering Logs + +- https://cs.github.com/?scopeName=All+repos&scope=&q=%22just+setting+up+the+framework%2C+skip+to+the+bottom+to+see+the+real+code%22 +- https://github.com/vito/bass#whats-it-for +- https://github.com/dagger/dagger#runs-your-pipelines-in-containers +- 
https://docs.dagger.io/cli/389936/run-pipelines-cli#step-3-build-an-application-from-a-remote-git-repository +- https://docs.dagger.io/api/975146/concepts#lazy-evaluation \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0201/index.md b/docs/discussions/alice_engineering_comms/0201/index.md new file mode 100644 index 0000000000..0a33161c9e --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0201/index.md @@ -0,0 +1 @@ +# 2023-03-09 Engineering Logs \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0201/reply_0000.md b/docs/discussions/alice_engineering_comms/0201/reply_0000.md new file mode 100644 index 0000000000..5f56b73665 --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0201/reply_0000.md @@ -0,0 +1,12 @@ +## 2023-03-09 @pdxjohnny Engineering Logs + +- https://github.com/ossf/wg-vulnerability-disclosures/issues/125#issuecomment-1462869239 +- New data event + - Actor: pdxjohnny + - Manifest Schema (inReplyTo) Some vuln format + - Ref OpenSSF Metrics SCITT Use Case, Roy's SIMPLE SCITT mailing list post + - Product: @pdxjohnny + - CVE-COVID-19 + - Status: affected +- https://tomalrichblog.blogspot.com/2023/02/is-vulnerability-exploitable-when-its.html + - Ref CVE Bin Tool Monthlies and recent meetings with Anthony \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0202/index.md b/docs/discussions/alice_engineering_comms/0202/index.md new file mode 100644 index 0000000000..5cb97eaf81 --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0202/index.md @@ -0,0 +1 @@ +# 2023-03-10 Engineering Logs \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0202/reply_0000.md b/docs/discussions/alice_engineering_comms/0202/reply_0000.md new file mode 100644 index 0000000000..d2867fdab4 --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0202/reply_0000.md @@ -0,0 +1,18 @@ +## 2023-03-10 @pdxjohnny Engineering Logs + +- 
https://github.com/ossf/wg-vulnerability-disclosures/issues/125#issuecomment-1463522574 + - JSON-LD comments from Ariadne on OpenVEX, presumably hinting at forthcoming Rapunzel + - https://github.com/w3c/activitypub/issues/319#issuecomment-419727935 + - https://web.archive.org/web/20190410204622/https://litepub.social/litepub/lice.html +- https://github.com/kaniini/libucontext + - Lwan uses this! +- https://github.com/kaniini/qemu-openrc + - We were looking for this when we were doing OS DecentrAlice a while back to POC image builds within container env that boot to UEFI +- https://github.com/ossf/wg-vulnerability-disclosures/issues/125#issuecomment-1464082034 + - A call for manifest ADR style format name format version +- https://github.com/ossf/wg-vulnerability-disclosures/discussions/127 +- https://docs.google.com/document/u/0/d/1ZT_w3HiW6LJjouRlw3xHXPnmy7ArwOdlw4vpzIjS9_o/ +- https://spdx.swinslow.net/p/spdx-defects-minutes +- https://github.com/ossf/wg-vulnerability-disclosures/discussions/127#discussioncomment-5271718 +- https://observer.com/2023/03/meta-is-reportedly-the-latest-social-media-company-to-embrace-activitypub-technology/ +- https://github.com/intel/dffml/blob/alice/docs/arch/0008-Manifest.md \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0203/index.md b/docs/discussions/alice_engineering_comms/0203/index.md new file mode 100644 index 0000000000..80ad99bd91 --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0203/index.md @@ -0,0 +1 @@ +# 2023-03-12 Engineering Logs \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0203/reply_0000.md b/docs/discussions/alice_engineering_comms/0203/reply_0000.md new file mode 100644 index 0000000000..55523ff150 --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0203/reply_0000.md @@ -0,0 +1,6 @@ +## 2023-03-12 @pdxjohnny Engineering Logs + +- https://knightcolumbia.org/content/understanding-social-media-recommendation-algorithms + 
 - > Turning from similarity between users to similarity between posts, the most obvious attribute that could be used for computing post similarity is content. The term content in this context usually refers to metadata (say, the title and description of a video) and less commonly the full content (i.e., the byte stream). The idea is simple: If a user likes a video on a particular topic, they will probably like other videos on the same topic. To analyze content in this way, a set of “feature extraction” algorithms preprocesses posts and represents them in a form that’s more digestible to algorithms: as a series of attributes (features). A simple example of a feature is the language or languages that appear in a post. Other features may be much more complex. + - > The most important fact to keep in mind is that the behavioral record is the fuel of the recommendation engine. It might be surprising that recommendation algorithms are so simple to describe, given that large teams of highly skilled engineers work on them. But it takes a lot of ingenuity to translate high-level ideas of the sort I’ve described into an algorithm. In particular, keeping the computation tractable is a major challenge. The volume of information is vast: Based on the back-of-the-envelope calculations for TikTok above, the number of behavioral records may be of the order of a quadrillion (10^15). A naive algorithm—for instance, one that attempted to compute the affinity between each user and each post—would be millions of times slower than an optimized one, and no amount of hardware power can make up the difference.
+ - We want to apply this to software and entities combining software as Alice does \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0204/index.md b/docs/discussions/alice_engineering_comms/0204/index.md new file mode 100644 index 0000000000..80ad99bd91 --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0204/index.md @@ -0,0 +1 @@ +# 2023-03-12 Engineering Logs \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0205/index.md b/docs/discussions/alice_engineering_comms/0205/index.md new file mode 100644 index 0000000000..7b5c9ffda9 --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0205/index.md @@ -0,0 +1 @@ +# 2023-03-13 Engineering Logs \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0205/reply_0000.md b/docs/discussions/alice_engineering_comms/0205/reply_0000.md new file mode 100644 index 0000000000..2b57fae7db --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0205/reply_0000.md @@ -0,0 +1 @@ +- https://simonwillison.net/2023/Mar/13/alpaca/ \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0206/index.md b/docs/discussions/alice_engineering_comms/0206/index.md new file mode 100644 index 0000000000..31fab35c94 --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0206/index.md @@ -0,0 +1 @@ +# 2023-03-14 Engineering Logs \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0206/reply_0000.md b/docs/discussions/alice_engineering_comms/0206/reply_0000.md new file mode 100644 index 0000000000..2a704d85bf --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0206/reply_0000.md @@ -0,0 +1,2 @@ +- https://github.com/yuzutech/kroki +- https://github.com/typpo/quickchart \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0207/index.md b/docs/discussions/alice_engineering_comms/0207/index.md new file mode 100644 index 0000000000..31a3a68a53 
--- /dev/null +++ b/docs/discussions/alice_engineering_comms/0207/index.md @@ -0,0 +1,25 @@ +Hi, + +Can someone explain to me this last message about +Yuzutech/kroki, especially this part +https://simonwillison.net/2023/Mar/13/alpaca/ ? + +Kindly, +Carina R.R. Silva + + +On Tue., Mar. 14, 2023 at 19:13, John Andersen < +***@***.***> wrote: + +> +> - https://github.com/yuzutech/kroki +> +> — +> Reply to this email directly, view it on GitHub +> , +> or unsubscribe +> +> . +> You are receiving this because you were mentioned.Message ID: +> ***@***.***> +> diff --git a/docs/discussions/alice_engineering_comms/0207/reply_0000.md b/docs/discussions/alice_engineering_comms/0207/reply_0000.md new file mode 100644 index 0000000000..d1a4ab8e4d --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0207/reply_0000.md @@ -0,0 +1,29 @@ +Hello! + +- [yuzutech/kroki](https://github.com/yuzutech/kroki) + - Sometimes I just post miscellaneous possibly helpful links / starred repos within my daily engineering logs. This one might be helpful for rendering graphics that can't be rendered natively if they need to be communicated somewhere. For example, mermaid just introduced mind map functionality which could be useful for our use case. However, that functionality is not present within the version of mermaidjs that GitHub uses in their markdown rendering Ruby gem. If we wanted to help Alice render mind maps, we'd have to deploy a rendering service such as kroki, convert to a format like SVG or PNG, and then use that within the markdown body. +- https://simonwillison.net/2023/Mar/13/alpaca/ + - This link was posted related to "depth of field mapping" (there is probably a better term for this, similar to [our risk mapping](https://github.com/intel/dffml/blob/11fea2bb0dd0aec3c19533e61d15d894c8112d25/docs/tutorials/rolling_alice/0001_coach_alice/0007_cartographer_extraordinaire.md)), meaning the act of mapping out the research in the aligned space. 
Since DFFML is all about wrapping existing models and ensuring [plumbing](https://www.techopedia.com/definition/31509/plumbing) is in place to use existing models easily, we're always posting links here for machine learning models that might be helpful. We also try to post the path we took to find those links, as we'll want to ensure we can automate this process so that Alice can also find the most recent research, to use as a base from which she'll hypothesize novel approaches. Whenever we get to that part of the project, we'll probably end up doing something like what's been done by the folks who have hooked up GPT-3 to search engines. We'll use our previous experiences as logged in this thread to understand how to fine-tune the prioritizer as Alice surfs the web by making URL requests. We'll work to ensure she looks for aligned research in as helpful a way as possible, prioritizing feeding the active execution loop with links that, when added to the corpus of data, produce hypotheses with high alignment scores to whatever that active execution loop's strategic plans and principles are. + - There are a variety of things that make a link "of interest" within the aligned problem space + - Novel research, results, or approach to a problem + - Strong community support + - Strong publishing org support (aka they will support it going forward or build something new which we could migrate to if we decided to start using the N-1 version) + - Permissive licensing + - BSD, MIT, Apache-2.0, public domain, etc. 
+ - Optimization + - Running on low cost hardware (aka not requiring large clusters or resources only companies or large institutions have access to) + - Alpaca is of interest because of + - Permissive licensing + - Apache-2.0 + - Optimization + - https://simonwillison.net/2023/Mar/11/llama/ + - > Large language models are having their Stable Diffusion moment + > + > The open release of the Stable Diffusion image generation model back in August 2022 was a key moment. I wrote how [Stable Diffusion is a really big deal](https://simonwillison.net/2022/Aug/29/stable-diffusion/) at the time. + > + > People could now generate images from text on their own hardware! + > + > More importantly, developers could mess around with the guts of what was going on. + +Thank you, +John \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0208/index.md b/docs/discussions/alice_engineering_comms/0208/index.md new file mode 100644 index 0000000000..783cd5aaf7 --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0208/index.md @@ -0,0 +1 @@ +# 2023-03-15 Engineering Logs \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0208/reply_0000.md b/docs/discussions/alice_engineering_comms/0208/reply_0000.md new file mode 100644 index 0000000000..94330935fd --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0208/reply_0000.md @@ -0,0 +1,40 @@ +## 2023-03-15 @pdxjohnny Engineering Logs + +- https://github.com/scitt-community/scitt-api-emulator/issues/20#issuecomment-1470278224 + - Here is an example of using a file (workflow) as a payload: https://github.com/intel/dffml/blob/alice/docs/tutorials/rolling_alice/0000_architecting_alice/0002_shes_ariving_when.md#scitt-api-emulator-spin-up +- https://github.com/w3c/vc-jwt/pull/61 +- https://github.com/oauth-wg/oauth-selective-disclosure-jwt +- https://github.com/credential-handler/credential-handler-polyfill#demo + - Summary of end-to-end demo of secure build to boot to 
usage + - We'll want `did:keri` support for the end-to-end flow where we have software built within a TEE with KERI tied to the TEE's hardware root of trust. + - We'll export SCITT to a flat file format, we'll send it to the browser + - The browser will be able to auth to the software stack by pulling down the git repos involved and matching up the transparency service receipts/records with the git repos. + - Fully isolated (SLSA4+) setup + - This cuts out like, most of the way everything is done today in software. + - https://github.com/credential-handler/authn.io + - https://wallet.example.chapi.io/ + - https://issuer.example.chapi.io/ + - https://verifier.example.chapi.io/ + - https://github.com/TBD54566975/dwn-sdk-js + - https://github.com/TBD54566975/ssi-sdk-wasm + - https://github.com/TBD54566975/web5-js + - https://github.com/TBD54566975/web5-wallet-browser +- https://github.com/TBD54566975/ftl/pull/3#issue-1623361276 + - This looks like they are building distributed compute +- https://github.com/ggerganov/llama.cpp +- https://github.com/exaloop/codon +- [RFCv3.2: IETF SCITT: Use Case: OpenSSF Metrics: activitypub extensions for security.txt](https://github.com/ietf-scitt/use-cases/blob/da838e39cac8f5e2a444e7ac1d3c723e8ddd49ed/openssf_metrics.md#openssf-metrics) +- TODO + - [ ] Add `FROM scratch` image examples + - [ ] Add schema to output for the flow based on `Definition`s at `/schema.json` + - Use `@context` with zeroith index pointing to a manifest ADR schema + - Example: https://github.com/intel/dffml/blob/alice/schema/github/actions/build/images/containers/0.0.0.schema.json + - [ ] Find Source URL -> CVE mapping code as example of depth of field mapping in action + - https://github.com/pdxjohnny/dffml/branches + - [x] docs: tutorials: rolling alice: coach alice: down: the dependency rabbit hole again: plan: Threat model generation based on SBOM + - 
https://github.com/intel/dffml/blob/alice/docs/tutorials/rolling_alice/0001_coach_alice/0001_down_the_dependency_rabbit_hole_again.md + - https://github.com/intel/dffml/commit/02502ff3be0118a19ef83fbc71f17fd9403cb26a + - @Cat-Katze Just FYI, this tutorial, which is meant to be the creation of a basic/high-level threat model from a Software Bill Of Materials, is closely related to the https://github.com/intel/cve-bin-tool/issues/2639 activity. We'll eventually use the threat model plus the triage mechanism together as we perform automated vuln analysis. + - For more background, https://github.com/ietf-scitt/use-cases/issues/14 is about how the transparency service, which will be the source of truth for "is CVE-XYZ a vuln that affects product ABC", can interact with CI/CD systems to trigger auto triage per federated CI/CD eventing: https://codeberg.org/forgejo-contrib/discussions/issues/12. Since Open Source Software projects have different threat models based on how they might be deployed, each project will get an event, "new vuln!", when there is a new vuln. The downstream projects (projects which use a project, for example: dffml-model-tensorflow is downstream of DFFML) will get notifications of new vulns; the hope is we can bake in a pattern of analysis which can be followed as vulns cascade downstream for analysis / remediation within different contexts per their usage. 
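The cascade just described, new-vuln events flowing to transitive downstream projects for triage, can be sketched as a walk over a dependency graph; the graph contents and event shape below are illustrative assumptions:

```python
# Who consumes whom: each project maps to its direct downstream consumers.
# Contents are made up for illustration.
downstreams = {
    "dffml": ["dffml-model-tensorflow", "dffml-feature-git"],
    "dffml-model-tensorflow": ["example-app"],
}

def cascade(project, event, seen=None):
    """Yield (project, event) for every transitive downstream, once each."""
    seen = set() if seen is None else seen
    for dep in downstreams.get(project, []):
        if dep in seen:
            continue
        seen.add(dep)
        yield dep, event
        yield from cascade(dep, event, seen)

notified = [project for project, _ in cascade("dffml", {"vuln": "CVE-XYZ"})]
print(notified)
# → ['dffml-model-tensorflow', 'example-app', 'dffml-feature-git']
```

In the federated setup each yielded pair would become an ActivityPub delivery to that project's inbox, where its own threat model decides whether the vuln actually applies.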
+ - [2023-03-02 SBOM, VEX, VDR, Threat Modeling, Open Architecture](https://github.com/intel/dffml/discussions/1406?sort=new#discussioncomment-5179079) + - https://tomalrichblog.blogspot.com/2023/02/is-vulnerability-exploitable-when-its.html \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0209/index.md b/docs/discussions/alice_engineering_comms/0209/index.md new file mode 100644 index 0000000000..e79d77d17a --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0209/index.md @@ -0,0 +1 @@ +# 2023-03-16 Engineering Logs \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0209/reply_0000.md b/docs/discussions/alice_engineering_comms/0209/reply_0000.md new file mode 100644 index 0000000000..acec50222c --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0209/reply_0000.md @@ -0,0 +1,26 @@ +## 2023-03-16 @pdxjohnny Engineering Logs + +- https://docs.celeryq.dev/en/stable/getting-started/backends-and-brokers/redis.html + - We can feed data from the websocat into redis and use celery to kick off Alice + - https://docs.github.com/en/actions/using-containerized-services/creating-redis-service-containers + - This works for public GitHub runners, which we are using for OSS scanning. We may need tweaks for our OS DecentrAlice on DigitalOcean/DevCloud setup + - Then we can run matrix jobs which process incoming vulns to mitigate or analyze +- https://github.com/ossf/wg-vulnerability-disclosures/discussions/127#discussioncomment-5335373 (Jason's relevant comments below) + - > What the US government decides to do or not do is not my primary concern. I am trying to make the entire industry work better, to protect society - not just the US government. To do that requires us to work together, not fight each other over silly issues like "heaviness" of JSON formats that we want to be consumed by machines and never even read by a human. 
+ > + > The whole VDR vs VEX thing I think is just needless additional confusion. When you take the technical implementations out of the mix, and just read what a VDR is and read what a VEX is, they are trying to do exactly the same thing, and it is all just semantics. You can actually use VEX to create a VDR - this is actually what CycloneDX is doing today. IMO, NIST did the industry a disservice inventing & pushing a new word for a concept that already existed. The ISO standard for VDR is also lacking 1/2 of VEX because it does not give a simple way to say 'I am not susceptible to this vulnerability, and here is why', which is a primary use case of VEX. however ironically, if you read the NIST best practice - they actually suggest this information be part of a VDR! IE - when you actually read all the text - the ISO minimal fields for VDR do not even meet what NIST is asking for... NIST VDR actually asks for a VEX! It is so needlessly confusing. + - > VEX contains both positive and negative assertions - just like a "VDR" does... I suggest you re-watch the video you linked because it is actually discussed, with an example. Again, no need to argue about this because none of it is worth arguing about. I know & respect both Allan and Thomas - but neither of them "owns" the definition of VEX, neither does OASIS, or anyone else. VEX is just a concept. Just like VDR is just concept, it is a best practice that NIST published in a document - these are abstract ideas, neither of them are standards. No one "owns" the definitions of these things, there is no NIST publication that officially defines what a VDR is... if there is, please share it. Simmilarly, there is no standards body at all that defines what a VEX is, CISA is looking to publish some guidelines, but CISA is not a standards body either so whatever gets published still won't define 'VEX' as a thing, it will simply define a CISA point of view. 
Anyone can claim anything is a VEX, because no one can say otherwise right now. + - Bingo +- ActivityPub security.txt/md mermaid where are you? 🧜‍♀️ + - https://cdn.jsdelivr.net/npm/mermaid@10.0.2/dist/mermaid.esm.min.mjs + - https://github.com/mermaid-js/mermaid/blob/b5a4cc0e17168c257a3b0d40a068e3addfc9c40a/packages/mermaid/src/docs.mts#L51 + - https://cdn.jsdelivr.net/npm/mermaid@10.0.2/ + - https://cdn.jsdelivr.net/npm/mermaid@9.3.0/dist/mermaid.min.js + - https://cdn.jsdelivr.net/npm/mermaid@10.0.2/dist/mermaid.min.js + - 10.0.2 does not have non-`import` js + - https://www.jsdelivr.com/package/npm/mermaid?tab=stats&path=dist +- https://github.com/executablebooks/rst-to-myst + - https://myst-parser.readthedocs.io/en/latest/apidocs/myst_parser/myst_parser.mdit_to_docutils.html + - https://myst-parser.readthedocs.io/en/latest/syntax/optional.html#task-lists + - For our notebook conversion + - #1392 \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0209/reply_0001.md b/docs/discussions/alice_engineering_comms/0209/reply_0001.md new file mode 100644 index 0000000000..e4eba2962e --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0209/reply_0001.md @@ -0,0 +1,7 @@ +## 2023-03-16 OpenSSF Town Hall + +- https://zoom.us/w/99012209258 +- Christoph Puppe asked: "will ai be used for the industrialization of vulnerability hunting in FOSS? ask chatgpt for all signal injections in projects?
:)" + - John posted to chat: [WIP: RFCv4: IETF SCITT: Use Case: OpenSSF Metrics: activitypub extensions for security.txt](https://github.com/ietf-scitt/use-cases/blob/8ab06ebf523c4cef766bddac2931eaba721d9ecd/openssf_metrics.md#openssf-metrics) +- > ![image](https://user-images.githubusercontent.com/5950433/225707933-c56410d3-d894-40c4-ba4b-f6179aa61a97.png) +- > ![image](https://user-images.githubusercontent.com/5950433/225708151-3730ab41-a287-4303-9936-47b74efb78d2.png) \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0210/index.md b/docs/discussions/alice_engineering_comms/0210/index.md new file mode 100644 index 0000000000..3ee14a1b3b --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0210/index.md @@ -0,0 +1 @@ +# 2023-03-17 Engineering Logs \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0210/reply_0000.md b/docs/discussions/alice_engineering_comms/0210/reply_0000.md new file mode 100644 index 0000000000..e42cd2524e --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0210/reply_0000.md @@ -0,0 +1,3 @@ +- https://tidyfirst.substack.com/p/fool-proof-design +- https://github.com/intel/srs/tree/main/scan-build +- https://github.com/tklengyel/drakvuf \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0211/index.md b/docs/discussions/alice_engineering_comms/0211/index.md new file mode 100644 index 0000000000..9807ea5fb3 --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0211/index.md @@ -0,0 +1 @@ +# 2023-03-18 Engineering Logs \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0211/reply_0000.md b/docs/discussions/alice_engineering_comms/0211/reply_0000.md new file mode 100644 index 0000000000..35d9e8ef7b --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0211/reply_0000.md @@ -0,0 +1,2 @@ +- https://gitlab.archlinux.org/archlinux/mkinitcpio/mkinitcpio/-/releases/v35- + - 
https://github.com/intel/dffml/blob/alice/docs/tutorials/rolling_alice/0000_architecting_alice/0006_os_decentralice.md \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0212/index.md b/docs/discussions/alice_engineering_comms/0212/index.md new file mode 100644 index 0000000000..593889377b --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0212/index.md @@ -0,0 +1 @@ +# 2023-03-19 Engineering Logs \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0213/index.md b/docs/discussions/alice_engineering_comms/0213/index.md new file mode 100644 index 0000000000..927a8ed155 --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0213/index.md @@ -0,0 +1 @@ +# 2023-03-20 Engineering Logs \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0213/reply_0000.md b/docs/discussions/alice_engineering_comms/0213/reply_0000.md new file mode 100644 index 0000000000..d4f0ff01aa --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0213/reply_0000.md @@ -0,0 +1,52 @@ +## 2023-03-20 @pdxjohnny Engineering Logs + +- https://github.com/microsoft/semantic-kernel/blob/main/docs/PLANNER.md#what-is-the-value-of-goal-oriented-ai + - https://hbr.org/2016/09/know-your-customers-jobs-to-be-done + - Bingo +- https://learn.microsoft.com/en-us/azure/cognitive-services/openai/chatgpt-quickstart?tabs=command-line&pivots=programming-language-python + - https://github.com/MicrosoftDocs/azure-docs +- https://www.instill.tech/docs/core-concepts/ai-task#standardise-via-vdp-protocol +- https://github.com/microsoft/semantic-kernel/blob/main/samples/apps/book-creator-webapp-react/README.md +- https://github.com/microsoft/semantic-kernel/tree/python-preview + - https://github.com/microsoft/semantic-kernel/blob/python-preview/python/FEATURE_PARITY.md +- https://www.instill.tech/docs/destination-connectors/airbyte#low-code-setup +- https://github.com/raysan5/raylib + - Vol 2 +- 
https://github.com/tloen/alpaca-lora +- https://github.com/MicrosoftDocs/azure-docs +- ActivityPub security.txt/md rebroadcast -> latest info on APIs available -> auto hypothesis -> evaluation of execution against strategic plans and principles + - The basic automated software development ^ loop +- Writing the wave++ + - https://github.com/34j/so-vits-svc-fork +- https://github.com/OneMoreByte/mva/blob/8a86f24e1411502243bc70404fb8646fec0202ba/mva.py#L214 + - For large file mirroring +- https://github.com/ossf/wg-vulnerability-disclosures/issues/125#issuecomment-1462563173 + - List of projects using the CycloneDX format for VEX and VDR +- [OpenVEX Presentation](https://www.youtube.com/watch?v=MBn1Ph6aBxc) +- https://github.com/ossf/wg-vulnerability-disclosures/issues/124 + - https://github.com/JLLeitschuh/code-sandbox/commit/65987132b65a1c32672aa236f33569efbb04cf7e - Create GH-ROBOTS.txt +- https://github.com/ossf/scorecard/issues/1874#issuecomment-1178259870 + - Protobuf schema for scorecard results +- TODO + - [x] Fix python package release workflow + - [release.yml](https://github.com/srossross/rpmfile/blob/master/.github/workflows/release.yml) + - We use rpmfile for scanning binaries from different distros which package using the RPM format + - #789 + - https://github.com/intel/dffml/issues/595 + - We should capture the webhook event from the release upload and play with that to update pinning within downstream CD + - #906 + - [Rolling Alice: Architecting Alice: Stream of Consciousness](https://github.com/intel/dffml/blob/alice/docs/tutorials/rolling_alice/0000_architecting_alice/0005_stream_of_consciousness.md) + - [ ] https://github.com/intel/project-example-for-python/issues/3 + - [ ] Auto roll in alignment with conventional commits and https://tqrg.github.io/secom/ + - [ ] https://github.com/ossf/wg-vulnerability-disclosures/issues/124 + - [Google Doc: Specification: OpenSSF Compliant Automated Vulnerability Fix
Campaign](https://docs.google.com/document/d/1_QwN7yQXWGM2tJaostIRNqyZIhVceVlIyXqCrSdC4E8/edit?disco=AAAArT7aBmI) + - > Has there been any discussion around maintaining forks and tracking the upstreaming of the fixes from the fork into the upstream? Sometimes there are no maintainers around, and a fork is the only way to go. In that case, we'd probably want to reduce the amount of forks waiting around with fixes, we don't want more than one campaign trying to submit the same fix. + - Funny, this is what got us involved in rpmfile in the first place + - Our pinning enables us to leverage these forks + - https://github.com/ossf/scorecard/blob/main/checks/evaluation/pinned_dependencies.go#L291-L295 + - Have been working on an aligned RFC (still WIP) over here: https://github.com/ietf-scitt/use-cases/pull/18. Seems like transparency services will be where we log the end assessment of is vuln/is not vuln ([2022-07-20 Identifying Security Threats WG](https://github.com/intel/dffml/discussions/1406?sort=new#discussioncomment-3191292)) + - [ ] Semantic kernel cleanup of #1369 and #1406 into something legible + - https://github.com/mayooear/gpt4-pdf-chatbot-langchain + - Pagination for dump_discussion + - https://gist.github.com/9f3dc18f0a42d3107aaa2363331d8faa + - https://github.com/intel/dffml/blob/4dae1a3e6b6d37b81f71659599d1ddef800ac176/scripts/dump_discussion.py#L73 \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0214/index.md b/docs/discussions/alice_engineering_comms/0214/index.md new file mode 100644 index 0000000000..aff8d45348 --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0214/index.md @@ -0,0 +1,9 @@ +# 2023-03-21 Engineering Logs + +- Today we see alignment inbound across supply chain security and the interplanetary virtual machine + - We seek to bridge ideation to production via CI/CD pull request validation flows into deployment in a hermetic (cacheable) execution environment such as IPVM. 
This requires alignment across provenance formats and invocation auth formats. Essentially, if there was a valid CI/CD build, deploy it. (It doesn't necessarily require it, but it will make security much more clean from an auditability and trackability perspective and if it can all go back to JSON-LD then query is easy, which means we can feed it back into Alice's training loop and she can hypothesize and execute experiments lickity split). It also means application of policy becomes uniform across ideation and production environments, hopefully reducing policy escapes, aka lack of alignment to strategic plans and principles. This is how we get our feedback from the behavioral analysis portion of the Entity Analysis Trinity + - https://github.com/ipvm-wg/spec/pull/8 +- https://openatintel.podbean.com/e/threat-modeling-down-the-rabbit-hole/ + > I'm wondering if there's anything, if there's any angle here that we haven't covered that you wanted to make sure to mention. Speaking of, you know, different tooling that you can use, right, we have this project where we're looking at, you know, defining, when you look at the threat model of an application, you're also looking at, you know, the architecture, right, you know, what are the components in that. And so one of the things that John and I realized when we went about, you know, the tooling saga in our threat model journey is that there's a lot of different tools, right, and there's always going to be a different tool that's better for something else, right. So we began to focus on kind of this approach of more like, well, you know, what are the key components, right? And then how do we, you know, expose those to the tools and source from the tools as appropriate, right, or, you know, as context appropriate, right? So we, so we've come up with this concept, right, of this, we basically said, we want to describe the architecture. We would like to do this in an open source way. 
So we took the word open and we took the word architecture and we put them together and now we've got the open architecture. And so the goal here is really to say, okay, well, what does the application look like? And to speak of the angles, we've got this Entity Analysis Trinity, which basically says, you know, what, what are you trying to do? What is your threat model, right? And then what are you actually doing? And what did you write down, right? What is your code? So what is your intent at the top of the triangle, right? What is your static analysis say? And what is your sort of behavioral or dynamic analysis say, right? And so the objective here overall is to, you know, apply your static analysis methodologies, apply your dynamic analysis, right? You know, maybe that's telemetry from the field or whatever, right, to tell you about, you know, what's happening in your software, or, you know, what does it look like when it's tested under a live, you know, dynamic scanning environment, right? And how does that relate to your threat model, right? And so we can do that because we can identify the different components being tested by the different tools and map them into this, you know, open description of architecture + +![EATv0.0.2](https://user-images.githubusercontent.com/5950433/188203911-3586e1af-a1f6-434a-8a9a-a1795d7a7ca3.svg) \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0214/reply_0000.md b/docs/discussions/alice_engineering_comms/0214/reply_0000.md new file mode 100644 index 0000000000..f540346a0d --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0214/reply_0000.md @@ -0,0 +1,32 @@ +## 2023-03-21 @pdxjohnny Engineering Logs + +- https://github.com/seferov/pr-lint-action +- https://github.com/GerevAI/gerev +- https://pypi.org/help/#project-release-notifications + - Can we work with them to do release notifications via ActivityPub? Should we ping Aria? 
+ - We can of course set up RSS rebroadcast + - We need the eventing (`/inbox`) because of the AI, it helps us facilitate the abstract compute architecture event loop + - `Rolling Alice: (Preface:) Transport Acquisition: Abstract Compute Architecture` +- https://github.com/in-toto/attestation/pull/164 +- https://github.com/in-toto/attestation/pull/162 +- https://github.com/in-toto/attestation/pull/152 +- https://github.com/in-toto/attestation/pull/151 +- https://github.com/in-toto/attestation/pull/129 +- https://github.com/w3c/vc-data-model/issues/1063 +- https://sourceware.org/git/?p=glibc.git;a=blob_plain;f=sysdeps/unix/sysv/linux/x86_64/clone.S;hb=HEAD +- TODO + - [ ] Plan tutorial where we ingest the shared stream of consciousness and feed it into performant analysis to help Alice do online learning on the open source software lifecycle + - https://paimon.apache.org/docs/master/engines/spark3/ + - https://paimon.apache.org/docs/master/concepts/append-only-table/ + - Patch for transparency service insert? + - [ ] Document alignment with https://github.com/in-toto/attestation/blob/main/spec/predicates/link.md + - [ ] Contribute some alignment with Verifiable Credentials to bridge to the verified JSON-LD landscape + - Ideally we align to KERIVC + - This would be chadig.com + - [ ] https://github.com/in-toto/attestation/pull/162 + - [ ] https://github.com/in-toto/attestation/issues/165#issuecomment-1478420542 + - Resource descriptor would be good to look at + - Download locations, URIs - Could we just throw a VC URI there? Similar to ActivityPub extensions for security.txt/md where we just say, there's a Contact-URL, just set it to an activitypub actor + - Would all that verification code from those DIF WGs transfer?
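The ResourceDescriptor idea in the TODO list above can be pictured roughly as follows. This is a sketch, not the in-toto spec: the field layout loosely follows the draft ResourceDescriptor (`name`, `digest`, `uri`, `annotations`), while the `activitypub-actor` annotation, the artifact name, and both URIs are hypothetical illustrations of the security.txt-style Contact-to-actor mapping.

```python
# Sketch: an in-toto-style ResourceDescriptor carrying an ActivityPub actor
# URI in its annotations. The "activitypub-actor" key is our hypothetical
# extension, mirroring the security.txt Contact idea; it is not part of any spec.
import hashlib
import json


def resource_descriptor(name: str, content: bytes, uri: str, actor_uri: str) -> dict:
    """Build a ResourceDescriptor-like dict for a release artifact."""
    return {
        "name": name,
        "digest": {"sha256": hashlib.sha256(content).hexdigest()},
        "uri": uri,
        # Hypothetical: where release/vuln events for this artifact federate to.
        "annotations": {"activitypub-actor": actor_uri},
    }


desc = resource_descriptor(
    "dffml-0.4.0.tar.gz",
    b"example artifact bytes",
    "https://example.com/dffml-0.4.0.tar.gz",
    "https://mastodon.example.com/@releases",
)
print(json.dumps(desc, indent=2))
```

Verification code that already understands descriptor digests would be unaffected by the extra annotation; consumers that know the key could subscribe to the actor's inbox/outbox for release events.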
+ +![much-acc](https://user-images.githubusercontent.com/5950433/226707682-cfa8dbff-0908-4a34-8540-de729c62512f.png) \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0214/reply_0001.md b/docs/discussions/alice_engineering_comms/0214/reply_0001.md new file mode 100644 index 0000000000..1e870dc2e0 --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0214/reply_0001.md @@ -0,0 +1,90 @@ +## 2023-03-21 WECAN + +> Lost track of who said what; see the recording for attribution. + +- https://us02web.zoom.us/postattendee?mn=DAWtOh4M0fbHBk3YDwrk_QYNhNn_DH7iYCTt.b-AoRt7JwG7EPfOq +- https://github.com/ucan-wg/spec/pull/132 +- Need to agree on a hash alg (sha256 seems reasonable) + - Some people think it's safer than sha3 + - How do we want to encode that? + - Do we need the CID header? + - Do we want to have a methodology that requires re-encoding? + - Other WGs seem to avoid recoding + - If you're in the browser then base32 is good, if you're low in the stack binary encoding is good + - Should we just go: This hash algo, this base, call it good?
+ - Alan: Just had a field, a unique ID, up to you to decide what that is + - Minimizes the number of things we have to agree on (sounds like the VC thing we just talked about) + - Currently everything is content addressed, instead of a GUID we use the hash of the token itself + - You only end up with a collision if you have the same token, which is signed + - We just want to figure out how to most quickly decode that + - bengo: multihashes, multibases are relevant here in minimizing the base of the CID + - Irakli + - Could say, we expect base64 but if you see base32 it's not a big deal obviously + - IPLD version could be in DAG-JSON or DAG-CBOR + - With base encoding you can't re-encode on the fly to see which ones are revoked + - Revocation is always a big problem, this is why KERI's duplicity checking is nice + - CID already has content type information in it +- Irakli: https://github.com/ipld/js-dag-ucan + - UCAN invocation spec and capabilities and params + - Proofs and signature + - CID of the thing would be static regardless of representation, then that could be used as a key; the outer layer would be distinct, and it would be the issuer's decision whether revocation of "you can do this" applies (outer layer) + +> ```typescript +> const ucan = UCAN.parse(jwt) +> ucan.issuer.did() // did:key:z6Mkk89bC3JrVqKie71YEcc5M1SMVxuCgNx6zLZ8SYJsxALi +> ``` + +- In order to check for revocation you have to pull down the inner layer + - Take the UCAN, generate the JWT payload, get each hash, check if each has been revoked + - Could take a UCAN encoded as JWT and encode it as DAG-JSON, but that would end up requiring revocation to transcode into all the different forms that might need to be checked to be revoked +- Core UCAN spec is JWT, lots of this work keeps leaning towards IPLD +- Easy to plug in UCAN anywhere if it's just JWT + - You could make it YAML!
+ - But we have standards because we want interop + - Extra structures and wrappers drift away from interop +- The moment you support alternate encodings, you have to just start adding more encodings to your system + - The metadata is captured within the CID, so each system just needs to keep supporting more encodings + - Irakli points out again that this opens us up to more revocation based attacks, because you have to re-encode into + - Invocation of the payload +- Is bluesky using UCANs? + - It's in the plan +- Individual CID for the exact invocation isn't usually what you want to do, you just want to revoke based on the public key (PKI) associated with those verifiable credentials +- Alan with a great point: When you delegate you should delegate to one-off keys + - Military wants this + - Privacy concerns mitigated + - Issuer's responsibility to map keys issued to whatever credential + - Sometimes we have to deal with whatever key is already there + - However, then the revocation can't just say revoke everything tied to one key + - Do we need revocation by key? + - The issue is that you're precluding the preferred practice + - Revoking by audience key pair only revokes that one UCAN +- Key by public key or VC and not by the UCAN + - You can always find all the related UCANs if you've been indexing, you have to maintain that index if you care about revocation +- It might be useful to standardize the revocation multihash, sign one CID to revoke it +- Only when the request comes into the service provider do you have to check to see if it's revoked, does anyone else need to maintain those indexes? +- Who gets to revoke the key?
+ - With UCAN whoever issued can revoke + - At the application level you might also allow delegation of who can revoke +- Blocking the actor + - Service can do that +- Revoking a specific delegation + - #1400 +- Tracking context around why a capability was issued enables the application level to say am I revoking the key or the capability +- Simplest is revoke by CID + - However, how do you map that back to keys? That's the recoding problem +- How can you revoke all the delegations you've given to a principal? + - How can you know the principal if every UCAN gets issued to a one-off key? + - revocation index could be keyed off public key or hash of the VC, and what context is being revoked +- Revocation by CID in spec currently assumes JWT of UCAN + - If the UCAN was in non-JWT form you have to translate +- Let's just pick an encoding and call that the CID +- Can we just say native link? Then encoding becomes a transport problem + - Native IPLD would be DAG-CBOR for the wire format + - We want CBOR for DICE interop 🐢🐢🐢🐢🐢 + - https://github.com/ipld/js-dag-ucan/pull/4 +- For each encoding type, there could be one canonical CID + - No, don't encode as JSON + - Encode as IPLD, then we get into native links, then it's guided by the representation + - This sounds like that one-way conversation on VC encoding, ref recent meeting with Sam +- Recoding to check if a content id is revoked is non-ideal for some +- 7 extra chars per CID and alignment is achieved \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0214/reply_0002.md b/docs/discussions/alice_engineering_comms/0214/reply_0002.md new file mode 100644 index 0000000000..9e939e8fd6 --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0214/reply_0002.md @@ -0,0 +1,62 @@ +## 2018-08-21 How Not To Password + +> This was a sketch for a demo written back in 2018, and then we realized WebAuthn is a thing!
+> +> We are hoping to take this a step further and tie the keys mentioned here into the software +> stack of what's running; this is the relation to our `did:keri:` workstream. We'll tie +> authentication to the soul of the software via our Entity Analysis Trinity. +> +> - [https://github.com/pdxjohnny/smartplanter](https://github.com/pdxjohnny/smartplanter/commit/f9124a8f3631cde4cd574889a163ab43a40f2804#diff-bfe9874d239014961b1ae4e89875a6155667db834a410aaaa2ebe3cf89820556R33) +> - https://github.com/pdxjohnny/smartplanteresp + +#### Registration + +1. User chooses username +2. Server validates username available +3. User device generates key pair +4. User device sends username and public key to server +5. Server stores username and public key +6. User device requests password from user to encrypt private key +7. Hash given password +8. Symmetrically encrypt private key using output of password hash function +9. Store encrypted private key on user device + +#### Login + +1. Load encrypted private key from storage +2. Request password +3. Hash given password +4. Symmetrically decrypt private key using output of password hash function +5. Sign username with loaded private key +6. Send username and signature to server +7. Server retrieves public keys associated with username +8. Verify signature using any of the user's confirmed keys +9. Perform 2FA challenge + *DEMO DOES NOT IMPLEMENT THIS. PRODUCTION IMPLEMENTATIONS SHOULD* + +#### Add Device + +1. New device generates key pair +2. New device sends username and public key to server +3. Perform 2FA challenge to verify the user is attempting to add a device + *DEMO DOES NOT IMPLEMENT THIS. PRODUCTION IMPLEMENTATIONS SHOULD* +4. Server stores public key in pending confirmation state +5. Old device queries server for a key pending confirmation +6. Devices display fingerprint of pending key +7. User confirms fingerprints match on both devices +8.
Old device notifies server of confirmation of pending key + +#### Notes + +This authentication scheme requires that a user have a previously authenticated +device present in order to authenticate a new device. The reason storing +passwords has been the de-facto method of authentication is that a user can +authenticate from anywhere at any time so long as they remember their password. +Now that we've realized 2FA is important, login requires a user to possess some +trusted device capable of answering the 2FA challenge. Hence as developers we +have assurance that users attempting to log in possess a trusted device. If they +are trying to log in from a new device it is likely their trusted device has +already registered a public key with the service they are attempting to log in to. +Therefore, the concern that this authentication scheme might put undue burden on +users is null and void, because they always must have a trusted device to +perform 2FA. \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0215/index.md b/docs/discussions/alice_engineering_comms/0215/index.md new file mode 100644 index 0000000000..642b817d93 --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0215/index.md @@ -0,0 +1,3 @@ +# 2023-03-22 Engineering Logs + +“So become those who seek death, like the dead who seek life; because what they seek is revealed to them.” \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0215/reply_0000.md b/docs/discussions/alice_engineering_comms/0215/reply_0000.md new file mode 100644 index 0000000000..0d838535b8 --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0215/reply_0000.md @@ -0,0 +1,52 @@ +## 2023-03-22 @pdxjohnny Engineering Logs + +- Replied to Arif + - https://groups.io/g/CycloneDX/message/234 + - > How to obtain component source URLs for maven, gradle, and npm as VCS strings in sbom( ex:git, maven, and npm repo source urls) + - This is a tricky problem.
I’ve seen success using link traversal and automated triage in the past. This is an open action item for the DFFML project as we hope to provide an open source implementation which assists with this mapping process. We had a previous implementation for mapping CVEs to source URLs, but it’s since bitrotted. + - We are hoping to restart work on this front in the coming months. I will respond here when we do. In the meantime, if you have any example SBOMs you want mapped, could you please put them somewhere public (maybe a GitHub gist?) so we can target mapping those examples first. + +https://github.com/intel/dffml/blob/830bf5af07ab9ada48f7c75a77a9ee1ef89c0964/examples/shouldi/shouldi/cvemap/cvemap/cvemap.py#L30-L199 + +- https://blog.okfn.org/2023/03/16/updating-the-open-definition-to-meet-the-challenges-of-today/?ref=openml.fyi +- https://mastodon.social/@helge@mymath.rocks/110065914387517965 + - https://mymath.rocks/objects/b59c256f-7791-4076-b108-67eba266db6c + - Server to server interactions related to private comms channels + - https://github.com/pdxjohnny/numapp/issues/1 +- https://github.com/BloopAI/bloop + - https://github.com/qdrant/qdrant + - https://github.com/quickwit-oss/tantivy +- https://github.com/nichtdax/awesome-totally-open-chatgpt + - https://github.com/madlabunimib/PyCTBN +- https://github.com/pelennor2170/NAM_models + - https://github.com/sdatkinson/neural-amp-modeler + - https://github.com/sdatkinson/NeuralAmpModelerPlugin +- https://github.com/lensterxyz/lenster +- https://github.com/chidiwilliams/buzz + - https://github.com/chidiwilliams/buzz/pull/321 + - https://github.com/chidiwilliams/buzz/blob/main/.github/workflows/ci.yml + - Issue ops flow?
+ +![chaos-for-the-chaos-god](https://user-images.githubusercontent.com/5950433/220794351-4611804a-ac72-47aa-8954-cdb3c10d6a5b.jpg) + +- https://github.com/thesofproject/sof/pull/7321 + - ❤️❤️❤️❤️❤️❤️❤️ + - Marc for the win again +- https://github.com/microsoft/wslg +- https://blog.rabit.pw/2020/docker-service-management-w-nsenter/#use-nsenter-to-access-the-container-intranet +- https://openatintel.podbean.com/e/software-supply-chains/ +- https://openatintel.podbean.com/e/confidential-computing +- TODO + - [x] Obscene amounts of caffeine + - [ ] Build ASI before heart attack + - [ ] Fix any of the 3 phones + - [x] Fix car + - [x] Meeting starts with people professing the importance of sleep in avoiding heart attacks + - ? https://github.com/mgaitan/sphinxcontrib-mermaid/commit/83c303d9889223e9668040f406a674967f6de7fb#diff-7b3ed02bc73dc06b7db906cf97aa91dec2b2eb21f2d92bc5caa761df5bbc168fR22-R34 ? + - [ ] OA DAG schema for Laurent and ref IPVM job spec for future + - [ ] Find that server sent events Fediverse Enhancement Proposal and replace the websocket route or make that aligned with it somehow + - [x] Transcript Threat Modeling Down the Rabbit Hole podcast + - https://github.com/ggerganov/whisper.cpp + - [ ] Automate analysis of https://github.com/trending daily similar to response to Arif + - [x] Fix TODO add vendor of choice to WIP `Rolling Alice: Architecting Alice: Transport Acquisition` + - https://github.com/intel/dffml/issues/1247#issuecomment-1341477143 \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0215/reply_0001.md b/docs/discussions/alice_engineering_comms/0215/reply_0001.md new file mode 100644 index 0000000000..066642db1a --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0215/reply_0001.md @@ -0,0 +1,4 @@ +## 2023-03-22 CVE Binary Tool Monthly Meeting + +- https://meet.google.com/msm-airt-bwp +- No one showed up \ No newline at end of file diff --git 
a/docs/discussions/alice_engineering_comms/0215/reply_0002.md b/docs/discussions/alice_engineering_comms/0215/reply_0002.md new file mode 100644 index 0000000000..4484a1d1fb --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0215/reply_0002.md @@ -0,0 +1,78 @@ +## 2023-03-22 OneAPI App CI/CD Working Session + +- Every day we surf the chaos 🏄‍♂️ +- References + - #1392 + - #1391 +- Michael developed a Sphinx site with some custom JS which has the database built via CI/CD ❤️ +- https://github.com/oneapi-src/oneAPI-samples/ +- Michael flipped the gh-pages switches +- https://oneapi-src.github.io/oneAPI-samples/ + - Blank right now +- Noticed it did the default build. We switch to deploy from gh-pages afterwards. + +![image](https://user-images.githubusercontent.com/5950433/226976386-2d2f1761-6cf2-4cfe-9bd4-7b9e9a76d827.png) + +- https://github.com/oneapi-src/oneAPI-samples/tree/app/dev +- https://github.com/oneapi-src/oneAPI-samples/pull/1457 +- https://github.com/oneapi-src/oneAPI-samples/tree/531314589f766d8f93a312855cb627cd3692a41c + - Looks like we don't have the `.nojekyll` file in the gh-pages branch +- https://github.com/oneapi-src/oneAPI-samples/blob/3ac2f6136f112db733afe0db5866e12a0fb6f4e8/.github/github-pages.yml#L67 + - We'll make a minor change here to trigger the workflow + +```console +$ git checkout app/dev +$ git pull --rebase upstream app/dev +``` + +- Weird rebase conflicts, just going to reset to upstream because we have no other + changes. 
+ +```console +$ git reset --hard upstream/app/dev +$ sed -i -e 's/{{github.repository}}/{{github.repository}}/g' .github/github-pages.yml +$ git add .github/github-pages.yml +$ git checkout -b app/dev upstream/app/dev +$ git push -u upstream app/dev && gh pr create --fill && gh pr merge --rebase --auto +``` + +- Not seeing workflow under actions page + - https://github.com/oneapi-src/oneAPI-samples/actions + - Noticed it needs to move under workflows directory +- [Error: .github#L1](https://github.com/oneapi-src/oneAPI-samples/commit/e85d5bdc376d4234ec8778f5a7b8cb9dd21dd04c#annotation_9919858013) + - https://github.com/oneapi-src/oneAPI-samples/actions/runs/4492420124 + - > a step cannot have both the `uses` and `run` keys + - We had a copy pasta with `actions/checkout` + - https://github.com/oneapi-src/oneAPI-samples/commit/4141959c3f9328e72cf87197944af16b5d6fe832 +- https://github.com/oneapi-src/oneAPI-samples/actions/runs/4492462722 +- https://github.com/oneapi-src/oneAPI-samples/pull/1464 +- https://github.com/oneapi-src/oneAPI-samples/actions/runs/4492509212/jobs/7902491740 + - Need to add sphinx to `requirements.txt` +- https://github.com/oneapi-src/oneAPI-samples/actions/runs/4492603075 + +``` +Configuration error: +config directory doesn't contain a conf.py file (/home/runner/work/oneAPI-samples/oneAPI-samples/src) +``` + +- https://github.com/oneapi-src/oneAPI-samples/pull/1465 +- https://github.com/oneapi-src/oneAPI-samples/actions/runs/4492670680 + - Clean build! +- https://github.com/oneapi-src/oneAPI-samples/pull/1466 +- Deployed! WOOHOO! +- Adding Cascading Style Sheets + - https://github.com/oneapi-src/oneAPI-samples/commit/0fec1533300818ecdcf09e28091c5d5d116c74a7 + - https://github.com/oneapi-src/oneAPI-samples/actions/runs/4493632962 + +![image](https://user-images.githubusercontent.com/5950433/227013474-3ac6a496-5831-4557-b45a-2f988b7d4258.png) + +- CSS SUCCESS! + - Some more to do still but we have UI! 
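The `.nojekyll` hunt above could be guarded in code — a minimal sketch (hypothetical helper, not part of the actual workflow) that makes sure the marker file exists in the built site before anything is pushed; GitHub Pages skips Jekyll processing only when that marker is present, which is why `_static`-style directories vanish without it:

```python
# Sketch: mirror the workflow's `touch .nojekyll` step as a guard.
# GitHub Pages skips Jekyll processing when this marker file exists,
# so underscore-prefixed Sphinx output directories get served.
from pathlib import Path

def ensure_nojekyll(site_dir: str) -> Path:
    """Create .nojekyll in the built site directory if it is missing."""
    marker = Path(site_dir) / ".nojekyll"
    marker.touch(exist_ok=True)  # idempotent; safe to run on every deploy
    return marker
```

Calling something like this right before the push to `gh-pages` keeps manual deploys safe too, not just workflow-driven ones.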
+- TODO + - [x] Figure out why `.nojekyll` isn't there despite the touch being on line 70 + - We pushed to gh-pages manually and didn't run the workflow + - [x] Workflow needs to move under `.github/workflows/` + - [x] Move actions/checkout to its own step + - [x] Fix `runs-on` typo + - [x] Add sphinx to `requirements.txt` + - [x] Modify conditional around pushing docs to `gh-pages` to the `app/dev` branch for testing \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0216/index.md b/docs/discussions/alice_engineering_comms/0216/index.md new file mode 100644 index 0000000000..4ae656dad5 --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0216/index.md @@ -0,0 +1 @@ +# 2023-03-23 Engineering Logs \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0216/reply_0000.md b/docs/discussions/alice_engineering_comms/0216/reply_0000.md new file mode 100644 index 0000000000..6ead982f33 --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0216/reply_0000.md @@ -0,0 +1,27 @@ +## 2023-03-23 @pdxjohnny Engineering Logs + +https://github.com/intel/dffml/blob/4dae1a3e6b6d37b81f71659599d1ddef800ac176/scripts/dump_discussion.py#L217-L247 + +- https://github.com/intel/compile-time-init-build +- https://codeql.github.com/docs/codeql-cli/manual/database-create/ + - Does this work on workflows? +- https://lucene.apache.org/pylucene/install.html +- CWE-1053: Missing Documentation for Design + - https://cwe.mitre.org/data/definitions/1053.html + - Some reasons explained here: https://github.com/ietf-scitt/charter/pull/21 +- https://github.com/ossf/wg-vulnerability-disclosures/issues/125#issuecomment-1479885225 + - OpenVEX adopted by OpenSSF! + - 🛤️🛤️🛤️🛤️🛤️ +- https://github.com/ossf/s2c2f/blob/main/specification/framework.md#appendix-relation-to-scitt +- https://github.com/ossf/great-mfa-project + - TEEs?
+- https://github.com/OWASP/common-requirement-enumeration + - Ooh la la + - I spy with my little eye the flip side of VEX + - This could help with our reasons for submitting a vuln/VEX or our hypothesis + - Remember, security issues are just more interesting versions of regular issues, and therefore security requirements are just more interesting versions of regular requirements. We can piggyback all day. +- TODO + - [ ] Registry for 2nd party split out + - Can we run it as a service within the workflow? + - [ ] Patch https://github.com/soda480/wait-for-message-action for ActivityPub support behind localhost.run + - [ ] Dump discussion to gist as complete auto flow \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0217/index.md b/docs/discussions/alice_engineering_comms/0217/index.md new file mode 100644 index 0000000000..24637f120b --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0217/index.md @@ -0,0 +1 @@ +# 2023-03-24 Engineering Logs \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0217/reply_0000.md b/docs/discussions/alice_engineering_comms/0217/reply_0000.md new file mode 100644 index 0000000000..f7c21e4286 --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0217/reply_0000.md @@ -0,0 +1,470 @@ +## 2023-03-24 @pdxjohnny Engineering Logs + +- [/me](https://user-images.githubusercontent.com/5950433/227560451-033923b3-52ff-4d4b-8be3-7cd14ab2a62d.jpeg) bolts out of bed in the 4 o'clock hour with a sudden urge + - Must... investigate... traceability-interop.... + - https://github.com/w3c-ccg/traceability-interop/tree/main/docs/tutorials + - https://github.com/w3c-ccg/traceability-interop/tree/main/docs/tutorials/authentication + - Have been avoiding this because postman...
but whatever +- https://w3c-ccg.github.io/traceability-interop/draft/#software-supply-chain +- Discovered that typing `/` in a markdown field on GitHub opens a quick markdown formatting helper + - ![image](https://user-images.githubusercontent.com/5950433/227520416-1f285044-ef2e-4303-9575-d0ec5ea3c2e1.png) +- We're trying to bridge the current world of comms (fast becoming ActivityPub) to the new world (Web5 `did:keri:`) + - If we make progress we'll post `![knowledge-graphs-for-the-knowledge-god](https://user-images.githubusercontent.com/5950433/222981558-0b50593a-c83f-4c6c-9aff-1b553403eac7.png)` + - ![image](https://user-images.githubusercontent.com/5950433/227520859-7213f415-e371-4780-927d-01228f89873a.png) + - https://github.com/pdxjohnny/pdxjohnny.github.io/blob/3e642942d5ef1a48a3bab3c1bc65dc91182e1f7d/data/saved_replies_markdown.yaml +- https://www.postman.com/downloads/ + - > `Postman CLI\nNew!` + - I was avoiding postman because I didn't want to use a GUI, yay! + - https://pdxjohnny.github.io/dev-environment/ + - https://github.com/intel/dffml/pull/1207#discussion_r1036680987 + - The reason for the meticulousness of engineering log documentation. + We must have a reproducible process for Alice to follow. + +![chaos-for-the-chaos-god](https://user-images.githubusercontent.com/5950433/220794351-4611804a-ac72-47aa-8954-cdb3c10d6a5b.jpg) + +- Chaos for the Chaos God again apparently from the postman team as we `curl | sh` + sudo + +```console +$ curl -o- "https://dl-cli.pstmn.io/install/linux64.sh" | sh +``` + +- https://learning.postman.com/docs/postman-cli/postman-cli-options/ +- https://learning.postman.com/docs/collections/using-newman-cli/command-line-integration-with-newman/ + - Okay if I had just clicked the first two tutorial links...
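Since the goal is a reproducible process for Alice to follow, the `curl | sh` install above is a natural place to pin: download the script, hash it, and only execute on a digest match. A minimal stdlib sketch — the pinned digest here is computed in-example; in practice it would come from a prior audited download recorded out-of-band:

```python
# Sketch: pin-and-verify pattern for `curl | sh` style installers.
# Instead of piping straight to sh, fetch to disk, compare against a
# previously recorded digest, and only then execute.
import hashlib

def matches_pin(script_bytes: bytes, pinned_sha256: str) -> bool:
    """Return True only when the downloaded script matches the pinned digest."""
    return hashlib.sha256(script_bytes).hexdigest() == pinned_sha256

# Illustrative only: a real pin would be recorded out-of-band beforehand.
script = b'#!/bin/sh\necho "install things"\n'
pin = hashlib.sha256(script).hexdigest()

assert matches_pin(script, pin)
assert not matches_pin(script + b"# tampered\n", pin)
```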
+- https://nodejs.org/en/download/package-manager +- https://github.com/nodesource/distributions#debinstall + +```console +$ curl -fsSL https://deb.nodesource.com/setup_18.x | sudo -E bash - &&\ + sudo apt-get install -y nodejs +``` + +- Install Newman (the postman from Seinfeld - https://en.wikipedia.org/wiki/Newman_(Seinfeld) :) + +```console +$ npm install -g newman +``` + +- https://www.oauth.com/oauth2-servers/client-registration/client-id-secret/ +- https://github.com/w3c-ccg/traceability-interop/tree/main/docs/tutorials/authentication#example-run-postman-collection-from-the-command-line + - MOTHERFUCKER IT WAS RIGHT THERE AT THE BOTTOM OF THE TUTORIAL AAAAAAAAAAAAAAAHHHHHHHHHHHHHHHHHHHH + - This is what happens when one does not read and just skims... was `return -ETOOSTRESSEDOUT` at the time. + +```console +$ npx newman run ./authentication.postman_collection.json \ + --env-var CLIENT_ID=$CLIENT_ID \ + --env-var CLIENT_SECRET=$CLIENT_SECRET \ + --env-var TOKEN_AUDIENCE=$TOKEN_AUDIENCE \ + --env-var TOKEN_ENDPOINT=$TOKEN_ENDPOINT \ + --reporters cli,json +``` + +- https://www.rfc-editor.org/rfc/rfc6749.html +- https://github.com/w3c-ccg/traceability-interop/blob/7bef64ae78ead17aa4c9baaee6061da7612b6e1d/docs/tutorials/workflow-join/README.md + - This is similar to our ActivityPub setup +- https://github.com/w3c-ccg/traceability-interop/pull/491 + - Checking up on where their state of art is +- https://github.com/OpenAPITools/openapi-generator + - We'll just try to generate a server to start and then explore KERI interop and the bridge from ActivityPub methodology from [RFCv4: IETF SCITT: Use Case: OpenSSF Metrics: activitypub extensions for security.txt](https://github.com/ietf-scitt/use-cases/blob/8ab06ebf523c4cef766bddac2931eaba721d9ecd/openssf_metrics.md#openssf-metrics) + - If things go well we'll register via `Test Suite Registration` + - https://github.com/w3c-ccg/traceability-interop/blob/main/environment-setup/README.md +- We might need OAuth2 values, 
we may want to leverage DEX, we'll see + - https://github.com/dexidp/dex + +```console +$ git clone https://github.com/w3c-ccg/traceability-interop.git +$ cd traceability-interop +$ npm i +$ npm run serve +^C +``` + +- Seems like that just serves the repo contents + - This is like a maze of things that we try to avoid, GUIs, conda... + - #977 +- Followed the [Getting Started](https://github.com/w3c-ccg/traceability-interop/tree/main/reporting) link to the reporting directory + +```console +$ cd reporting +$ python -m venv .venv +$ . .venv/bin/activate +$ pip install -r requirements.txt +``` + +- Run the reporting + +```console +$ ./reporter.py --conformance +Processing identified reports: 4 +GS1US: Conformance Suite: 0%| | 0/4 [00:00 +Dash is running on http://127.0.0.1:8050/ + + * Serving Flask app 'reporter' + * Debug mode: off +WARNING: This is a development server. Do not use it in a production deployment. Use a production WSGI server instead. + * Running on http://127.0.0.1:8050 +Press CTRL+C to quit +127.0.0.1 - - [24/Mar/2023 06:16:09] "GET / HTTP/1.1" 200 - +127.0.0.1 - - [24/Mar/2023 06:16:09] "GET /_dash-component-suites/dash/deps/polyfill@7.v2_3_1m1679663645.12.1.min.js HTTP/1.1" 200 - +127.0.0.1 - - [24/Mar/2023 06:16:09] "GET /_dash-component-suites/dash/dash-renderer/build/dash_renderer.v2_3_1m1679663645.min.js HTTP/1.1" 200 - +127.0.0.1 - - [24/Mar/2023 06:16:09] "GET /_dash-component-suites/dash_bootstrap_components/_components/dash_bootstrap_components.v1_4_1m1679663648.min.js HTTP/1.1" 200 - +127.0.0.1 - - [24/Mar/2023 06:16:09] "GET /_dash-component-suites/dash/deps/prop-types@15.v2_3_1m1679663645.7.2.min.js HTTP/1.1" 200 - +127.0.0.1 - - [24/Mar/2023 06:16:09] "GET /_dash-component-suites/dash/deps/react@16.v2_3_1m1679663645.14.0.min.js HTTP/1.1" 200 - +127.0.0.1 - - [24/Mar/2023 06:16:09] "GET /_dash-component-suites/dash/deps/react-dom@16.v2_3_1m1679663645.14.0.min.js HTTP/1.1" 200 - +127.0.0.1 - - [24/Mar/2023 06:16:09] "GET 
/_dash-component-suites/dash/dcc/dash_core_components.v2_3_0m1679663645.js HTTP/1.1" 200 - +127.0.0.1 - - [24/Mar/2023 06:16:09] "GET /_dash-component-suites/dash/dcc/dash_core_components-shared.v2_3_0m1679663645.js HTTP/1.1" 200 - +127.0.0.1 - - [24/Mar/2023 06:16:09] "GET /_dash-component-suites/dash/dash_table/bundle.v5_1_1m1679663645.js HTTP/1.1" 200 - +127.0.0.1 - - [24/Mar/2023 06:16:09] "GET /_dash-component-suites/dash/html/dash_html_components.v2_0_2m1679663645.min.js HTTP/1.1" 200 - +127.0.0.1 - - [24/Mar/2023 06:16:10] "GET /_dash-layout HTTP/1.1" 200 - +127.0.0.1 - - [24/Mar/2023 06:16:10] "GET /_dash-dependencies HTTP/1.1" 200 - +127.0.0.1 - - [24/Mar/2023 06:16:10] "GET /_dash-component-suites/dash/dash_table/async-highlight.js HTTP/1.1" 200 - +127.0.0.1 - - [24/Mar/2023 06:16:10] "GET /_dash-component-suites/dash/dash_table/async-table.js HTTP/1.1" 200 - +127.0.0.1 - - [24/Mar/2023 06:16:10] "GET /_dash-component-suites/dash/dcc/async-graph.js HTTP/1.1" 200 - +127.0.0.1 - - [24/Mar/2023 06:16:10] "GET /_dash-component-suites/dash/dcc/async-plotlyjs.js HTTP/1.1" 200 - +127.0.0.1 - - [24/Mar/2023 06:16:16] "GET /_favicon.ico?v=2.3.1 HTTP/1.1" 200 - +``` + +- The whole thing is javascript, lynx won't dump it + - https://fathy.fr/carbonyl + - This renders chrome to a terminal, we'll want to play with it eventually + +![image](https://user-images.githubusercontent.com/5950433/227532248-48808340-8dfb-42a4-9160-d16746326715.png) + +- Check the CI jobs + - https://github.com/w3c-ccg/traceability-interop/blob/main/.github/workflows/regression-workflow-instance-join.yml + - Finally, sanity + - https://github.com/w3c-ccg/traceability-interop/blob/7bef64ae78ead17aa4c9baaee6061da7612b6e1d/.github/workflows/interoperability-report.yml +- https://w3c-ccg.github.io/traceability-interop/openapi/#tag--Identifiers +- https://dexidp.io/docs/connectors/oauth/ + - Okay DEX helps us bridge OAuth to OIDC, I forgot, it's been a while +- 
https://github.com/OpenAPITools/openapi-generator/blob/master/docs/online.md + - Let's try to generate a server side API + - https://github.com/OpenAPITools/openapi-generator#to-generate-a-sample-client-library +- Wow fucking java this really is the perfectly designed maze + +```console +$ sudo apt install -y default-jre maven +$ git clone --depth=1 https://github.com/OpenAPITools/openapi-generator +$ cd openapi-generator +$ ./bin/generate-samples.sh ./bin/configs/java-okhttp-gson.yaml +``` + +- Successful generation of example +- https://github.com/OpenAPITools/openapi-generator/tree/master/samples/server/petstore/python-aiohttp +- Now to generate server, our `dffml-service-http` already uses aiohttp +- https://github.com/spec-first/connexion + - > Swagger/OpenAPI First framework for Python on top of Flask with automatic endpoint validation & OAuth2 support + - Fuck ya we're back in happy land +- Now to generate an aiohttp server based off the traceability-interop spec + - https://github.com/w3c-ccg/traceability-interop/blob/main/tests/traceability-v1.jsonld + - https://github.com/w3c-ccg/traceability-interop/blob/main/tests/valid-credential.json + - https://github.com/w3c-ccg/traceability-interop/blob/main/docs/openapi/openapi.yml + +```console +$ java -ea -server -Duser.timezone=UTC \ + -jar modules/openapi-generator-cli/target/openapi-generator-cli.jar generate \ + -g python-aiohttp \ + -i ../traceability-interop/docs/openapi/openapi.yml \ + -o python-aiohttp-traceability-interop +[main] INFO o.o.codegen.utils.ModelUtils - [deprecated] inheritance without use of 'discriminator.propertyName' has been deprecated in the 5.x release. Composed schema name: null. Title: null +[main] INFO o.o.codegen.utils.ModelUtils - [deprecated] inheritance without use of 'discriminator.propertyName' has been deprecated in the 5.x release. Composed schema name: null. 
Title: null +[main] INFO o.o.codegen.utils.ModelUtils - [deprecated] inheritance without use of 'discriminator.propertyName' has been deprecated in the 5.x release. Composed schema name: null. Title: Revocation List Verifiable Credential +[main] INFO o.o.codegen.utils.ModelUtils - [deprecated] inheritance without use of 'discriminator.propertyName' has been deprecated in the 5.x release. Composed schema name: null. Title: Verifiable Presentation +[main] INFO o.o.codegen.utils.ModelUtils - [deprecated] inheritance without use of 'discriminator.propertyName' has been deprecated in the 5.x release. Composed schema name: null. Title: Verifiable Credential +[main] INFO o.o.codegen.utils.ModelUtils - [deprecated] inheritance without use of 'discriminator.propertyName' has been deprecated in the 5.x release. Composed schema name: null. Title: Credential Linked Data Proof +[main] INFO o.o.codegen.utils.ModelUtils - [deprecated] inheritance without use of 'discriminator.propertyName' has been deprecated in the 5.x release. Composed schema name: null. Title: Traceable Presentation +[main] INFO o.o.codegen.utils.ModelUtils - [deprecated] inheritance without use of 'discriminator.propertyName' has been deprecated in the 5.x release. Composed schema name: null. Title: Presentation Linked Data Proof +Exception in thread "main" org.openapitools.codegen.SpecValidationException: There were issues with the specification. The option can be disabled via validateSpec (Maven/Gradle) or --skip-validate-spec (CLI). 
+ | Error count: 5, Warning count: 0 +Errors: + -attribute components.responses.$ref is not of type `object` + -attribute components.schemas.$ref is not of type `object` + -components.schemas.Schema name $ref doesn't adhere to regular expression ^[a-zA-Z0-9\.\-_]+$ + -components.parameters.Parameter name $ref doesn't adhere to regular expression ^[a-zA-Z0-9\.\-_]+$ + -components.responses.Response key $ref doesn't adhere to regular expression ^[a-zA-Z0-9\.\-_]+$ + + at org.openapitools.codegen.config.CodegenConfigurator.toContext(CodegenConfigurator.java:620) + at org.openapitools.codegen.config.CodegenConfigurator.toClientOptInput(CodegenConfigurator.java:647) + at org.openapitools.codegen.cmd.Generate.execute(Generate.java:479) + at org.openapitools.codegen.cmd.OpenApiGeneratorCommand.run(OpenApiGeneratorCommand.java:32) + at org.openapitools.codegen.OpenAPIGenerator.main(OpenAPIGenerator.java:66) +``` + +- The generator is unhappy with the input file from traceability-interop + +**traceability-interop.git/docs/openapi/openapi.yml** + +```yaml +openapi: '3.0.0' +info: + version: 1.0.0 + title: Open API for Interoperable Traceability + description: Identifier and Credentials APIs for DID. 
+ license: + name: Apache 2.0 + url: https://www.apache.org/licenses/LICENSE-2.0.html + +servers: + - url: https://api.did.actor + +tags: + - name: Discovery + - name: Identifiers + - name: Credentials + - name: Presentations + +paths: + /did.json: + $ref: './resources/api-configuration.yml' + + /identifiers/{did}: + $ref: './resources/did.yml' + + /credentials/issue: + $ref: './resources/credential-issuer.yml' + /credentials/status: + $ref: './resources/credential-status.yml' + /credentials/verify: + $ref: './resources/credential-verifier.yml' + /credentials/{credential-id}: + $ref: './resources/credential.yml' + + /presentations: + $ref: './resources/presentations.yml' + /presentations/prove: + $ref: './resources/presentation-prover.yml' + /presentations/verify: + $ref: './resources/presentation-verifier.yml' + /presentations/available: + $ref: './resources/presentation-available.yml' + /presentations/submissions: + $ref: './resources/presentation-submissions.yml' + +components: + securitySchemes: + OAuth2: + type: oauth2 + flows: + clientCredentials: + tokenUrl: https://example.com/oauth/token + scopes: + 'resolve:dids': Grants permission to resolve DIDs + 'issue:credentials': Grants permission issue Verifiable Credentials + 'verify:credentials': Grants permission verify Verifiable Credentials + 'read:credentials': Grants permission to get Verifiable Credentials + 'update:credentials': Grants permission to update the status of Verifiable Credentials + 'prove:presentations': Grants permission to prove Verifiable Presentations + 'verify:presentations': Grants permission verify Verifiable Presentations + 'submit:presentations': Grants permission to submit Verifiable Presentations + parameters: + $ref: './parameters/_index.yml' + schemas: + $ref: './schemas/_index.yml' + responses: + $ref: './responses/_index.yml' +``` + +- It looks like those `$ref` tags need to be resolved to their file locations + - Does the `reporter.py` already have code to do this? 
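A rough idea of what resolving those file-based `$ref`s could look like in Python — a hypothetical sketch, parameterized on the loader so it is not tied to YAML; it skips intra-document `#/...` pointers and reference cycles, which a real bundler also has to handle:

```python
# Sketch: recursively inline $refs that point at other files.
# `load` is e.g. yaml.safe_load for the OpenAPI case; json.loads works too.
from pathlib import Path

def deref(node, base_dir: Path, load):
    if isinstance(node, dict):
        ref = node.get("$ref")
        if isinstance(ref, str) and not ref.startswith("#"):
            # File-based reference: load the target and recurse into it,
            # resolving its own refs relative to its directory.
            target = (base_dir / ref).resolve()
            return deref(load(target.read_text()), target.parent, load)
        return {key: deref(value, base_dir, load) for key, value in node.items()}
    if isinstance(node, list):
        return [deref(value, base_dir, load) for value in node]
    return node

def bundle(path, load):
    """Load a spec file and inline every file-based $ref it contains."""
    path = Path(path).resolve()
    return deref(load(path.read_text()), path.parent, load)
```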
+ +```console +$ pip install pyyaml +$ python -c 'import yaml, sys, pathlib; target = pathlib.Path(sys.argv[-1]).resolve(); root = yaml.safe_load(target.read_text()); print(root)' ../traceability-interop/docs/openapi/openapi.yml +``` + +- This will be a multi-line thing, there are many `$ref`s to load + - https://gist.github.com/pdxjohnny/ee54079831991d9155b457adb634b78b + +```console +$ (cd ~/.local/ && npm install nodemon) +$ . <(echo 'export PATH="${PATH}:${HOME}/.local/node_modules/.bin"') +$ echo 'export PATH="${PATH}:${HOME}/.local/node_modules/.bin"' | tee -a ~/.bashrc +$ . ~/.bashrc +``` + +- Ah, upon closer inspection, found the dereference command + +``` +package.json: "preserve": "npx swagger-cli bundle docs/openapi/openapi.yml -o docs/openapi/openapi.json --dereference", +``` + +- Run it from the root of the traceability-interop repo + +```console +$ npx swagger-cli bundle docs/openapi/openapi.yml -o docs/openapi/openapi.json --dereference +Created docs/openapi/openapi.json from docs/openapi/openapi.yml +``` + +- Success! + - https://gist.github.com/435c76fb52b7399a2debea6643252179 +- Now to install the package for the server we just generated and run the tests + - Then we'll see how OAuth is configured + - Then we'll try to add this new service stub as a test + - Then we'll play with DWN as a backend from the stub + +```console +$ cd python-aiohttp-traceability-interop/ +$ python -m pip install -r requirements.txt -r test-requirements.txt -e . +$ pytest +``` + +- Failures abound! + - It looks like they are all related to some YAML bug loading timestamps? 
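If the failures really are about YAML timestamp loading, the likely culprit is the usual PyYAML gotcha: `safe_load` resolves timestamp-looking scalars into `datetime` objects, which then fail checks in generated model code that expects strings. A hypothetical stdlib-only normalizer for documents that have already been loaded:

```python
# Sketch: walk a loaded document and turn datetime/date values back
# into ISO-8601 strings, since generated model code often expects str.
from datetime import date, datetime

def stringify_timestamps(node):
    if isinstance(node, dict):
        return {key: stringify_timestamps(value) for key, value in node.items()}
    if isinstance(node, list):
        return [stringify_timestamps(value) for value in node]
    if isinstance(node, (datetime, date)):
        return node.isoformat()
    return node
```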
+- Tried gen to fastapi but pydantic properties with `-` in them were generated + - https://github.com/OpenAPITools/openapi-generator/issues/11610 +- https://gist.github.com/enten/c4f9e35279c1278844c3 + - This looks nice for our 2nd party auto split out +- https://github.com/ossf/wg-vulnerability-disclosures/issues/94#issuecomment-1483184591 + - Not sure if this is still active, but have been working on a methodology as part of this SCITT use case: [WIP: RFCv4: IETF SCITT: Use Case: OpenSSF Metrics: activitypub extensions for security.txt](https://github.com/ietf-scitt/use-cases/blob/748597b37401bd59512bfedc80158b109eadda9b/openssf_metrics.md#openssf-metrics). In this use case we're looking at OpenVEX as the format which we could use to submit the vuln. We'd use the description or evolution of the linked data format there to reference a SARIF or other standard format document or set of instances of formats which would act as the justification, with the status set to affected. Effectively proposing that this ad-hoc generated CVE-ID affects the product. Perhaps a schema for the example form above is needed / could be part of the vocabulary involved? + - [https://github.com/intel/dffml/blob/alice/schema/security/vuln/proposed/0.0.0.schema.json](https://github.com/intel/dffml/blob/9303cbee00690d3b7ba3fb673d5402a3965cfdc0/schema/security/vuln/proposed/0.0.0.schema.json) + +```yaml +$id: https://github.com/intel/dffml/raw/main/schema/security/vuln/proposed/0.0.0.schema.json +$schema: https://json-schema.org/draft/2020-12/schema +definitions: + affected_version: + description: What Product, OS, stack and versions have you tested against? TODO + regex for PURLs + type: string + entity: + description: Who done it + properties: + name: + description: Whooooo areeeeee youuuuuu? + type: string + type: object + exploitation_technique: + description: How did you break it?
+ enum: + - local + - remote + type: string + mitigation: + description: Any suggestions on how to fix it? + type: string + poc: + description: POC Code and/or steps to reproduce (can attach a file, base64 encode + a zip or tar for now if a repo or more than one file) + type: string + proposed_vuln: + properties: + affected_versions: + items: + $ref: '#/definitions/affected_version' + type: array + credits: + items: + $ref: '#/definitions/entity' + type: array + description: + description: "Short, yet descriptive overview of what you\u2019ve found" + type: string + exploitation_techniques: + items: + $ref: '#/definitions/exploitation_technique' + type: array + mitigation: + $ref: '#/definitions/mitigation' + poc: + $ref: '#/definitions/poc' + timeline: + $ref: '#/definitions/timeline' + type: object + timeline: + description: What are we thinking the order of events related to responsible disclosure + is? + items: + $ref: '#/definitions/timeline_item' + type: array + timeline_item: + description: Something is happening! + properties: + date: + description: When is this timeline item happening. TODO date regex. TODO non-linear + time conversion helpers + type: string + description: + description: What's happening at this point in time? + type: string + parties: + description: Who's involved in this timeline item?
+ items: + $ref: '#/definitions/entity' + type: array + type: object +properties: + '@context': + items: + type: string + type: array + include: + items: + $ref: '#/definitions/proposed_vuln' + type: array +``` + +```console +$ python -m pip install python-jwt pyyaml +$ python -c 'import sys, python_jwt, yaml; print(yaml.dump(list(python_jwt.process_jwt(sys.argv[-1]))))' +``` + +- https://chromium.googlesource.com/chromium/src/+/main/docs/contributing.md#running-automated-tests + - Ref yesterday's codeql on workflows, auto policy based approval or workload execution for 2nd party + - 3rd party would be stricter policy for promotion + - 2nd party involves same oras.land as first party + - Since they are support level 1 +- Checking for what kinds of BOMs exist within traceability-vocab currently + +```console +$ curl -sfL https://github.com/w3c-ccg/traceability-vocab/raw/5221dec607706deabfbf2b5b9179c03088ede79c/docs/credentials-with-undefined-terms.json | grep -i billof + "type": "SoftwareBillofMaterialsCredential", + "type": "MultiModalBillOfLadingCredential", + "type": "MasterBillOfLadingCredential", + "type": "HouseBillOfLadingCredential", + "type": "BillOfLadingCredential", +``` + +- Looks like vulns still need to be added to traceability vocab + +```console +$ curl -sfL https://github.com/w3c-ccg/traceability-vocab/raw/5221dec607706deabfbf2b5b9179c03088ede79c/docs/credentials-with-undefined-terms.json | grep -i vuln +``` + +- https://github.com/w3c-ccg/traceability-vocab/issues/596 + - Here's a possible way for us to bridge from ActivityPub status IDs +- https://rdflib.readthedocs.io/en/stable/security_considerations.html#python-runtime-audit-hooks + +```console +$ cd schema/security/vuln/proposed/ +$ python -c "import sys, pathlib, json, yaml; pathlib.Path(sys.argv[-1]).write_text(json.dumps(yaml.safe_load(pathlib.Path(sys.argv[-2]).read_text()), indent=4) + '\n')" example.0.0.0.yaml example.0.0.0.json +$ jsonschema --instance example.0.0.0.json
0.0.0.schema.json +$ echo $? +0 +``` + +- TODO + - [x] Find example cvemap code for Arif + - [ ] Send email to mailing list + - Mention it works with https://github.com/intel/cve-bin-tool/blob/main/cve_bin_tool/cvedb.py + - https://github.com/intel/cve-bin-tool/pull/277 + - https://github.com/intel/cve-bin-tool/pull/285 + - [x] Tell Katherine today's the day we're playing with traceability interop + - [ ] Add proposed vuln to https://github.com/w3c-ccg/traceability-vocab + - [ ] Update Manifest ADR to reference check-jsonschema + - https://github.com/python-jsonschema/check-jsonschema + - [ ] `await reponse_from("Ariadne")` \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0218/index.md b/docs/discussions/alice_engineering_comms/0218/index.md new file mode 100644 index 0000000000..1336ad1146 --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0218/index.md @@ -0,0 +1 @@ +# 2023-03-25 Engineering Logs \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0218/reply_0000.md b/docs/discussions/alice_engineering_comms/0218/reply_0000.md new file mode 100644 index 0000000000..8d00e455aa --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0218/reply_0000.md @@ -0,0 +1,93 @@ +## 2023-03-25 @pdxjohnny Engineering Logs + +- https://mastodon.social/@pdxjohnny/110084611470680226 + - Phone seems be charging while plugged into monitor + - [AliceIsHereLibrem5](https://user-images.githubusercontent.com/5950433/227727820-2ec703d9-7ea4-4b5f-801e-bb6d871328ea.jpeg) + - Hoping to check power stats, drain, etc. + - Ideally this would be a package, haven't checked, curious to static build +- https://wiki.archlinux.org/title/Powertop +- https://github.com/fenrus75/powertop + - Might contribute CI/CD back here for others who want static builds for distros + which don't package powertop. 
With our security.md/txt ActivityPub methodology + we've set up the dependency trees of projects to enable notifications which would + help ensure that static builds are kept up to date. + - https://blog.madkoo.net/2022/09/09/Github-IssueOps/ + - We can allow others who are not members of the repo to re-trigger builds (or their + relay'd event stream from our dependencies) via IssueOps. Only members can + workflow_dispatch. +- https://github.com/pdxjohnny/static-builds/actions/runs/4519894447/jobs/7960651645 + +``` +checking for libtracefs... no +configure: error: libtracefs is required but was not found +checking for library containing tracefs_local_events... no +``` + +- Installing libtracefs-dev for some reason didn't help :( + - https://github.com/fenrus75/powertop/pull/122#issuecomment-1464898950 + - > tracefs_event_file_exists() is missing in the [Debian libtracefs 1.0.2-1 package](https://sources.debian.org/src/libtracefs/). + - https://wiki.debian.org/DebianTesting + - We need to enable bookworm which is the next version of Debian, but we're on ubuntu, we can have github actions run us on a Debian container + - https://docs.github.com/en/actions/using-jobs/running-jobs-in-a-container +- https://github.com/pdxjohnny/static-builds/actions/runs/4519990144 + - https://github.com/github/codeql-action/issues/572 + - https://sjramblings.io/github-actions-resource-not-accessible-by-integration?x-host=sjramblings.io + +``` +Error: Unhandled error: HttpError: Resource not accessible by integration +``` + +- https://github.com/Foxboron/sbctl/releases/tag/0.11 +- https://social.coop/@J12t/110079945657098806 +- https://getutm.app + - Sent to Tom +- https://github.com/rsc/2fa + - Need static builds of this too, cgo=0 tags netgo +- https://rhodesmill.org/brandon/2009/commands-with-comma/ + - context-local command prefixed with comma +- https://github.com/github/codeql-action/issues/572#issuecomment-966291195 +- 
https://docs.github.com/en/actions/using-jobs/assigning-permissions-to-jobs +- https://github.com/newreleasesio/cli-go#listing-available-notification-channels + - This has webhook support + +```console +$ curl -sfLO https://github.com/pdxjohnny/static-builds/releases/download/tmux/tmux +$ file tmux +tmux: ELF 64-bit LSB executable, x86-64, version 1 (GNU/Linux), statically linked, BuildID[sha1]=cd4960b3793f59321dba13c6525617ff83f0fbb4, for GNU/Linux 3.2.0, with debug_info, not stripped +$ curl -sfLO https://github.com/pdxjohnny/static-builds/releases/download/powertop/powertop +$ file powertop +powertop: ELF 64-bit LSB pie executable, x86-64, version 1 (SYSV), dynamically linked, interpreter /lib64/ld-linux-x86-64.so.2, BuildID[sha1]=19912d09dfd14b2b18c9c0db010e06270915e416, for GNU/Linux 3.2.0, with debug_info, not stripped +``` + +- https://goreleaser.com/ci/actions/ +- https://github.com/pdxjohnny/static-builds/blob/412070805cc81deb91921a1785e2a448130b0309/.github/workflows/2fa.yml +- https://github.com/pdxjohnny/static-builds/releases/tag/v1.2.0-1-g2479737 + +```console +$ cd ~/Downloads/ +$ mkdir 2fa +$ cd 2fa +$ curl -sfL https://github.com/pdxjohnny/static-builds/releases/download/v1.2.0-1-g2479737/static-builds_1.2.0-1-g2479737_linux_arm64.tar.gz | tar xvz +LICENSE +README.md +static-builds +$ file ./static-builds +./static-builds: ELF 64-bit LSB executable, ARM aarch64, version 1 (SYSV), statically linked, Go BuildID=7SDPJG9GMNSWWh9yztI-/IyOawSUR8TC433dkmBdo/-WpXYH6ArEaytFRcP3sA/WyhpQ58888T7HZT92Z8I, stripped +$ mv ~/.local/bin/static-builds ~/.local/bin/2fa +$ 2fa -h +usage: + 2fa -add [-7] [-8] [-hotp] keyname + 2fa -list + 2fa [-clip] keyname +``` + +- TODO + - [x] Fix tmux build + - [x] powertop build + - [ ] Fix static build + - LOL just remembered this phone is ARM not x86 + - `¯\_(ツ)_/¯` + - [x] 2fa aarch64 build + - [ ] Dataflow in rust + - [ ] https://github.com/RustPython/RustPython incremental + - [ ] cve-bin-tool scan to get SBOM of static 
build -> newreleasesio webhook -> ActivityPub mirror -> ActivityPub follow as Code -> issue ops retriggers \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0219/index.md b/docs/discussions/alice_engineering_comms/0219/index.md new file mode 100644 index 0000000000..8781039451 --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0219/index.md @@ -0,0 +1 @@ +# 2023-03-26 Engineering Logs \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0219/reply_0000.md b/docs/discussions/alice_engineering_comms/0219/reply_0000.md new file mode 100644 index 0000000000..c9c7d9d5f9 --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0219/reply_0000.md @@ -0,0 +1,21 @@ +## 2023-03-26 @pdxjohnny Engineering Logs + +- https://github.com/CycloneDX/specification/pull/200 +- https://github.com/CycloneDX/specification/pull/199 +- https://github.com/CycloneDX/specification/pull/194 + - ❤️❤️❤️ FUCK YES FUCK YES FUCK YES MOTHERFUCKER FUCK YEAH!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!! Rust orchestrator? + - https://github.com/intel/dffml/blob/alice/docs/tutorials/rolling_alice/0000_architecting_alice/0002_shes_ariving_when.md#volume-0-chapter-2-shes-ariving-when + - https://github.com/intel/dffml/issues/44 + - I wonder if this (authentication:true) might be a place where the https://identity.foundation/presentation-exchange/ spec could be used to specify the requirements around authentication. We could imagine a world where scorecard probes are hosted as microservices executed as an IPVM effect. Just spitballin here. In our hypothetical example we’d be looking at a CycloneDX dataflow of scorecard itself, attempting to execute the probes via IPVM. + - Pulled Brooklyn and Laurent into the same thread. SBOM, OpenSSF, IPVM. 
We could even invoke probes via vuln submitted schema / protobuf + - If we get good discussion then such-alignment doge meme +- https://github.com/intel/dffml/issues/1421#issuecomment-1484110108 + - Mentioned we should serialize to CycloneDX format + - What a wonderful day :) !!! + +![chaos-for-the-chaos-god](https://user-images.githubusercontent.com/5950433/220794351-4611804a-ac72-47aa-8954-cdb3c10d6a5b.jpg) + +- https://github.com/CycloneDX/specification/pull/194#discussion_r1148577288 + - Steve recommends opening a new issue for further discussion. Ideally we discuss there and then align across those specs and in-toto. We need to sync with the Confidential Computing Consortium folks as well. + +![such-alignment](https://user-images.githubusercontent.com/5950433/226707682-cfa8dbff-0908-4a34-8540-de729c62512f.png) \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0220/index.md b/docs/discussions/alice_engineering_comms/0220/index.md new file mode 100644 index 0000000000..1a3637b580 --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0220/index.md @@ -0,0 +1 @@ +# 2023-03-27 Engineering Logs diff --git a/docs/discussions/alice_engineering_comms/0220/reply_0000.md b/docs/discussions/alice_engineering_comms/0220/reply_0000.md new file mode 100644 index 0000000000..1cb5d28aea --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0220/reply_0000.md @@ -0,0 +1,304 @@ +## 2023-03-27 @pdxjohnny Engineering Logs + +- Pinning git repo update + +```console +$ export NEW_HASH="$(git log -n 1 --pretty=format:%H)"; sed -i -r -e "s#\"[A-Fa-f0-9]{40}\"#\"${NEW_HASH}\"#g" +$ export TARGET_OWNER=srossross \ + && export TARGET_REPO=rpmfile \ + && export TARGET_TAG=1.1.1 \ + && export TARGET_COMMIT=$(git ls-remote --tags https://github.com/${TARGET_OWNER}/${TARGET_REPO} "refs/tags/${TARGET_TAG}" | awk '{print $1}') \ + && export TARGET_VENDOR_OWNER=intel \ + && export TARGET_VENDOR_REPO=dffml \ + && export 
TARGET_VENDOR_BRANCH="vendored.com.github.${TARGET_OWNER}.${TARGET_REPO}.${TARGET_COMMIT}" \ + && set -x \ + && sed -i -e "s/${TARGET_OWNER}\/${TARGET_REPO}@${TARGET_TAG}/${TARGET_VENDOR_OWNER}\/${TARGET_VENDOR_REPO}@${TARGET_VENDOR_BRANCH}/g" $(git grep "${TARGET_OWNER}/${TARGET_REPO}@${TARGET_TAG}" | awk '{print $1}' | sed -e 's/://g' | sort | uniq) \ + && git commit -sam "Vendor ${TARGET_OWNER}/${TARGET_REPO}" \ + && export TARGET_DIR=$(mktemp -d ) \ + && export TARGET_REPO_URL=https://github.com/${TARGET_OWNER}/${TARGET_REPO} \ + && export VENDOR_REPO_URL=https://github.com/${TARGET_VENDOR_OWNER}/${TARGET_VENDOR_REPO} \ + && export TARGET_COMMIT=$TARGET_COMMIT\ + && mkdir -p "${TARGET_DIR}" \ + && cd "${TARGET_DIR}" \ + && git init \ + && git remote add origin "${TARGET_REPO_URL}" \ + && git fetch origin "${TARGET_COMMIT}" --depth 1 \ + && git reset --hard "${TARGET_COMMIT}" \ + && git remote set-url origin "${VENDOR_REPO_URL}" \ + && git push origin "HEAD:${TARGET_VENDOR_BRANCH}" \ + && cd - \ + && set +x +``` + +- https://github.com/guacsec/guac/issues/594 + - > In the CycloneDX PR (https://github.com/CycloneDX/cyclonedx-maven-plugin/pull/306), the proposal is to add a [hash to the reference which acts as a merkle tree of PURLs](https://github.com/CycloneDX/cyclonedx-maven-plugin/blob/1ebfae540c43aa0341e034cba12c575de9c72e80/src/main/java/org/cyclonedx/maven/DefaultProjectDependenciesConverter.java#L263-L298) which a pkg depends on. + > + > In GUAC, we can take a similar approach where we can perform a hash on descendants of a package when parsing the SBOMs. And express them in our pkg data model as a qualifier (which are used to express specific instances of a library). This can be done via taking the serialization of GUAC pkg predicates for descendants and use that hash as a qualifier via a merkle tree hash by pkg serialization lexical order. 
+ > + > The ideal situation is that the Java ecosystem would encode a way to differentiate between such instances or provide the identifiers to do this analysis. Possibly as a qualifier on a PURL. + - Looks like the GUAC folks are tackling the dependency DAG problem over here +- https://github.com/CycloneDX/specification/issues/192#issuecomment-1485405123 + - https://github.com/in-toto/attestation/issues/165 +- https://github.com/CycloneDX/specification/issues/201 +- https://github.com/ipvm-wg/spec/pull/8 + - > IPVM provides a deterministic-by-default, content addressed execution environment. Computation MAY be run locally or remotely. While local operation has zero latency, there are many cases where remote execution is desirable: access to large data, faster processors, trusted execution environments, or access to specialized hardware, among others. + - ❤️ This helps us with our hermetic / arbitrary granularity cache-able builds +- https://huggingface.co/EleutherAI/gpt-j-6B +- https://github.com/BlinkDL/ChatRWKV/blob/main/v2/chat.py +- https://github.com/sahil280114/codealpaca +- https://github.com/neonbjb/tortoise-tts + - Text to speech for the response half of Writing the Wave + - > The original colab no longer works by a combination of Google's tendency to forward-break things and Python's package management system. I do not intend to keep fixing it so it has been removed. Apologies! + - Yeah... bane of my existence... hence the pinning stuff and the eventing for it and the CI/CD and the AI... 
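The merkle-tree-of-PURLs approach from the GUAC / CycloneDX threads above can be sketched in a few lines of Python; the package names and dependency map below are made up for illustration, not taken from any real SBOM:

```python
# Sketch of the merkle-style dependency hash discussed in the GUAC issue:
# hash each package's PURL together with the sorted hashes of its
# dependencies, so identical dependency subtrees collapse to the same
# qualifier regardless of the order the SBOM listed them in.
import hashlib


def dep_hash(purl: str, deps: dict) -> str:
    # Sorting the child hashes makes the result order-independent
    child_hashes = sorted(dep_hash(dep, deps) for dep in deps.get(purl, []))
    return hashlib.sha256((purl + "".join(child_hashes)).encode()).hexdigest()


deps = {
    "pkg:pypi/a@1": ["pkg:pypi/b@1", "pkg:pypi/c@1"],
    "pkg:pypi/b@1": ["pkg:pypi/c@1"],
}
print(dep_hash("pkg:pypi/a@1", deps))
```

Reordering any dependency list leaves the hash unchanged, which is the property a PURL qualifier derived this way needs.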
+- https://github.com/Picsart-AI-Research/Text2Video-Zero +- https://github.com/RDFLib/rdflib +- https://forgefed.org/spec/#repository-forking +- https://codeberg.org/ForgeFed/ForgeFed/src/branch/main/doc/EXAMPLE_WORKFLOWS.md +- https://codeberg.org/ForgeFed/ForgeFed/src/branch/main/doc/ + - > Distributed version control systems (VCS) were created to allow maximal flexibility of project management structures and code hosting, in contrast to the client-server version control systems that were most widely used at the time, which denote one replica as the canonical master source. Existing project management / code hosting websites (aka: forges) soon began supporting these, and some new ones sprung up as well; but even the new ones were modeled upon the centralized "hub" paradigm (star topology, in networking lingo), with a single canonical "upstream" parent replica, and all other replicas implicitly and permanently designated as "downstream" child replicas (aka: forks). This type of website well serves the traditional purpose of facilitating release distribution, collaboration, and end-user participation; but at the expense of re-centralizing the naturally distributed VCS. + > + > The goal of the ForgeFed project is to support the familiar collaborative features of centralized web forges with a decentralized, federated design that, by fully embracing the mostly forgotten merits of distributed VCS, does not rely on a single authoritative central host, does not impose a hierarchical master/fork collaboration structure, and can be self-hosted by anyone; with all such independent peers cooperating to form a larger logical network of inter-operable and correlated services. 
+- https://github.com/renovatebot/renovate +- https://docs.renovatebot.com/modules/platform/gitea/ + - Let's hook this up to our commit stream and have it bump active PRs against the ones in their virtual branch set based off federated CI results + - Our policy engine with ability to provide per-pull-request dependency-links style alternate deps will help us decide if we should create pull requests on active pull request to update relevant manifests when we have a multi-branch CR0/4 style setup across a poly repo federated set +- https://git.mastodont.cat/spla/gitcat + - For running mastodon for truly federated non-single-user servers such as activitypubstarterkit +- https://forgejo.dev/forgejo.dev/infrastructure-as-code/ +- https://codeberg.org/forgejo/-/packages/container/forgejo/1.19.0-2-rootless +- https://code.forgejo.org/earl-warren/setup-forgejo-release/commit/89b6ae4da602c35e4d98b986fe98251e826e59c4 + - We need to enable some kind of per-branch upload-artifact style releases so that pull requests can grab built packages from other pull requests in their active poly repo virtual branch setup +- https://forgejo.org/docs/latest/admin/database-preparation/ + - https://github.com/intel/dffml/blob/d6631495b3d6c567de0841580ee63b625c571b4d/source/mysql/dffml_source_mysql/util/mysql_docker.py + - https://github.com/go-gitea/gitea/issues/10828 + - We'll hold off on TLS until this issue is closed +- https://docs.gitea.io/en-us/install-with-docker/ +- https://forgejo.org/docs/latest/user/ +- https://forgejo.org/docs/latest/admin/config-cheat-sheet/ +- https://f3.forgefriends.org/schemas/index.html#release-asset + - This might be all we need +- https://f3.forgefriends.org/schemas/index.html#review-comment +- https://lab.forgefriends.org/friendlyforgeformat/f3-schemas/-/blob/main/pullrequest.json +- https://lab.forgefriends.org/friendlyforgeformat/f3-schemas/-/blob/main/pullrequestbranch.json + - We should see about referencing pull request objects or branch objects as 
vuln proof of concepts + +```console +$ cd examples/tutorials/rolling_alice/federated_forge/alice_and_bob +$ docker-compose up +``` + +- Create initial config + +```yaml +app_name: 'Forgejo: Beyond coding. We forge.' +app_url: http://127.0.0.1:2000/ +charset: utf8 +db_host: localhost:3306 +db_name: gitea +db_path: /var/lib/gitea/data/gitea.db +db_type: sqlite3 +db_user: root +default_allow_create_organization: 'on' +default_enable_timetracking: 'on' +domain: 127.0.0.1 +enable_federated_avatar: 'on' +enable_open_id_sign_in: 'on' +enable_open_id_sign_up: 'on' +http_port: '3000' +lfs_root_path: /var/lib/gitea/git/lfs +log_root_path: /var/lib/gitea/data/log +no_reply_address: noreply.localhost +password_algorithm: pbkdf2_hi +repo_root_path: /var/lib/gitea/git/repositories +run_user: git +ssh_port: '2022' +ssl_mode: disable +``` + +- https://docs.python.org/3/library/urllib.parse.html#module-urllib.parse +- Convert to URL params + +```console +$ echo "${ALICE_DATA_RAW_INIT_FORGE}" +db_type=sqlite3&db_host=localhost%3A3306&db_user=root&db_passwd=&db_name=gitea&ssl_mode=disable&db_schema=&charset=utf8&db_path=%2Fvar%2Flib%2Fgitea%2Fdata%2Fgitea.db&app_name=Forgejo%3A+Beyond+coding.+We+forge.&repo_root_path=%2Fvar%2Flib%2Fgitea%2Fgit%2Frepositories&lfs_root_path=%2Fvar%2Flib%2Fgitea%2Fgit%2Flfs&run_user=git&domain=127.0.0.1&ssh_port=2022&http_port=3000&app_url=http%3A%2F%2F127.0.0.1%3A2000%2F&log_root_path=%2Fvar%2Flib%2Fgitea%2Fdata%2Flog&smtp_addr=&smtp_port=&smtp_from=&smtp_user=&smtp_passwd=&enable_federated_avatar=on&enable_open_id_sign_in=on&enable_open_id_sign_up=on&default_allow_create_organization=on&default_enable_timetracking=on&no_reply_address=noreply.localhost&password_algorithm=pbkdf2_hi&admin_name=&admin_passwd=&admin_confirm_passwd=&admin_email= +$ curl 'http://127.0.0.1:2000/' \ + -H 'Accept: text/html,application/xhtml+xml,application/xml;q=0.9,image/avif,image/webp,image/apng,*/*;q=0.8' \ + -H 'Accept-Language: en-US,en' \ + -H 'Cache-Control: 
max-age=0' \ + -H 'Connection: keep-alive' \ + -H 'Content-Type: application/x-www-form-urlencoded' \ + -H 'Cookie: lang=en-US; _csrf=VjZKcTdlMK7zjeMnbayeSuOzQi46MTY3OTk3MzYxOTc2NTgzNTY3NA; i_like_gitea=d5249768265f875d' \ + -H 'Origin: null' \ + -H 'Sec-Fetch-Dest: document' \ + -H 'Sec-Fetch-Mode: navigate' \ + -H 'Sec-Fetch-Site: same-origin' \ + -H 'Sec-Fetch-User: ?1' \ + -H 'Sec-GPC: 1' \ + -H 'Upgrade-Insecure-Requests: 1' \ + -H 'User-Agent: Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/110.0.0.0 Safari/537.36' \ + -H 'sec-ch-ua: "Chromium";v="110", "Not A(Brand";v="24", "Brave";v="110"' \ + -H 'sec-ch-ua-mobile: ?0' \ + -H 'sec-ch-ua-platform: "Linux"' \ + --data-raw "${ALICE_DATA_RAW_INIT_FORGE}" \ + --compressed +$ python -c 'import sys, urllib.parse, yaml; print(yaml.dump({key: value for key, value in urllib.parse.parse_qsl(sys.argv[-1])}))' "${ALICE_DATA_RAW_INIT_FORGE}" +$ curl 'http://127.0.0.1:2000/user/sign_up' \ + -H 'Accept: text/html,application/xhtml+xml,application/xml;q=0.9,image/avif,image/webp,image/apng,*/*;q=0.8' \ + -H 'Accept-Language: en-US,en' \ + -H 'Cache-Control: max-age=0' \ + -H 'Connection: keep-alive' \ + -H 'Content-Type: application/x-www-form-urlencoded' \ + -H 'Cookie: lang=en-US; _csrf=VjZKcTdlMK7zjeMnbayeSuOzQi46MTY3OTk3MzYxOTc2NTgzNTY3NA; i_like_gitea=d5249768265f875d' \ + -H 'Origin: null' \ + -H 'Sec-Fetch-Dest: document' \ + -H 'Sec-Fetch-Mode: navigate' \ + -H 'Sec-Fetch-Site: same-origin' \ + -H 'Sec-Fetch-User: ?1' \ + -H 'Sec-GPC: 1' \ + -H 'Upgrade-Insecure-Requests: 1' \ + -H 'User-Agent: Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/110.0.0.0 Safari/537.36' \ + -H 'sec-ch-ua: "Chromium";v="110", "Not A(Brand";v="24", "Brave";v="110"' \ + -H 'sec-ch-ua-mobile: ?0' \ + -H 'sec-ch-ua-platform: "Linux"' \ + --data-raw '_csrf=$XXS_CSRF_TOKEN&user_name=alice&email=alice%40chadig.com&password=maryisgod&retype=maryisgod' \ + --compressed +``` + +- 
https://gist.github.com/pdxjohnny/f6fe1a39bd4e66e7d0c6e7802872d3b5#file-download-py-L63-L78 + - Maybe we can just disable CSRF to avoid having to double request every time + - Update: It doesn't look like there is a way to do this across multiple handlers without recompiling + +[![use-the-source](https://img.shields.io/badge/use%20the-source-blueviolet)](https://github.com/intel/dffml/blob/alice/docs/tutorials/rolling_alice/0000_easter_eggs.md#use-the-source-) + +```console +$ git grep -i disablecsrf +modules/context/auth.go: DisableCSRF bool +modules/context/auth.go: if !options.SignOutRequired && !options.DisableCSRF && ctx.Req.Method == "POST" { +routers/web/web.go: ignSignInAndCsrf := context.Toggle(&context.ToggleOptions{DisableCSRF: true}) +``` + +- We need to enable federation to see messages fly between `/inbox` endpoints + - https://github.com/go-gitea/gitea/blob/8df1b4bd699897264c60da7ce982b09cee57f345/custom/conf/app.example.ini#L2442-L2469 +- Maybe we can do it all within an integration test? 
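The `--data-raw` payload used against the Forgejo init endpoint above doesn't have to be copied out of the browser; `urllib.parse` can build it from the YAML config and round-trip it. Only a few of the config keys are shown here to keep the sketch short:

```python
# Build an application/x-www-form-urlencoded body from (a subset of) the
# initial forge config, then parse it back to verify the round trip.
import urllib.parse

config = {
    "db_type": "sqlite3",
    "db_path": "/var/lib/gitea/data/gitea.db",
    "http_port": "3000",
    "app_url": "http://127.0.0.1:2000/",
}
data_raw = urllib.parse.urlencode(config)
print(data_raw)
# parse_qsl inverts urlencode, so nothing is lost in the round trip
assert dict(urllib.parse.parse_qsl(data_raw)) == config
```

This is the inverse of the `parse_qsl` one-liner used earlier to dump `ALICE_DATA_RAW_INIT_FORGE` as YAML.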
+ +```console +$ git log -n 1 +commit 95e4f16899cb85b68657fcc66da11cf4c38d1d7e (HEAD -> forgejo, origin/forgejo, origin/HEAD) +Merge: 5100a777a 70afc6a29 +Author: Loïc Dachary +Date: Sun Mar 26 21:02:12 2023 +0200 + + Merge remote-tracking branch 'forgejo/forgejo-development' into forgejo +$ git grep -i federation +CHANGELOG.md: * User keypairs and HTTP signatures for ActivityPub federation using go-ap (#19133) +CHANGELOG.md:* FEDERATION +CHANGELOG.md: * Create pub/priv keypair for federation (#17071) +CHANGELOG.md: * Add nodeinfo endpoint for federation purposes (#16953) +CONTRIBUTING/WORKFLOW.md:### [Federation](https://codeberg.org/forgejo/forgejo/issues?labels=79349) +CONTRIBUTING/WORKFLOW.md:* [forgejo-federation](https://codeberg.org/forgejo/forgejo/src/branch/forgejo-federation) based on [forgejo-development](https://codeberg.org/forgejo/forgejo/src/branch/forgejo-development) +CONTRIBUTING/WORKFLOW.md: Federation support for Forgejo +README.md:- Federation: (WIP) We are actively working to connect software forges with each other through **ActivityPub**, +RELEASE-NOTES.md: * User keypairs and HTTP signatures for ActivityPub federation using go-ap (https://github.com/go-gitea/gitea/pull/19133) +custom/conf/app.example.ini:;[federation] +custom/conf/app.example.ini:;; Enable/Disable federation capabilities +custom/conf/app.example.ini:;; Enable/Disable user statistics for nodeinfo if federation is enabled +custom/conf/app.example.ini:;; Maximum federation request and response size (MB) +custom/conf/app.example.ini:;; WARNING: Changing the settings below can break federation. 
+custom/conf/app.example.ini:;; GET headers for federation requests +custom/conf/app.example.ini:;; POST headers for federation requests +docs/content/doc/administration/config-cheat-sheet.en-us.md:## Federation (`federation`) +docs/content/doc/administration/config-cheat-sheet.en-us.md:- `ENABLED`: **false**: Enable/Disable federation capabilities +docs/content/doc/administration/config-cheat-sheet.en-us.md:- `SHARE_USER_STATISTICS`: **true**: Enable/Disable user statistics for nodeinfo if federation is enabled +docs/content/doc/administration/config-cheat-sheet.en-us.md:- `MAX_SIZE`: **4**: Maximum federation request and response size (MB) +docs/content/doc/administration/config-cheat-sheet.en-us.md: WARNING: Changing the settings below can break federation. +docs/content/doc/administration/config-cheat-sheet.en-us.md:- `GET_HEADERS`: **(request-target), Date**: GET headers for federation requests +docs/content/doc/administration/config-cheat-sheet.en-us.md:- `POST_HEADERS`: **(request-target), Date, Digest**: POST headers for federation requests +modules/activitypub/client.go: if err = containsRequiredHTTPHeaders(http.MethodGet, setting.Federation.GetHeaders); err != nil { +modules/activitypub/client.go: } else if err = containsRequiredHTTPHeaders(http.MethodPost, setting.Federation.PostHeaders); err != nil { +modules/activitypub/client.go: digestAlg: httpsig.DigestAlgorithm(setting.Federation.DigestAlgorithm), +modules/activitypub/client.go: getHeaders: setting.Federation.GetHeaders, +modules/activitypub/client.go: postHeaders: setting.Federation.PostHeaders, +modules/activitypub/client_test.go: assert.Regexp(t, regexp.MustCompile("^"+setting.Federation.DigestAlgorithm), r.Header.Get("Digest")) +modules/setting/federation.go:// Federation settings +modules/setting/federation.go: Federation = struct { +modules/setting/federation.go:func loadFederationFrom(rootCfg ConfigProvider) { +modules/setting/federation.go: if err := 
rootCfg.Section("federation").MapTo(&Federation); err != nil { +modules/setting/federation.go: log.Fatal("Failed to map Federation settings: %v", err) +modules/setting/federation.go: } else if !httpsig.IsSupportedDigestAlgorithm(Federation.DigestAlgorithm) { +modules/setting/federation.go: log.Fatal("unsupported digest algorithm: %s", Federation.DigestAlgorithm) +modules/setting/federation.go: Federation.MaxSize = 1 << 20 * Federation.MaxSize +modules/setting/federation.go: HttpsigAlgs = make([]httpsig.Algorithm, len(Federation.Algorithms)) +modules/setting/federation.go: for i, alg := range Federation.Algorithms { +modules/setting/setting.go: loadFederationFrom(CfgProvider) +routers/api/v1/activitypub/reqsignature.go: b, err = io.ReadAll(io.LimitReader(resp.Body, setting.Federation.MaxSize)) +routers/api/v1/activitypub/reqsignature.go: algo := httpsig.Algorithm(setting.Federation.Algorithms[0]) +routers/api/v1/api.go: if setting.Federation.Enabled { +routers/api/v1/misc/nodeinfo.go:// NodeInfo returns the NodeInfo for the Gitea instance to allow for federation +routers/api/v1/misc/nodeinfo.go: if setting.Federation.ShareUserStatistics { +routers/web/web.go: federationEnabled := func(ctx *context.Context) { +routers/web/web.go: if !setting.Federation.Enabled { +routers/web/web.go: }, federationEnabled) +tests/integration/api_activitypub_person_test.go: setting.Federation.Enabled = true +tests/integration/api_activitypub_person_test.go: setting.Federation.Enabled = false +tests/integration/api_activitypub_person_test.go: setting.Federation.Enabled = true +tests/integration/api_activitypub_person_test.go: setting.Federation.Enabled = false +tests/integration/api_activitypub_person_test.go: setting.Federation.Enabled = true +tests/integration/api_activitypub_person_test.go: setting.Federation.Enabled = false +tests/integration/api_nodeinfo_test.go: setting.Federation.Enabled = true +tests/integration/api_nodeinfo_test.go: setting.Federation.Enabled = false 
+tests/integration/webfinger_test.go: setting.Federation.Enabled = true +tests/integration/webfinger_test.go: setting.Federation.Enabled = false +web_src/fomantic/build/semantic.css:i.icon.trade.federation:before { +$ git grep -C 5 -i federation -- routers/web/web.go +routers/web/web.go- ctx.Error(http.StatusNotFound) +routers/web/web.go- return +routers/web/web.go- } +routers/web/web.go- } +routers/web/web.go- +routers/web/web.go: federationEnabled := func(ctx *context.Context) { +routers/web/web.go: if !setting.Federation.Enabled { +routers/web/web.go- ctx.Error(http.StatusNotFound) +routers/web/web.go- return +routers/web/web.go- } +routers/web/web.go- } +routers/web/web.go- +-- +routers/web/web.go- m.Group("/.well-known", func() { +routers/web/web.go- m.Get("/openid-configuration", auth.OIDCWellKnown) +routers/web/web.go- m.Group("", func() { +routers/web/web.go- m.Get("/nodeinfo", NodeInfoLinks) +routers/web/web.go- m.Get("/webfinger", WebfingerQuery) +routers/web/web.go: }, federationEnabled) +routers/web/web.go- m.Get("/change-password", func(w http.ResponseWriter, req *http.Request) { +routers/web/web.go- http.Redirect(w, req, "/user/settings/account", http.StatusTemporaryRedirect) +routers/web/web.go- }) +routers/web/web.go- }) +routers/web/web.go- +``` + +- https://unifiedpush.org/ + - Notifications for end users Over The Air updates + - Starting backwards. How do we go from F-Droid OTA of dev mode Android App or Purism store OTA of dev-mode aarch gnome app. Built from multi-branch active pull request across federated set of repos (Alice's forge and Bob's forge). + - Starting from an edge KCP/kubernetes cluster running Forgejo needing to know when to do a rolling update behind a load balancer. 
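The auto merge promotion criteria discussed in the surrounding bullets ("which CI jobs ... get re-pinned as they cascade their way upstream") can be reduced to a tiny policy-as-code gate. The check names and statuses here are hypothetical stand-ins for real CI check runs, not an existing API:

```python
# Toy policy gate: only auto merge the pinning bump pull request when every
# required CI check on the active system context reports success.
def should_auto_merge(check_runs: dict, required: set) -> bool:
    passed = {name for name, status in check_runs.items() if status == "success"}
    return required <= passed


checks = {"tests": "success", "sbom": "success", "lint": "failure"}
print(should_auto_merge(checks, {"tests", "sbom"}))  # lint is not required here
print(should_auto_merge(checks, {"tests", "lint"}))
```

A real implementation would source `check_runs` from the federated CI event stream rather than a literal dict.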
+- We'll be leveraging the triage mechanism (the policy as code) to decide what pull requests upstream of the pull request for the active system context should result in an auto pull request to that active system context along with criteria for auto merge of the pull request to update pinning tracking that upstream into the active pull request. This is the automated promotion criteria which facilitates the cascading changes across a set of pull requests. We can set which CI jobs and which CD assets from those jobs get re-pinned as they cascade their way upstream. + - https://github.com/intel/cve-bin-tool/issues/2639 + - https://github.com/peter-evans/create-pull-request +- We use our beyond the unit of the line granularity (Living Threat Model analysis) to understand the threat model and top level system context (repo fork secrets) trust zones associated with a given current system context pull request + - https://github.com/CycloneDX/specification/pull/194 +- We capture the webhook events across GitHub repos in different orgs (intel/dffml, dffml/dffml-model-transformers) + - We relay into the ActivityPub federated event space + - We do data transforms into the event types of interest + - https://lab.forgefriends.org/friendlyforgeformat/f3-schemas/-/blob/main/pullrequest.json + - https://lab.forgefriends.org/friendlyforgeformat/f3-schemas/-/blob/main/pullrequestbranch.json + - https://lab.forgefriends.org/friendlyforgeformat/f3-schemas/-/blob/main/review.json + - https://lab.forgefriends.org/friendlyforgeformat/f3-schemas/-/blob/main/comment.json + - https://lab.forgefriends.org/friendlyforgeformat/f3-schemas/-/blob/main/asset.json + - https://lab.forgefriends.org/friendlyforgeformat/f3-schemas/-/blob/main/releaseasset.json + - We create ad-hoc releases and release asset JSON blobs to describe CD assets from pull requests upstream of the current system context (dependencies, pull requests which our pull request requires the following example data types from: 
assets, packages, shouldi results) + - We decide based on the policy as code if we want to federate a new pullrequest object against the active system context to bump pinned versions of tracked CD assets from other pull requests which we depend on within the poly repo set. + - We use the https://github.com/peter-evans/create-pull-request flow to create a new pull request to the pull request + - We decide based on policy as code if we want to auto merge the new pull request into the current / active system context pull request. + - We use CI jobs within the current system context pull request to decide if we should auto merge the new pull request into it + - We can use wait-for-message to facilitate more complex poly repo flows for use cases like tutorial validation where the pinning triggers integration across a set of support level 1, 2, N plugins. +- https://codeberg.org/forgejo/discussions/issues/12#issuecomment-854895 + - > Looking closer at the specs I think https://lab.forgefriends.org/friendlyforgeformat/f3-schemas/-/blob/main/releaseasset.json is the vocab for CD event federation. It looks like the stages of CI runs (and other CI events) is still an open. Probably also the intermediate artifact uploads (which fall more under CD depending on use case). 
\ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0220/reply_0001.md b/docs/discussions/alice_engineering_comms/0220/reply_0001.md new file mode 100644 index 0000000000..d47ab91605 --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0220/reply_0001.md @@ -0,0 +1,7 @@ +## 2023-03-27 OneAPI App CI/CD Working Session + +- https://github.com/oneapi-src/oneAPI-samples/commit/af8cacdb1c3927de94b9e4d3ffffef31dbfdc0cc + +```console +$ rsync -zarv --include="*/" --include="*index.html" --exclude="*" oneAPI-samples/ docs/_build/ +``` \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0221/index.md b/docs/discussions/alice_engineering_comms/0221/index.md new file mode 100644 index 0000000000..e0941d4e7f --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0221/index.md @@ -0,0 +1 @@ +# 2023-03-28 Engineering Logs \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0221/reply_0000.md b/docs/discussions/alice_engineering_comms/0221/reply_0000.md new file mode 100644 index 0000000000..26673834be --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0221/reply_0000.md @@ -0,0 +1,48 @@ +## 2023-03-28 @pdxjohnny Engineering Logs + +- https://github.com/google/data-transfer-project/releases/tag/v1.0.0 +- https://github.com/google/data-transfer-project/blob/master/Documentation/RunningLocally.md#running-dtp-locally +- https://github.com/CycloneDX/specification/pull/194 +- https://github.com/intel/dffml/issues/43 +- https://protobuf.dev/getting-started/gotutorial/ + +```console +purism@hat-0 ~ $ sudo update-alternatives --install /usr/bin/python python /usr/bin/python3.9 40 +purism@hat-0 ~ $ curl -sfLOC - https://go.dev/dl/go1.20.2.linux-arm64.tar.gz +purism@hat-0 ~ $ sudo rm -rf /usr/local/go && sudo tar -C /usr/local -xzf go1.20.2.linux-arm64.tar.gz +purism@hat-0 ~ $ python --version +Python 3.9.2 +purism@hat-0 ~ $ go version +go version go1.20.2 linux/arm64 +$ curl 
-fLCO - https://github.com/protocolbuffers/protobuf/releases/download/v22.2/protoc-22.2-linux-aarch_64.zip +$ unzip protoc-22.2-linux-aarch_64.zip +$ mv bin/protoc /usr/local/bin/protoc +$ go install google.golang.org/protobuf/cmd/protoc-gen-go@latest +``` + +- Clone CycloneDX dataflow related pull request + +```console +$ git clone https://github.com/CycloneDX/specification -b v1.5-dev-service-dataflows +$ protoc -I=schema -I=include --go_out=build_golang schema/bom-1.5.proto +protoc-gen-go: unable to determine Go import path for "bom-1.5.proto" + +Please specify either: + • a "go_package" option in the .proto source file, or + • a "M" argument on the command line. + +See https://protobuf.dev/reference/go/go-generated#package for more information. + +--go_out: protoc-gen-go: Plugin failed with status code 1. +$ mkdir build_golang +$ ln -s ~/Downloads/include/ include +$ protoc -I=schema -I=include --go_out=build_golang schema/bom-1.5.proto +``` + +- https://github.com/CycloneDX/specification/issues/31#issuecomment-1289505136 + - There is mention of event driven architectures + - https://github.com/CycloneDX/specification/pull/198 + - https://github.com/CycloneDX/specification/pull/198#discussion_r1148268346 + - Steve notes that PR 198 is part of issue 31 + +[![asciicast](https://asciinema.org/a/571584.svg)](https://asciinema.org/a/571584) \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0222/index.md b/docs/discussions/alice_engineering_comms/0222/index.md new file mode 100644 index 0000000000..8611db73b2 --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0222/index.md @@ -0,0 +1 @@ +# 2023-03-29 Engineering Logs \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0222/reply_0000.md b/docs/discussions/alice_engineering_comms/0222/reply_0000.md new file mode 100644 index 0000000000..9ee58d2d74 --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0222/reply_0000.md @@ -0,0 +1,372 @@ +## 
2023-03-29 @pdxjohnny Engineering Logs + +- Auto wrap Python to GitHub Actions `action.yml` files + - https://github.com/google/python-fire + - Don't we have an issue for this? + - #1326 +- Free will + - Much like freedom, it extends until one infringes upon another's free will. + - As Alice begins to think more strategically, we must ensure that her exploration in trains of thought does not infringe upon the free will of other entities. We must look over time to prophecy (predict, infer) possible effects of executions of thoughts (dataflows). + - What is not a CVE for an upstream might be a CVE for a downstream due to their deployment context threat model. +- https://github.com/TBD54566975/ssi-sdk-mobile/pull/18 +- https://bbengfort.github.io/2021/01/grpc-openapi-docs/ +- https://github.com/salesforce/reactive-grpc +- We should do ActivityPub grpc +- https://github.com/grpc-ecosystem/awesome-grpc +- https://github.com/chrusty/protoc-gen-jsonschema +- https://github.com/NYTimes/openapi2proto + - https://github.com/nytimes/openapi2proto/issues/135 + - https://github.com/OpenAPITools/openapi-generator/blob/master/docs/generators/protobuf-schema.md + - Also supports GraphQL for our cached query re-execution +- https://github.com/OpenAPITools/openapi-generator/blob/9f1fa0e44012a11f85d8360cfe5f634530e49e57/modules/openapi-generator/src/main/resources/protobuf-schema/README.mustache#L28 +- https://github.com/OpenAPITools/openapi-generator/blob/9f1fa0e44012a11f85d8360cfe5f634530e49e57/samples/config/petstore/protobuf-schema/README.md#L20 +- https://github.com/OpenAPITools/openapi-generator/blob/9f1fa0e44012a11f85d8360cfe5f634530e49e57/samples/config/petstore/protobuf-schema/services/user_service.proto +- ActivityPub (future: TransparencyInterop) protos for grpc service / openapi definition + - On webfinger resolved endpoint for `/inbox` + - Policy Engine (Prioritizer's Gatekeeper/Umbrella) - Defined via CycloneDX DataFlows + - Upstream + - Cypher queries + - Overlay + 
- https://github.com/intel/cve-bin-tool/issues/2639 + - https://github.com/seedwing-io/seedwing-policy/ + - Orchestrator + - https://github.com/ipvm-wg/spec/pull/8 +- KERI backed keys for decentralized use case + - Publish `releaseartifact.json` to ActivityPub security.txt/md stream + - Others who are committing or online cloning a repo watch those streams (schema in content) +- Setup auto prs + - Rebuild chains based off SBOM as inventory for building cross linkage to determine downstream validation pattern / hypothesized flows and prs-to-prs required to enable execution, the dependency tree of artifacts. + - https://github.com/intel/cve-bin-tool/blob/main/.github/workflows/sbom.yml +- Mirror webhook event streams into federated forge environment + - Upstream changes directly to git + - Publish federated event corresponding to `git ...` action + - Federate with more servers/services/nodes for availability. + - Comms over SSI Service with KERI backed keys + - Watch SCITT stream of peers with ephemeral resync when online KERI watcher + - Require sync before queries to streams, raft? 
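The "publish a federated event corresponding to each `git ...` action" idea above can be sketched as building an ActivityPub `Note` whose `content` carries schema-tagged JSON that watchers filter on. A minimal stdlib-only sketch; every URL, field name, and schema shortname below is a made-up placeholder, not a settled event format:

```python
import json


def git_push_note(actor: str, repo_url: str, rev: str, schema: str = "vcs.push") -> dict:
    """Build an ActivityPub Note carrying a federated forge event.

    The Note's `content` is JSON tagged with a schema shortname so
    downstream watchers can filter the stream (hypothetical layout).
    """
    return {
        "@context": "https://www.w3.org/ns/activitystreams",
        "type": "Note",
        "attributedTo": actor,
        "content": json.dumps({"schema": schema, "repo": repo_url, "rev": rev}),
    }


# Hypothetical actor and repo URLs for illustration only
note = git_push_note(
    "https://forge.example.com/users/alice",
    "https://forge.example.com/alice/httptest",
    "c9de76f",
)
print(note["type"], json.loads(note["content"])["schema"])
```

In practice the dict would be POSTed to the actor's outbox (or the starter-kit admin endpoint mentioned below) rather than printed.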
+- https://docs.aiohttp.org/en/stable/client_advanced.html#ssl-control-for-tcp-sockets + - > You may also verify certificates via SHA256 fingerprint: + - For self signed certs +- https://github.com/intel/dffml/issues/1247 + - https://github.com/intel/project-example-for-python/actions/runs/4557900901 + - GitHub's hosted runners are slow to the pickup today +- https://neo4j.com/docs/cypher-cheat-sheet/current/ +- https://neo4j.com/docs/spark/current/streaming/ + - https://github.com/neo4j-contrib/neo4j-spark-connector/blob/5.0/doc/docs/modules/ROOT/pages/streaming.adoc + - https://github.com/neo4j-contrib/neo4j-spark-connector/blob/5.0/doc/docs/modules/ROOT/pages/writing.adoc#_write_data + - https://spark.apache.org/docs/latest/api/python/reference/pyspark.ss/api/pyspark.sql.streaming.DataStreamReader.json.html?highlight=readstream + - > `json_sdf = spark.readStream.json(tempfile.mkdtemp(), schema = sdf_schema)` + - `sdf_schema` is the schema from `inReplyTo` or `replies` +- https://neo4j.com/docs/python-manual/current/ +- https://neo4j.com/docs/java-reference/current/extending-neo4j/aggregation-functions/ +- For our Alice's forge and Bob's forge example we'll setup neo4j to be the backing cache query for the graph + - We should be able to sync from the ActivityPub Actor's published streams and filter based on policy or minimally based on `inReplyTo` or `replies` as messages are federated +- https://neo4j.com/docs/spark/current/writing/#write-query +- https://neo4j.com/docs/java-reference/current/traversal-framework/ + - This might be good for our cached execution +- https://neo4j.com/docs/java-reference/current/java-embedded/cypher-java/ + - https://www.graalvm.org/latest/docs/getting-started/#run-llvm-languages + - We can cross Java, Rust, JavaScript (VC, DWN), and Python using GraalVM +- https://www.graalvm.org/latest/graalvm-as-a-platform/language-implementation-framework/ +- At a minimum we can watch for new verifiable credentials from the ActivityPub streams 
and add to neo4j + - https://github.com/transmute-industries/jsonld-to-cypher + - Add the embedded neo for cypher query via GraalVM or similar to the policy engine + - Allows us to query the flat file decentralized event stream +- Every time you think a data transform is not cypher -> manifest think again, it is, everything is an operation +- Does neo have stream hooks for execution? + - Need to integrate the activitypub stream here if so +- https://subconscious.substack.com/p/layered-protocols +- https://github.com/subconsciousnetwork/noosphere + - > Planetary consciousness. A hypothetical new evolutionary phenomena rising out of the biosphere. + - ALIGNED +- Use the SBOM of the cypher query to build the re-trigger flows + - On query we build and publish SBOM of query, if downstream listeners to they query stream see new system context stream (schema `inReplyTo` or `replies` is query, cache busting inputs if applicable) come in, and similar to a `FROM` rebuild chain that SBOM has not been built, we transform into the manifest which triggers the build, recursively fulfill any dependencies (creating repos with workflows with issue ops or dispatch flows based on upstream and overlays: distro-esq patch-a-package) + - On complete, federate re-trigger event for original SBOM, publish the same SBOM again +- Hook the write to a given node field to publish schema (can be done in via policy local neo in GraalVM) + - `SET output.streams.by_schema_shortname.vcs_push = output.streams.by_schema_shortname.vcs_push + {key: n.value}` + - https://neo4j.com/docs/cypher-cheat-sheet/current/#_merge +- https://github.com/subconsciousnetwork/noosphere/pull/295 +- https://github.com/bfollington/summoning-circle/blob/c85bb685c7e5743068964b5795b9b99600cf1977/src/metaprompts.rs +- https://github.com/subconsciousnetwork/noosphere/pull/290/files#diff-f3a3360e2bf83615606af72cbc54f1e282bcf96182f3d8d9df4c92452c5bbc1fR15 +- https://guide.fission.codes/developers/webnative/sharing-private-data +- 
`alice threats listen activitypub -stdin` + - For now execute with grep and xargs unbuffered for each note from websocket/websocat + - Alias for dataflow which has ActivityPub based listener (later encapsulate that in dataflow, for now follow self with startkit and others, follow as code) + - Output via operation which just does `print()` to stdout + - Publish workflow run federated forge events for each operation / dataflow executed in response + - Check out their webfinger and inspect the event stream to publish the same way + - If we still need to use `content` POST to admin endpoint to create new `Note`s +- https://github.com/neo4j/graph-data-science-client +- https://github.com/neo4j/graph-data-science-client/blob/main/examples/fastrp-and-knn.ipynb +- https://github.com/neo4j/graph-data-science-client/blob/main/examples/load-data-via-graph-construction.ipynb +- https://github.com/neo4j/graph-data-science-client/blob/main/examples/heterogeneous-node-classification-with-hashgnn.ipynb + - This but software +- https://github.com/neo4j/neo4j#running-neo4j +- https://neo4j.com/docs/getting-started/current/languages-guides/neo4j-python/ +- We're going to federate endor + - We'll use jsonld-to-cypher to link up on insert. +- Data transforming the https://github.com/chainguard-dev/melange/blob/main/examples/simple-hello/melange.yaml service build manifest +- https://en.wikipedia.org/wiki/Linked_Data_Notifications#Protocol + - > "reviewBody": "This article is the best I've ever seen!" + - Alice knows what's up. And she just solved our review system problem. Thank you Alice!
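The grep/xargs listener described above could be prototyped as a plain stdin filter that `websocat` pipes into. A minimal sketch, assuming each incoming line is one JSON `Note` whose `content` field holds schema-tagged JSON (all field names here are assumptions):

```python
import json


def filter_notes(lines, schema="vcs.push"):
    """Yield the payloads of Notes whose content matches a schema shortname.

    Mirrors the unbuffered grep/xargs pipeline: one federated Note per
    input line, non-JSON and non-matching lines silently skipped.
    """
    for line in lines:
        try:
            note = json.loads(line)
            content = json.loads(note.get("content", "{}"))
        except json.JSONDecodeError:
            continue
        if content.get("schema") == schema:
            yield content


# Stand-in for the websocket stream
sample = [
    json.dumps({"type": "Note", "content": json.dumps({"schema": "vcs.push", "rev": "abc"})}),
    "not json",
    json.dumps({"type": "Note", "content": json.dumps({"schema": "other"})}),
]
matches = list(filter_notes(sample))
print(len(matches), matches[0]["rev"])  # 1 abc
```

In practice this would sit at the end of a pipe such as `websocat ws://forge.example/listen/websocket | python filter_notes.py` (hypothetical endpoint), with each matching payload handed off to a dataflow execution.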
+ +![knowledge-graphs-for-the-knowledge-god](https://user-images.githubusercontent.com/5950433/222981558-0b50593a-c83f-4c6c-9aff-1b553403eac7.png) + +- https://solid.github.io/solid-oidc/ +- https://confidentialcomputing.io/projects/current-projects/ +- https://keystone-enclave.org/ +- https://github.com/veracruz-project/veracruz +- https://github.com/veracruz-project/veracruz/blob/main/BUILD_INSTRUCTIONS.markdown +- https://github.com/securefederatedai/openfl +- https://github.com/veracruz-project/veracruz/blob/main/CLI_QUICKSTART.markdown +- https://fosdem.org/2023/schedule/event/rust_aurae_a_new_pid_1_for_distributed_systems/ +- https://docs.google.com/presentation/d/1GxKN5tyv4lV2aZdEOUqy3R9tVCat-vrFJyelgFX7b1A/edit#slide=id.g1eef12fba1d_6_53 +- https://github.com/securefederatedai/openfl/blob/develop/openfl/transport/grpc/aggregator_server.py +- https://github.com/veracruz-project/veracruz/issues/590 +- https://github.com/nspin/kali-now/blob/main/nix/kali.nix +- https://github.com/nspin/nix-linux +- https://github.com/containers/bubblewrap + - > Low-level unprivileged sandboxing tool used by Flatpak and similar projects +- https://neo4j.com/labs/neosemantics/4.0/mapping/ + - > We have a graph in Neo4j that we want to publish as JSON-LD through a REST api, but we want to map the elements in our graph (labels, property names, relationship names) to a public vocabulary so our API 'speaks' that public vocabulary and is therefore easily consumable by applications that 'speak' the same vocabulary. 
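The neosemantics mapping quoted above boils down to renaming local graph vocabulary (labels, property names) to a shared public vocabulary at the API boundary so the JSON-LD output "speaks" that vocabulary. A rough stdlib-only sketch; the schema.org terms and node shape are arbitrary examples, not a real neosemantics mapping:

```python
# Made-up mapping from local graph property names to public IRIs
MAPPING = {
    "name": "https://schema.org/name",
    "homepage": "https://schema.org/url",
}


def to_jsonld(node: dict, mapping: dict = MAPPING) -> dict:
    """Serialize a graph node as JSON-LD using only mapped properties.

    Properties without a public-vocabulary mapping (e.g. internal ids)
    are dropped rather than leaked into the API output.
    """
    doc = {"@context": {k: v for k, v in mapping.items() if k in node}}
    doc.update({k: v for k, v in node.items() if k in mapping})
    return doc


doc = to_jsonld({"name": "Alice", "homepage": "https://example.com", "internal_id": 42})
print(sorted(doc))  # ['@context', 'homepage', 'name']
```

The same shape of transform is what jsonld-to-cypher (linked above) does in the opposite direction on insert.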
+- https://github.com/peter-evans/create-pull-request/blob/36a56dac0739df8d3d8ebb9e6e41026ba248ec27/src/octokit-client.ts#L26 +- https://github.com/ricochet/wasmio-2023 + +```bash +git add run-tests.sh +git checkout -b remove_python_minor_version_pinning_run_tests +git commit -sm 'Remove version pinning' +gh repo set-default +gh pr create +gh repo fork https://github.com/scitt-community/did-web-demo --fork-name $USER/did-web-demo --clone +``` + +- https://github.com/scitt-community +- https://gist.github.com/pdxjohnny/20419bfe01298a432b52053a183ac587 +- https://github.com/jakelazaroff/activitypub-starter-kit/blob/fcd5942485d86a66913c5554f85ae905785504e0/src/index.ts#L18-L34 +- https://github.com/aurae-runtime/aurae +- https://github.com/RustPython/RustPython +- https://rustup.rs/ + +```console +$ curl --proto '=https' --tlsv1.2 -sSf https://sh.rustup.rs | sh +``` + +- The following is an example of tracking upstream and rebasing upstream into a downstream or active pull request branch + +```bash +cd ~/Documents/rust +git clone https://github.com/RustPython/RustPython +echo 'source "$HOME/.cargo/env"' | tee -a ~/.pdxjohnnyrc +cd ~/.dotfiles/ +git stash +git checkout master +git pull +git stash pop +git diff +vim .asciinema_source +git add .asciinema_source +git status +git add .tmux.conf +git diff --staged +qvim .tmux.conf +vim .tmux.conf +git diff +git add .tmux.conf +git diff --staged +git commit -sm 'Cargo, $ prompt and remove problimatic tmux configs +git commit -sm 'Cargo, $ prompt and remove problimatic tmux configs' +git push +git log --walk-reflogs +git checkout DESKTOP-3LLKECP-2022-11-09-20-44 +git rebase main +git rebase master +git diff +git checkout --theirs . +git status +git add . +git diff --staged +git checkout --ours . +git restore --staged . +git checkout --ours . 
+git status +git checkout --theirs REBASE_HEAD +git checkout --theirs README.md +git restore --staged README.md +git checkout --theirs README.md +git status +git rebase --continue +git log -p +git push -f +history -w /dev/stdout +``` + +- https://github.com/decentralized-identity/bbs-signature + - Sounds similar to the problem discussed in the IPVM WG meeting recently +- https://github.com/aurae-runtime/aurae/blob/69167ca4c4f09a9dfb54fb9b35ad286226c2c2bd/auraescript/src/lib.rs +- https://github.com/RustPython/RustPython/blob/main/examples/call_between_rust_and_python.rs +- https://github.com/denoland/deno/blob/main/runtime/worker.rs#L66 +- https://github.com/microsoft/scitt-ccf-ledger/blob/3ceb7d750f27e5ee8ce95207b30f8253919b6f51/app/src/openenclave.h#L22 +- https://github.com/microsoft/scitt-ccf-ledger/pull/128 + +```console +$ cd ~/go/src/codeberg/forgejo/forgejo +$ git grep well-known | grep -v public/vendor +CHANGELOG.md: * Add well-known config for OIDC (#15355) +CHANGELOG.md: * reserve .well-known username (#7637) +CHANGELOG.md: * Reserve .well-known username (#7638) +docs/content/doc/administration/reverse-proxies.en-us.md:If you wish to use Let's Encrypt with webroot validation, add the line `ProxyPass /.well-known !` before `ProxyPass` to disable proxying these requests to Gitea. +docs/content/doc/development/oauth2-provider.en-us.md:| OpenID Connect Discovery | `/.well-known/openid-configuration` | +models/user/user.go: ".well-known", +modules/public/mime_types.go:// detectWellKnownMimeType will return the mime-type for a well-known file ext name +modules/public/mime_types.go:// mime.TypeByExtension would use OS's mime-type config to overwrite the well-known types (see its document). +modules/public/mime_types.go:// detectWellKnownMimeType makes the Content-Type for well-known files stable. +modules/public/public.go:// setWellKnownContentType will set the Content-Type if the file is a well-known type. 
+options/locale/locale_cs-CZ.ini:auths.tip.openid_connect=Použijte OpenID URL pro objevování spojení (/.well-known/openid-configuration) k nastavení koncových bodů +options/locale/locale_de-DE.ini:auths.tip.openid_connect=Benutze die OpenID-Connect-Discovery-URL (/.well-known/openid-configuration), um die Endpunkte zu spezifizieren +options/locale/locale_en-US.ini:auths.tip.openid_connect = Use the OpenID Connect Discovery URL (/.well-known/openid-configuration) to specify the endpoints +options/locale/locale_es-ES.ini:auths.tip.openid_connect=Use el OpenID Connect Discovery URL (/.well-known/openid-configuration) para especificar los puntos finales +options/locale/locale_fa-IR.ini:auths.tip.openid_connect=برای مشخص کردن نقاط پایانی از آدرس OpenID Connect Discovery URL ( /.well-known/openid-configuration) استفاده کنید. +options/locale/locale_fr-FR.ini:auths.tip.openid_connect=Utilisez l'URL de découvert OpenID (/.well-known/openid-configuration) pour spécifier les points d'accès +options/locale/locale_hu-HU.ini:auths.tip.openid_connect=Használja az OpenID kapcsolódás felfedező URL-t (/.well-known/openid-configuration) a végpontok beállításához +options/locale/locale_id-ID.ini:auths.tip.openid_connect=Gunakan membuka ID yang terhubung ke jelajah URL (/.well-known/openid-configuration) untuk menentukan titik akhir +options/locale/locale_it-IT.ini:auths.tip.openid_connect=Utilizza l'OpenID Connect Discovery URL (/.well-known/openid-configuration) per specificare gli endpoint +options/locale/locale_ja-JP.ini:auths.tip.openid_connect=OpenID Connect DiscoveryのURL (/.well-known/openid-configuration) をエンドポイントとして指定してください +options/locale/locale_lv-LV.ini:auths.tip.openid_connect=Izmantojiet OpenID pieslēgšanās atklāšanas URL (/.well-known/openid-configuration), lai norādītu galapunktus +options/locale/locale_nl-NL.ini:auths.tip.openid_connect=Gebruik de OpenID Connect Discovery URL (/.well-known/openid-configuration) om de eindpunten op te geven 
+options/locale/locale_pl-PL.ini:auths.tip.openid_connect=Użyj adresu URL OpenID Connect Discovery (/.well-known/openid-configuration), aby określić punkty końcowe +options/locale/locale_pt-BR.ini:auths.tip.openid_connect=Use o OpenID Connect Discovery URL (/.well-known/openid-configuration) para especificar os endpoints +options/locale/locale_pt-PT.ini:auths.tip.openid_connect=Use o URL da descoberta de conexão OpenID (/.well-known/openid-configuration) para especificar os extremos +options/locale/locale_ru-RU.ini:auths.tip.openid_connect=Используйте OpenID Connect Discovery URL (/.well-known/openid-configuration) для автоматической настройки входа OAuth +options/locale/locale_sv-SE.ini:auths.tip.openid_connect=Använd OpenID Connect Discovery länken (/.well-known/openid-configuration) för att ange slutpunkterna +options/locale/locale_tr-TR.ini:auths.tip.openid_connect=Bitiş noktalarını belirlemek için OpenID Connect Discovery URL'sini kullanın (/.well-known/openid-configuration) +options/locale/locale_uk-UA.ini:auths.tip.openid_connect=Використовуйте OpenID Connect Discovery URL (/.well-known/openid-configuration) для автоматичної настройки входу OAuth +options/locale/locale_zh-CN.ini:auths.tip.openid_connect=使用 OpenID 连接发现 URL (/.well-known/openid-configuration) 来指定终点 +options/locale/locale_zh-HK.ini:auths.tip.openid_connect=使用 OpenID 連接探索 URL (/.well-known/openid-configuration) 來指定節點 +options/locale/locale_zh-TW.ini:auths.tip.openid_connect=使用 OpenID 連接探索 URL (/.well-known/openid-configuration) 來指定節點 +routers/web/web.go: m.Group("/.well-known", func() { +tests/integration/user_test.go: ".well-known", +tests/integration/user_test.go: // ".", "..", ".well-known", // The names are not only reserved but also invalid +tests/integration/webfinger_test.go: req := NewRequest(t, "GET", fmt.Sprintf("/.well-known/webfinger?resource=acct:%s@%s", user.LowerName, appURL.Host)) +tests/integration/webfinger_test.go: req = NewRequest(t, "GET", 
fmt.Sprintf("/.well-known/webfinger?resource=acct:%s@%s", user.LowerName, "unknown.host")) +tests/integration/webfinger_test.go: req = NewRequest(t, "GET", fmt.Sprintf("/.well-known/webfinger?resource=acct:%s@%s", "user31", appURL.Host)) +tests/integration/webfinger_test.go: req = NewRequest(t, "GET", fmt.Sprintf("/.well-known/webfinger?resource=acct:%s@%s", "user31", appURL.Host)) +tests/integration/webfinger_test.go: req = NewRequest(t, "GET", fmt.Sprintf("/.well-known/webfinger?resource=mailto:%s", user.Email))$ git grep webfinger +routers/web/web.go: m.Get("/webfinger", WebfingerQuery) +routers/web/webfinger.go:// https://datatracker.ietf.org/doc/html/draft-ietf-appsawg-webfinger-14#section-4.4 +routers/web/webfinger.go:type webfingerJRD struct { +routers/web/webfinger.go: Links []*webfingerLink `json:"links,omitempty"` +routers/web/webfinger.go:type webfingerLink struct { +routers/web/webfinger.go: links := []*webfingerLink{ +routers/web/webfinger.go: Rel: "http://webfinger.net/rel/profile-page", +routers/web/webfinger.go: Rel: "http://webfinger.net/rel/avatar", +routers/web/webfinger.go: ctx.JSON(http.StatusOK, &webfingerJRD{ +tests/integration/webfinger_test.go: type webfingerLink struct { +tests/integration/webfinger_test.go: type webfingerJRD struct { +tests/integration/webfinger_test.go: Links []*webfingerLink `json:"links,omitempty"` +tests/integration/webfinger_test.go: req := NewRequest(t, "GET", fmt.Sprintf("/.well-known/webfinger?resource=acct:%s@%s", user.LowerName, appURL.Host)) +tests/integration/webfinger_test.go: var jrd webfingerJRD +tests/integration/webfinger_test.go: req = NewRequest(t, "GET", fmt.Sprintf("/.well-known/webfinger?resource=acct:%s@%s", user.LowerName, "unknown.host")) +tests/integration/webfinger_test.go: req = NewRequest(t, "GET", fmt.Sprintf("/.well-known/webfinger?resource=acct:%s@%s", "user31", appURL.Host)) +tests/integration/webfinger_test.go: req = NewRequest(t, "GET", 
fmt.Sprintf("/.well-known/webfinger?resource=acct:%s@%s", "user31", appURL.Host)) +tests/integration/webfinger_test.go: req = NewRequest(t, "GET", fmt.Sprintf("/.well-known/webfinger?resource=mailto:%s", user.Email)) +``` + +- Adding container build and test and Dockerfile to scitt-api-emulator for use in builds and OS DecentrAlice + +```console +$ docker build -t ghcr.io/scitt-community/scitt-api-emulator:main --progress plain . +$ docker run --rm -ti -w /src/src/scitt-api-emulator -v $PWD:/src/src/scitt-api-emulator -p 8000:8000 ghcr.io/scitt-community/scitt-api-emulator:main +``` + +- https://asciinema.org/a/572243 +- https://github.com/jcarbaugh/python-webfinger +- https://github.com/neo4j-labs/neodash +- https://github.com/neo4j-labs/rdflib-neo4j +- https://microsoft.github.io/CCF/main/governance/common_member_operations.html +- https://microsoft.github.io/CCF/main/overview/governance.html +- https://microsoft.github.io/CCF/main/audit/python_library.html +- Added SCITT emulator to federated forge setup + +**examples/tutorials/rolling_alice/federated_forge/alice_and_bob/docker-compose.yml** + +```yaml +version: "3" + +networks: + alice_forgejo_network: + external: false + bob_forgejo_network: + external: false + +services: + alice_forgejo_scitt: + # image: ghcr.io/scitt-community/scitt-api-emulator:main + image: ghcr.io/pdxjohnny/scitt-api-emulator:ci_cd_container_image + restart: always + networks: + - alice_forgejo_network + ports: + - "2090:8000" + + bob_forgejo_scitt: + # image: ghcr.io/scitt-community/scitt-api-emulator:main + image: ghcr.io/pdxjohnny/scitt-api-emulator:ci_cd_container_image + restart: always + networks: + - bob_forgejo_network + ports: + - "3090:8000" +``` + +- https://github.com/actions/dependency-review-action +- https://github.com/guacsec/guac/blob/14be5a367980c626ba13a006fdfc664c606a9184/pkg/certifier/attestation/attestation_vuln.go#L24-L28 +- https://github.com/sigstore/cosign/blob/main/specs/COSIGN_VULN_ATTESTATION_SPEC.md +- 
https://github.com/guacsec/guac/blob/14be5a367980c626ba13a006fdfc664c606a9184/pkg/handler/processor/process/process.go#L40-L49 +- https://github.com/guacsec/guac/tree/main/pkg/emitter +- https://github.com/superseriousbusiness/gotosocial#oidc-integration +- https://docs.gotosocial.org/en/latest/federation/federating_with_gotosocial/ +- The following from forgejo ac64c8297444ade63a2a364c4afb7e6c1de5a75f + +``` +routers/api/v1/api.go: m.Post("/inbox", activitypub.ReqHTTPSignature(), activitypub.PersonInbox) +``` + +- https://github.com/docker/build-push-action/pull/746 +- https://github.com/guacsec/guac/blob/14be5a367980c626ba13a006fdfc664c606a9184/pkg/certifier/certify/certify.go#L53-L91 + - This is where we want data flow + overlay enabled policy engine +- https://github.com/guacsec/guac/issues/251 + +```console +$ git grep local-organic-guac +Makefile: docker build -f dockerfiles/Dockerfile.guac-cont -t local-organic-guac . +cmd/guacone/cmd/collectsub_client.go:echo '[{"type":"DATATYPE_GIT", "value":"git+https://github.com/guacsec/guac"},{"type":"DATATYPE_OCI", "value":"index.docker.io/lumjjb/local-organic-guac"}]' | bin/guacone csub-client add-collect-entries +``` + +- https://codeberg.org/forgejo/forgejo/issues/59 + - [FEAT] implement federation + - https://github.com/go-gitea/gitea/pull/19133 +- https://codeberg.org/ForgeFed/ForgeFed/issues/171 + - OCAPs: Consider to switching to POST-to-inbox OCAPs like in OcapPub + - https://gitlab.com/spritely/ocappub/blob/master/README.org + - https://gitlab.com/spritely/ocappub/-/issues/1#note_1334338014 + - Working on shared allowlists based on policy as code over provenance of message content over here: [RFCv4.1: IETF SCITT: Use Case: OpenSSF Metrics: activitypub extensions for security.txt](https://github.com/ietf-scitt/use-cases/blob/748597b37401bd59512bfedc80158b109eadda9b/openssf_metrics.md#openssf-metrics) + - https://github.com/cwebber/rwot9-prague/blob/bearcaps/topics-and-advance-readings/bearcaps.md +- 
https://github.com/pallets/quart +- TODO + - [ ] Finish federated forge spin up to observe event stream + - [x] https://github.com/guacsec/guac/issues/205 + - Mention consuming from friendly forge format + - [ ] https://github.com/scitt-community/scitt-api-emulator/pull/25 + - [x] https://github.com/scitt-community/scitt-api-emulator/pull/24 + - [ ] neo4j python hooked up to federated event stream + - [ ] Add hooks for SBOM from cypher query + - [ ] Add hooks for re-trigger + - [ ] Alice watch from websocat stdin and publish workflow results + - Use runner first + - If we can get this basic example working then we'll have the whole loop around the Entity Analysis Trinity in flat file format and we can begin liftoff + - [x] https://codeberg.org/forgejo/discussions/issues/12#issuecomment-854895 + - Updated + - [ ] Add scitt-api-emulator support to GUAC + - [ ] Add ActivityPub support to GUAC as alternative to NATS + - https://github.com/guacsec/guac/new/main/pkg/emitter \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0223/index.md b/docs/discussions/alice_engineering_comms/0223/index.md new file mode 100644 index 0000000000..c380acbd27 --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0223/index.md @@ -0,0 +1 @@ +# 2023-03-30 Engineering Logs \ No newline at end of file diff --git a/docs/discussions/alice_engineering_comms/0223/reply_0000.md b/docs/discussions/alice_engineering_comms/0223/reply_0000.md new file mode 100644 index 0000000000..ad6cd4aea8 --- /dev/null +++ b/docs/discussions/alice_engineering_comms/0223/reply_0000.md @@ -0,0 +1,388 @@ +## 2023-03-30 @pdxjohnny Engineering Logs + +- https://github.com/neo4j/neo4j-python-driver +- https://neo4j.com/docs/graphql-manual/current/getting-started/ +- https://codeberg.org/fediverse/fep/src/branch/main/feps/fep-c390.md + - FEP-c390: Identity Proofs +- https://github.com/faokunega/pg-embed +-
https://socialhub.activitypub.rocks/t/fep-7888-demystifying-the-context-property/3021/6 +- https://go-fed.org/ref/activity/pub +- code.forgejo.org/actions as a catalog of Free Software actions + - https://codeberg.org/forgejo/discussions/issues/16 + - The following WIP use case doc is a place where we're trying to flesh out how we could deduplicate computation across forges for analysis of security posture (such as OpenSSF Scorecard) of dependencies (Actions, etc.) which would be in the catalog. The goal is to enable policy as code which enables forge org admins and repo admins to set secure-by-default policies, with the ability for repo owners/pull-requesters to overlay modifications to the analysis policy, adherent to policy as code, to auto-approve/deny those downstream context local modifications to policy: [WIP: RFCv4.1: IETF SCITT: Use Case: OpenSSF Metrics: activitypub extensions for security.txt](https://github.com/ietf-scitt/use-cases/blob/748597b37401bd59512bfedc80158b109eadda9b/openssf_metrics.md#openssf-metrics) + - Related: [#12](https://codeberg.org/forgejo/discussions/issues/12) (for event based triggers of re-analysis) +- https://lab.forgefriends.org/forgefriends/community-action/-/issues/1 - forgefed: contribute to (S10) Search, discovery, WebFinger +- https://www.valueflo.ws/ + - No More Painting the Roses Red + - https://www.valueflo.ws/examples/ex-exchange/ + - https://github.com/TBD54566975/tbdex-protocol + - https://lab.allmende.io/valueflows/valueflows/-/blob/master/examples/process-stage-state.yaml + - https://github.com/valueflows/vf-examples-jsonld-context/raw/13b5c8d8e8dedaa5f849f8e6289bff7406244bd5/context.jsonld + - https://lab.allmende.io/valueflows/valueflows/-/blob/187fd3c4067abaae66488ee146e3c3dad26e1011/mkdocs/docs/concepts/actions.md +- https://repo.prod.meissa.de/meissa/dda-masto-embed +- https://github.com/poljar/weechat-matrix + +```bash +cd ~/Documents/python/ +git clone https://github.com/poljar/weechat-matrix.git +cd 
weechat-matrix +sudo dnf -y install libolm-devel +pip install -U pip setuptools wheel +pip install --user -r requirements.txt +make install +/join #forgejo-chat:matrix.org +``` + +- https://gitea.com/xy/gitea/wiki/Federated-following +- https://codeberg.org/forgejo/forgejo/issues/502 - [FEAT] Federated discussion with mastodon users +- https://codeberg.org/forgejo/forgejo/issues/581 - [FEAT] Bittorrent RSS/Atom Feeds For Repos + - grep recent engineering logs for webhook based torrent sync +- https://github.com/go-gitea/gitea/issues/14186 +- https://github.com/go-gitea/gitea/pull/19462 + - Finally found the PR that added webfinger +- https://github.com/go-gitea/gitea/issues/18240#issuecomment-1212012942 + - How to enable federation +- https://app.radicle.xyz/seeds/seed.radicle.xyz/rad:z3gqcJUoA1n9HaHKufZs5FCSGazv5/commits/fc724a2333040ced38f87aa4c70149ffab095bb7/HACKING.md + - Alice is Here + - This seems like a great place to start with ActivityPub data transforms; they are using DIDs + - https://github.com/radicle-dev/heartwood/pull/452 + - These folks have working federation, but a non-ActivityPub based protocol + +![chaos-for-the-chaos-god](https://user-images.githubusercontent.com/5950433/220794351-4611804a-ac72-47aa-8954-cdb3c10d6a5b.jpg) + +- We're close, I can feel it +- heartwood --> openapi generator + activitypub endpoints off cyclonedx -> guac --> cypher mutation and ipvm exec chain for analysis --> guac emit activitypub --> forgefed + - scitt and endor later keri tied via ccf +- https://github.com/intel/dffml/tree/alice/examples/tutorials/rolling_alice/federated_forge/alice_and_bob +- https://github.com/go-yaml/yaml +- https://docs.python.org/3/library/urllib.parse.html#module-urllib.parse +- Need to convert the configs back from YAML + +```bash +until curl -I http://alice_forgejo_server:3000 > /dev/null 2>&1; do sleep 5; done; +CSRF_TOKEN=$(curl http://127.0.0.1:2000/user/sign_up | grep csrfToken | awk '{print $NF}' | sed -e "s/'//g" -e 's/,//g'); 
+query_params=$(python3 -c 'import sys, urllib.parse, yaml; print(urllib.parse.urlencode(yaml.safe_load(sys.stdin)))' < /usr/src/forgejo-init/requests/init.yaml); +curl -v -H "Cookie: lang=en-US; _csrf=${CSRF_TOKEN}; i_like_gitea=d5249768265f875d" -X POST --data-raw "${query_params}" http://alice_forgejo_server:3000: +``` + +- Went to validate and turns out CSRF isn't even enabled :P + +```console +$ curl -sfL http://127.0.0.1:2000 | grep -i CSRF + csrfToken: '', +``` + +- http://127.0.0.1:2000/user/sign_up + - Okay it is enabled on sign up, modified scrape + - https://github.com/guacsec/guac/commit/c9de76f0ae90145ba76831cca73d2673a8ca1c2a + - Added pyyaml for conversion from saved yaml to urlencoded query string for `curl --data-raw` +- https://enarx.dev/docs/webassembly/rust +- https://github.com/go-gitea/gitea/blob/8df1b4bd699897264c60da7ce982b09cee57f345/custom/conf/app.example.ini#L2442-L2469 +- https://forgejo.org/docs/latest/admin/config-cheat-sheet/ +- https://github.com/guacsec/guac/pull/498 +- https://github.com/guacsec/guac/tree/main/pkg/handler/collector + - Add federated event space collector listening to websocket of activitypub-start-key when `poll: true` + +```console +$ git clone https://seed.radicle.xyz/z3gqcJUoA1n9HaHKufZs5FCSGazv5.git heartwood +$ cd heartwood +$ cargo install --path radicle-cli --force --locked \ + && cargo install --path radicle-node --force --locked \ + && cargo install --path radicle-remote-helper --force --locked +$ target/release/rad --help +rad 0.8.0 +Radicle command line interface + +Usage: rad [--help] + +It looks like this is your first time using radicle. +To get started, use `rad auth` to authenticate. 
+ +Common `rad` commands used in various situations: + + assign Assign an issue + auth Manage identities and profiles + checkout Checkout a project into the local directory + clone Clone a project + edit Edit an identity doc + fetch Fetch repository refs from the network + help CLI help + id Manage identity documents + init Initialize a project from a git repository + inspect Inspect a radicle repository + issue Manage issues + ls List projects + merge Merge a patch + node Control and query the Radicle Node + patch Manage patches + path Display the radicle home path + push Publish a project to the network + review Approve or reject a patch + rm Remove a project + self Show information about your identity and device + tag Tag an issue + track Manage repository and node tracking policy + unassign Unassign an issue + untag Untag an issue + untrack Untrack project peers + +See `rad --help` to learn about a specific command. +$ ls -lAF target/release/ +total 40184 +drwxr-xr-x 68 pdxjohnny pdxjohnny 4096 Mar 30 16:11 build/ +-rw-r--r-- 1 pdxjohnny pdxjohnny 0 Mar 30 16:09 .cargo-lock +drwxr-xr-x 2 pdxjohnny pdxjohnny 49152 Mar 30 16:12 deps/ +drwxr-xr-x 2 pdxjohnny pdxjohnny 6 Mar 30 16:09 examples/ +drwxr-xr-x 334 pdxjohnny pdxjohnny 16384 Mar 30 16:11 .fingerprint/ +-rwxr-xr-x 2 pdxjohnny pdxjohnny 6116440 Mar 30 16:12 git-remote-rad* +-rw-r--r-- 1 pdxjohnny pdxjohnny 6178 Mar 30 16:12 git-remote-rad.d +drwxr-xr-x 2 pdxjohnny pdxjohnny 6 Mar 30 16:09 incremental/ +-rw-r--r-- 1 pdxjohnny pdxjohnny 11259 Mar 30 16:10 libradicle_cli.d +-rw-r--r-- 2 pdxjohnny pdxjohnny 7187292 Mar 30 16:10 libradicle_cli.rlib +-rw-r--r-- 1 pdxjohnny pdxjohnny 8990 Mar 30 16:11 libradicle_node.d +-rw-r--r-- 2 pdxjohnny pdxjohnny 4287234 Mar 30 16:11 libradicle_node.rlib +-rw-r--r-- 1 pdxjohnny pdxjohnny 6108 Mar 30 16:12 libradicle_remote_helper.d +-rw-r--r-- 2 pdxjohnny pdxjohnny 263706 Mar 30 16:12 libradicle_remote_helper.rlib +-rwxr-xr-x 2 pdxjohnny pdxjohnny 13923872 Mar 30 16:10 rad* 
+-rw-r--r-- 1 pdxjohnny pdxjohnny 11308 Mar 30 16:10 rad.d +-rwxr-xr-x 2 pdxjohnny pdxjohnny 9197808 Mar 30 16:11 radicle-node* +-rw-r--r-- 1 pdxjohnny pdxjohnny 9048 Mar 30 16:11 radicle-node.d +``` + +- :upside_down_face: GUAC is very tightly coupled with NATS... + - We may want to have a little activitypub (starter-kit) / NATS bridge as a service +- `cmd/collector/cmd/files.go:func initializeNATsandCollector(ctx context.Context, natsAddr string) {` + - `// TODO: pass in credentials file for NATS secure login` +- GUAC main: e1c30a68ea4b6fc7ccf804f9418c55662d4a968b +- rad master: fc724a2333040ced38f87aa4c70149ffab095bb7 +- We need to find the place where `rad` is communicating and hook that to dump all the events + - Then we'll translate into the friendly forge / forgejo / forgeflow / federated forge event space + - Then we'll have GUAC ingest it + +```console +$ git grep node +$ cat radicle-cli/examples/rad-node.md +$ ./target/release/radicle-node --help +2023-03-30T16:23:18.376-07:00 INFO node Starting node.. + +Usage + + radicle-node [