
SONAR: Self-Organizing Network of Aggregated Representations

Project by MIT Media Lab

A collaborative learning project where users self-organize to improve their ML models by sharing representations of their data or model.

To get started, please refer to the Get Started page.

Main

The application currently uses MPI and gRPC (experimental) to enable communication between the nodes in the network. The goal of the framework is to organize everything in a modular manner, so that a researcher or engineer can easily swap out individual components to test a hypothesis or a new algorithm.
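As a minimal sketch of this modular idea (the names and registry pattern here are illustrative assumptions, not SONAR's actual API), swapping a component can be as simple as changing one config entry that selects an implementation:

```python
# Hypothetical sketch: a component (here, an aggregation rule over the model
# representations received from collaborators) is selected via a registry,
# so swapping algorithms only requires changing the config.
from typing import Callable, Dict, List

Aggregator = Callable[[List[List[float]]], List[float]]

def mean_aggregate(reprs: List[List[float]]) -> List[float]:
    """Average the received representations coordinate-wise."""
    n = len(reprs)
    return [sum(vals) / n for vals in zip(*reprs)]

# Registry of pluggable components; a new algorithm is one more entry.
AGGREGATORS: Dict[str, Aggregator] = {"mean": mean_aggregate}

def run_round(config: dict, received: List[List[float]]) -> List[float]:
    """One collaboration round: pick the configured aggregator and apply it."""
    return AGGREGATORS[config["aggregator"]](received)

print(run_round({"aggregator": "mean"}, [[1.0, 2.0], [3.0, 4.0]]))  # -> [2.0, 3.0]
```

In an MPI deployment, `received` would come from neighbor ranks rather than being passed in directly; the registry pattern is what keeps the aggregation logic swappable.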

Table 1: Performance overview (AUC) of various topologies with different numbers of collaborators. Columns 1–3 indicate the number of collaborators.

| Topology | Train (1) | Train (2) | Train (3) | Test (1) | Test (2) | Test (3) |
|---|---|---|---|---|---|---|
| Isolated | 208.0 (0.3) | 208.0 (0.3) | 208.0 (0.0) | 44.5 (6.9) | 44.5 (6.9) | 44.5 (6.9) |
| Central* | 208.5 (0.1) | 208.5 (0.0) | 208.5 (0.0) | 33.97 (14.2) | 33.97 (14.2) | 33.97 (14.2) |
| Random | 205.4 (1.0) | 205.5 (0.9) | 205.9 (0.8) | 54.9 (5.3) | 56.0 (5.8) | 56.2 (5.6) |
| Ring | 198.8 (3.1) | 198.7 (3.3) | 198.7 (3.4) | 47.8 (7.3) | 46.9 (6.9) | 47.6 (7.1) |
| Grid | 202.6 (1.5) | 203.9 (1.4) | 204.5 (1.3) | 49.3 (6.0) | 48.8 (6.0) | 48.1 (6.1) |
| Torus | 202.0 (1.2) | 203.2 (1.2) | 204.0 (1.3) | 50.2 (6.0) | 50.7 (6.6) | 50.3 (6.2) |
| Similarity-based (top-k) | 206.4 (1.6) | 197.6 (7.3) | 200.4 (4.4) | 47.3 (8.3) | 48.4 (8.5) | 52.8 (7.2) |
| Swarm | 183.2 (3.5) | 183.1 (3.6) | 183.2 (3.6) | 52.2 (8.7) | 52.3 (8.7) | 52.4 (8.6) |
| L2C | 167.0 (25.4) | 158.8 (30.6) | 152.8 (35.2) | 37.6 (7.4) | 36.6 (7.4) | 35.8 (7.7) |
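For concreteness, the grid and torus topologies above assign each node a fixed neighborhood on a 2D lattice. The functions below are an illustrative sketch of how such collaborator lists might be derived; they are assumptions for exposition, not code from the SONAR repository:

```python
# Illustrative sketch: collaborator lists for grid and torus topologies.
# Nodes are identified by rank 0..rows*cols-1, laid out row-major.

def grid_neighbors(rank: int, rows: int, cols: int) -> list:
    """4-neighborhood on a rows x cols grid, without wraparound at the edges."""
    r, c = divmod(rank, cols)
    out = []
    for dr, dc in [(-1, 0), (1, 0), (0, -1), (0, 1)]:
        nr, nc = r + dr, c + dc
        if 0 <= nr < rows and 0 <= nc < cols:
            out.append(nr * cols + nc)
    return out

def torus_neighbors(rank: int, rows: int, cols: int) -> list:
    """Same 4-neighborhood, but the edges wrap around (every node has 4 peers)."""
    r, c = divmod(rank, cols)
    return [((r + dr) % rows) * cols + (c + dc) % cols
            for dr, dc in [(-1, 0), (1, 0), (0, -1), (0, 1)]]

print(grid_neighbors(0, 3, 3))   # corner node -> [3, 1]
print(torus_neighbors(0, 3, 3))  # wraps -> [6, 3, 2, 1]
```

The difference between the two is visible at the boundary: a corner node on the grid has only two collaborators, while on the torus every node has exactly four.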

Table 2: Area under the curve (AUC) of test accuracy with varying numbers of users.

| Num Users | DomainNet (Within Domain) | DomainNet (Random) | Camelyon17 (Within Domain) | Camelyon17 (Random) | Digit-Five (Within Domain) | Digit-Five (Random) |
|---|---|---|---|---|---|---|
| 12 | 56.6267 | 50.1772 | 178.3622 | 145.6398 | 57.1168 | 68.9724 |
| 18 | 59.5647 | 54.2480 | 179.5941 | 145.9916 | 66.8201 | 69.8341 |
| 24 | 61.8006 | 54.3855 | 178.5976 | 149.2037 | 71.6536 | 72.5333 |
| 30 | 66.5896 | 58.4835 | 179.1761 | 153.0658 | 74.4239 | 72.6996 |
| 39 | 68.3743 | 59.6090 | 179.1404 | 149.8618 | 163.8116 | 163.9892 |
| 45 | 68.124 | 59.7852 | 180.0231 | 147.4649 | 77.0248 | 73.0634 |

Table 3: Area under the curve (AUC) of test accuracy with varying numbers of domains.

DomainNet (48 users, 200 rounds)

| Num Domains | Within Domain | Random |
|---|---|---|
| 2 | 67.7514 | 58.7947 |
| 4 | 61.5723 | 50.2906 |
| 6 | 69.4671 | 47.7867 |

Camelyon17 (30 users, 200 rounds)

| Num Domains | Within Domain | Random |
|---|---|---|
| 2 | 179.7901 | 172.9167 |
| 3 | 179.1761 | 153.0658 |
| 5 | 176.5059 | 139.4547 |

Digit-Five (30 users, 200 rounds)

| Num Domains | Within Domain | Random |
|---|---|---|
| 2 | 71.8536 | 65.6555 |
| 3 | 74.4239 | 72.6996 |
| 5 | 77.3709 | 76.3041 |

Table 4: Test accuracy (mean and standard deviation) over rounds.

DomainNet (39 users, 3 domains)

| Rounds | Within Domain (Mean) | Within Domain (Std) | Random (Mean) | Random (Std) |
|---|---|---|---|---|
| 100 | 0.3619 | 0.0635 | 0.3212 | 0.0625 |
| 200 | 0.4220 | 0.0563 | 0.3733 | 0.0791 |
| 300 | 0.4362 | 0.0498 | 0.4203 | 0.0537 |
| 400 | 0.4353 | 0.0687 | 0.4355 | 0.0585 |
| 500 | 0.4726 | 0.0502 | 0.4499 | 0.0496 |

Camelyon17 (39 users, 3 domains)

| Rounds | Within Domain (Mean) | Within Domain (Std) | Random (Mean) | Random (Std) |
|---|---|---|---|---|
| 40 | 0.9086 | 0.0255 | 0.7281 | 0.1405 |
| 80 | 0.9169 | 0.0251 | 0.7460 | 0.1196 |
| 120 | 0.9329 | 0.0195 | 0.7293 | 0.1520 |
| 160 | 0.9361 | 0.0239 | 0.8122 | 0.1346 |
| 200 | 0.9353 | 0.0251 | 0.7762 | 0.1516 |

Digit-Five (39 users, 3 domains)

| Rounds | Within Domain (Mean) | Within Domain (Std) | Random (Mean) | Random (Std) |
|---|---|---|---|---|
| 20 | 0.7314 | 0.1290 | 0.6788 | 0.0839 |
| 40 | 0.8080 | 0.0974 | 0.8151 | 0.0549 |
| 60 | 0.8350 | 0.0937 | 0.8464 | 0.0558 |
| 80 | 0.8417 | 0.0928 | 0.8673 | 0.0454 |
| 100 | 0.8310 | 0.1030 | 0.8733 | 0.0502 |