Second Dream

Online Time Series Deep Dreaming

Configuration

Model
Base time series
Dream

Gradient ascent step size.

Number of consecutive scales at which to run gradient ascent. We first scale the base series down to the lowest scale and run gradient ascent there. Next, we scale the resulting series up to the next higher scale and run gradient ascent again. We continue until we reach the highest scale, which is just the original size of the time series.

Size ratio between scales.

Number of gradient ascent steps per scale.

If the loss reaches this value during a scale, the remaining gradient ascent steps at that scale are skipped and we proceed with the next scale.
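Putting these parameters together, the dream procedure might look like the following sketch. This is not the site's actual code: gradient_ascent_step is a hypothetical helper that performs one ascent step and returns the updated series and its loss, and the default values are placeholders.

import numpy as np

def dream(base_series, step_size=0.01, num_scales=3, scale_ratio=1.4,
          num_steps=20, max_loss=15.0):
    # Series length at each scale, coarsest first; the highest scale is
    # just the original length of the time series.
    lengths = [int(round(len(base_series) / scale_ratio ** i))
               for i in reversed(range(num_scales))]
    series = np.asarray(base_series, dtype=float)
    for length in lengths:
        # Rescale the result of the previous scale by linear interpolation.
        series = np.interp(np.linspace(0.0, len(series) - 1.0, length),
                           np.arange(len(series)), series)
        for _ in range(num_steps):
            series, loss = gradient_ascent_step(series, step_size)
            if loss > max_loss:
                break  # skip the rest of this scale
    return series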

Each line adds one layer whose L2 activation we maximize, alongside a relative weight. Because the sum of all squared activations of the neurons in the layer is maximized, a few high activations will win over many small ones.
dense_1 0.5 maximizes the layer dense_1 with weight 0.5.
dense_1 #0:1 #3:1.5 only maximizes neurons 0 and 3 from the layer dense_1 with weights 1 and 1.5, respectively.
Note that depending on the shape of the layer, you may need multiple coordinates to identify a single neuron. For example, #5,8:1.5 assigns weight 1.5 to the neuron at position 5,8.
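As a hedged illustration of this objective, the weighted L2 activation loss and one gradient ascent step could be computed in TensorFlow roughly as below. The feature_models structure, its weight encoding, and the gradient normalization are assumptions for illustration, not the site's implementation.

import tensorflow as tf

def dream_loss(feature_models, series):
    # feature_models maps a Keras sub-model that outputs one layer's
    # activations to either a scalar weight ("dense_1 0.5") or a dict from
    # neuron coordinates to weights ("dense_1 #0:1 #3:1.5").
    loss = 0.0
    for sub_model, weight in feature_models.items():
        act = sub_model(series[None])[0]  # activations for a batch of one
        if isinstance(weight, dict):
            # Only the listed neurons contribute, each with its own weight.
            loss += tf.add_n([w * tf.square(act[coord])
                              for coord, w in weight.items()])
        else:
            # Sum of all squared activations, scaled by the layer weight,
            # so a few high activations outweigh many small ones.
            loss += weight * tf.reduce_sum(tf.square(act))
    return loss

def gradient_ascent_step(feature_models, series, step_size):
    # series is expected to have the model's input shape, e.g. (150, 1).
    series = tf.convert_to_tensor(series)
    with tf.GradientTape() as tape:
        tape.watch(series)
        loss = dream_loss(feature_models, series)
    grad = tape.gradient(loss, series)
    # Normalizing the gradient is an assumption borrowed from image deep
    # dreaming; it keeps the effective step size comparable across layers.
    grad /= tf.math.reduce_std(grad) + 1e-8
    return series + step_size * grad, loss

A sub-model for a configured layer can be built with tf.keras.Model(model.input, model.get_layer("dense_1").output).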

Made with ♥ by Felix Mujkanovic (LoadingByte).
Code is available on GitHub.

Model Visualization

The dreamed model is a 1D residual network. Reading the graph:

input_1 (InputLayer, output (None, 150, 1)) feeds three residual blocks. Each block is a chain of Conv1D → BatchNormalization → Activation stages whose output is summed (Add) with a shortcut branch, followed by a final Activation:
Block 1: conv1d_1 → conv1d_3 with 64 filters; shortcut conv1d_4 + batch_normalization_4; output (None, 150, 64).
Block 2: conv1d_5 → conv1d_7 with 128 filters; shortcut conv1d_8 + batch_normalization_8; output (None, 150, 128).
Block 3: conv1d_9 → conv1d_11 with 128 filters; identity shortcut through batch_normalization_12; output (None, 150, 128).
The final activation feeds global_average_pooling1d_1 (GlobalAveragePooling1D, output (None, 128)) and dense_1 (Dense, output (None, 2)).
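For reference, here is a Keras sketch that reproduces this topology. The graph records neither kernel sizes nor activation functions, so the values below (kernel sizes 8, 5, 3 and ReLU) are assumptions.

from tensorflow import keras
from tensorflow.keras import layers

def residual_block(x, filters, conv_shortcut=True):
    # Shortcut branch: a width-1 convolution where the channel count
    # changes (blocks 1 and 2), otherwise batch normalization only (block 3).
    if conv_shortcut:
        shortcut = layers.Conv1D(filters, 1, padding="same")(x)
    else:
        shortcut = x
    shortcut = layers.BatchNormalization()(shortcut)
    # Main branch: three Conv1D -> BatchNormalization(-> Activation) stages,
    # with no activation after the last stage, matching the graph.
    y = x
    for kernel_size in (8, 5, 3):
        y = layers.Conv1D(filters, kernel_size, padding="same")(y)
        y = layers.BatchNormalization()(y)
        if kernel_size != 3:
            y = layers.Activation("relu")(y)
    y = layers.Add()([shortcut, y])
    return layers.Activation("relu")(y)

inputs = keras.Input(shape=(150, 1))
x = residual_block(inputs, 64)
x = residual_block(x, 128)
x = residual_block(x, 128, conv_shortcut=False)
x = layers.GlobalAveragePooling1D()(x)
outputs = layers.Dense(2)(x)  # output activation is not recorded in the graph
model = keras.Model(inputs, outputs)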