Thursday 29 December 2016

Why is there no antimatter in the Universe?

This question is widely discussed in the physics community, with no answer presented yet. It was recently demonstrated that hydrogen and antihydrogen have exactly the same excitation spectra, further strengthening the Standard Model in this respect.

So here is a wild (and, as usual, not very firmly grounded, since I don't know very much about this) suggestion for the asymmetry.
Maybe there is some kind of process that can oscillate between matter and antimatter. Currently the universe is in the matter stage of the oscillation. Maybe there is a gradual process towards antimatter, or maybe the Big Bang was somehow initiated from the antimatter state. That is, e.g., the Big Bang was preceded by an antimatter Big Crunch.

Another thought regarding this: since there is no difference in the spectrum, how can we be sure that the other galaxies, or galaxy clusters, actually consist of matter? Maybe they actually consist of antimatter and everything is much more symmetric than we think. But a quick googling indicates that this is unlikely.

Tuesday 13 December 2016

Sensor OSI-model

Introduction

In communication there is the famous OSI network stack model. However, to my knowledge, there is no such thing for sensors. So, here is my proposition for such a model:
6. Decision layer
5. Fusion layer
4. Distribution layer
3. Classification layer
2. Detection layer
1. Physical layer

Layer descriptions

The function of each layer is described below.

Physical layer

This is very similar to the physical layer in the OSI model and consists of the raw data produced by the sensor hardware.
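
As a toy illustration in Python, this layer could be thought of as whatever function hands you raw samples. The function name and the simulated ADC-style values below are made up for the example:

    import random

    def read_raw_sample():
        # Hypothetical stand-in for a hardware read: returns one raw
        # ADC-style integer sample, here just simulated with noise.
        return int(512 + random.gauss(0, 20))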

Detection layer

The function of this layer is to determine when the sensor gets actuated. Some sensors, such as a temperature sensor, probably produce readings all the time, as long as they are on. Others, like radars and image sensors, might only produce readings at certain time instances.
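
Continuing the sketch from the physical layer, a simple threshold-based detection could look like this (the threshold value is arbitrary):

    def detect(sample, threshold=600):
        # Declare a detection only when the raw sample exceeds a fixed
        # threshold. A temperature sensor would instead report
        # continuously, making this layer trivial.
        return sample > threshold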

Classification layer

There is often a need for a sensor to classify its detections in one way or another. That can for example be a face recognition algorithm in an image sensor, or a tracking filter in a radar or lidar sensor.
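
As a stand-in for a real classifier (face recognition, tracking filter), here is a toy version that just bins the raw value into made-up classes:

    def classify(sample):
        # Toy classifier that bins a raw sample into coarse classes.
        # A real image sensor might run a face recognition algorithm
        # here instead.
        if sample > 900:
            return "strong"
        if sample > 600:
            return "weak"
        return "none"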

Distribution layer

In this layer the sensor data is packaged (typically using some communication protocol) and distributed to the unit that processes it. At this stage the sensor data is converted to some object format, and time stamps or accuracy information may be bundled with the actual readings as well.
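
A minimal sketch of such packaging, assuming a JSON object format (the field names and the accuracy figure are made up):

    import json
    import time

    def package(sample, label):
        # Bundle the reading with a time stamp and (hypothetical)
        # accuracy metadata into an object format for distribution.
        message = {
            "timestamp": time.time(),
            "value": sample,
            "class": label,
            "accuracy": 0.95,  # made-up accuracy figure
        }
        return json.dumps(message)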

Fusion layer

At this layer, data from all sensors is fused to form a coherent picture of the sensed scenario. In an autonomous car this can for example be GPS and lidar data, and the fused representation is then often called an occupancy grid.
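
A heavily simplified occupancy-grid style fusion, assuming each sensor reports its detections as (x, y) cell indices inside the grid:

    def fuse(detections, grid_size=10):
        # Simplified occupancy grid: every (x, y) detection, from any
        # sensor, increments the count of its grid cell. Detections
        # are assumed to lie within the grid bounds.
        grid = [[0] * grid_size for _ in range(grid_size)]
        for x, y in detections:
            grid[y][x] += 1
        return grid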

Decision layer

Once all data is fused into a complete picture, a decision layer processes it to determine how to act on the sensor data.
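
And a toy decision rule on top of the fused grid; the rule itself is arbitrary and only meant to show the layering:

    def decide(grid):
        # Act (e.g. brake) if any cell was reported occupied by more
        # than one sensor, otherwise keep going.
        occupied_twice = any(cell > 1 for row in grid for cell in row)
        return "brake" if occupied_twice else "continue"

Chained together, these small functions trace a single reading from the hardware all the way up to an action, which is really the point of the layer model.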

Conclusion

This is my draft of how a layer model for sensors could be formed. It would be very interesting to hear any comments on it.