Journal of Brief Ideas: Ideas from the last week

Autism microglia research
https://doi.org/10.5281/zenodo.572267
I am forming a group to submit a proposal for an NIH grant on autism. The grant is an R01 (announcement linked below), with a deadline of 09/2019. The main idea is the state of microglia in autism. Two theories underlie this area: white-matter changes reported in previous studies, and the microbiome dysbiosis observed in autism. There is a link between microbiome changes, histamine and microglia that I would like to study. The group so far comprises a pediatric neurologist, an epidemiologist and myself, a pediatrician. We are currently working on the proposal draft and are looking for someone with expertise in neuroscience, especially in autism, and in nuclear medicine for brain-imaging studies in autism. The grant announcement is here: https://grants.nih.gov/grants/guide/pa-files/PA-16-388.html
— Herrera Morban, Demian Arturo

Epidemiology of musculoskeletal disorders and injuries among patients in a tertiary care public hospital
https://doi.org/10.5281/zenodo.439139
Musculoskeletal disorders are a major cause of morbidity; they affect health and quality of life and impose an enormous burden on the healthcare system.
Injury is now a leading cause of mortality and morbidity worldwide. Injuries on roads, at home and in the workplace have progressively increased, reflecting a lack of safety-related policies and programs, or difficulties in implementing them. India is passing through significant urbanization, industrialization and a change in socio-economic values. As a result, the number of automobiles on the roads is rapidly increasing; besides causing pollution, this has led to road traffic accidents becoming a leading public hazard worldwide.
Trauma is an increasingly significant problem in India, and socio-economic and cultural changes are altering its epidemiology here as well.
Among other factors, delayed presentation and neglect are particularly important, and are responsible for the poor outcomes of musculoskeletal trauma and, especially, of neoplasia, which in many cases carries a poor prognosis in any event.
This study aims to assess the epidemiology of musculoskeletal problems and injuries among patients in a tertiary care public hospital in India.
— Patralekh, Dr Mohit Kumar

Human oral microbiome crisis at the end of the 18th century?
https://doi.org/10.5281/zenodo.438198
Scholars have recently shown a clear increase in the frequency of caries and tooth loss in humans after the end of the 18th century [1]. For the authors, changes in nutrition (more sugar) and dental care (possibly a higher frequency of tooth extraction) could be the underlying factors behind this minor-to-moderate shift in dental disease frequencies in Europe. In fact, other major factors may have played a key role in this period [2], causing a modification of the oral microbiome, especially at the level of dental plaque: the introduction of the first vaccinations (smallpox), the appearance and/or increased consumption of new foods (potatoes, coffee, chocolate, tomatoes, maize, etc.) and drugs (tobacco), and the beginning of the industrial revolution (with subsequent changes in the time and mode of work, and the adaptation of human biology to them). Further metagenomic analysis of dental calculus will be necessary to investigate any significant modification of the oral microbiome that could confirm or refute this hypothesis, comparable to the procedures used for the evolutionary stages of pre-humans, but on a wider scale.
References
1. Müller A, Hussein K. Meta-analysis of teeth from European populations before and after the 18th century reveals a shift towards increased prevalence of caries and tooth loss. Arch Oral Biol. 2017;73:7-15.
2. Andrews K. History of medicine: health, medicine and disease in the eighteenth century. Br J 18th Cent Stud. 2011;34:503-15.
— Charlier, Philippe

The Applications of Smart Materials in Hydraulic Machineries: Combination of Piezoelectricity and Bubbles
https://doi.org/10.5281/zenodo.322652
Due to the secondary Bjerknes force, bubbles migrate towards the wall, assume a toroidal shape and then form a high-speed jet, which causes a high impulsive pressure [[1]](https://www.cambridge.org/core/journals/journal-of-fluid-mechanics/article/div-classtitleexperimental-and-numerical-investigation-of-the-dynamics-of-an-underwater-explosion-bubble-near-a-resilientrigid-structurediv/AA698142CB83B24F8535361E7D58A131). Pierre and Jacques Curie, French physicists, discovered piezoelectricity in 1880 [[2]](http://www.science20.com/profile/news_staff). The combination of piezoelectricity and bubbles can be regarded as a smart-material approach to the harmful cavitation phenomena in hydraulic machinery.
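For orientation (my addition, not part of the original idea): the wall-directed migration is usually explained by modelling the rigid wall as an in-phase image bubble, with the time-averaged secondary Bjerknes force between two pulsating bubbles commonly written as

```latex
% Time-averaged secondary Bjerknes force between two bubbles a distance d
% apart (one commonly quoted form; sign conventions vary in the literature):
\bar{F}_{B} \;=\; -\,\frac{\rho}{4\pi d^{2}}\,\bigl\langle \dot{V}_{1}\,\dot{V}_{2} \bigr\rangle
% \rho: liquid density; V_i(t): instantaneous bubble volumes.
% In-phase volume oscillations give attraction, which for a bubble and its
% wall image means migration towards the wall, as described above.
```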
The impulsive pressure of bubble collapse applied to a piezoelectric material induces an internal electrical charge, which creates an electric potential difference. Through this direct piezoelectric effect, cavitation bubbles can be detected in real time, in both occurrence and intensity. The collapsing-bubble/piezoelectric system can thus be treated as a pressure-triggered switch or a feedback sensor.
On the other hand, the shape of a piezoelectric material can be changed by applying an alternating current, via the inverse piezoelectric effect. Bubble migration is known to be strongly influenced by deformable and elastic boundaries [[3]](https://www.cambridge.org/core/journals/journal-of-fluid-mechanics/article/div-classtitledynamics-of-laser-induced-cavitation-bubbles-near-an-elastic-boundarydiv/51F6F1756B9FBE286212289079C7AE39). It should therefore be possible to actively control the direction of bubble migration, and hence suppress cavitation, by tuning the applied alternating current.
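Both effects are captured by the standard strain-charge form of the piezoelectric constitutive relations; a one-dimensional sketch (standard textbook material, added here for reference) is:

```latex
% 1-D strain-charge form of the piezoelectric constitutive relations:
S = s^{E} T + d\,E               % converse effect: field E produces strain S
D = d\,T + \varepsilon^{T} E     % direct effect: stress T produces charge D
% S: strain, T: stress, E: electric field, D: electric displacement;
% s^E: compliance at constant field, d: piezoelectric coefficient,
% \varepsilon^T: permittivity at constant stress. The impulsive collapse
% pressure enters through T (direct effect, sensing); the driving AC
% voltage enters through E (converse effect, actuation).
```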
The combination of piezoelectricity and bubbles will also be valuable in ultrasonic cleaning [[4]](http://www.sciencedirect.com/science/article/pii/S1350417716301699), underwater explosions [[5]](https://www.cambridge.org/core/journals/journal-of-fluid-mechanics/article/div-classtitleexperimental-study-on-bubble-dynamics-subject-to-buoyancydiv/3CA7300075C42FC0161E5D9754C1DD35) and medical therapeutics [[6]](http://www.sciencedirect.com/science/article/pii/S1350417716302917), among other applications.
— Ma, Xiaojian; Huang, Biao; Wang, Guoyu

Exchange of value between USB devices (bitcoin)
https://doi.org/10.5281/zenodo.259516
BOUNDER ##design0.1##
A scheme for secure UTXO exchange between two USB devices (hardware).
Introduction.
While bitcoin and other cryptocurrencies are fairly successful, they suffer from the weakness that they need an internet connection to perform any transaction. The algorithm design I propose is off-chain value exchange between peers in a chain-like manner. The design requires one Atmel crypto chip and rewrites two USB drivers, 'sector1' and 'sector2'.
Configuration of the USB drive.
The problem with exchanging value stored offline is that anyone can create a set of UTXOs and claim that they are unspent. One way to solve this is to have an exchange validate your UTXO set. The UTXO set would be denominated per satoshi, and a secret PIN is given to the owner of the USB device. In this design, USB driver sector 1 asks for validation of every satoshi spent ('pin A'). Driver sector 2 asks for a key unique to this USB drive, proving that the exchange-validated set is actually yours: the exchange validates the UTXO set against the embedded unique key of the USB device, adds a signature over the UTXO set, and writes that value to the USB device. Adding 'pin A' to the UTXO set confirms it is yours to spend. The result is an exchange of value, signed by a set of signatures, happening offline and in a secure way; a minimal sketch of this validation scheme follows.
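The sketch below is purely illustrative: DEVICE_KEY, EXCHANGE_KEY and the helper names are my own stand-ins for the design described above, with HMACs standing in for real signatures.

```python
import hashlib
import hmac
import os

# Hypothetical sketch only: keys and helpers are illustrative stand-ins,
# not the author's actual design.
DEVICE_KEY = os.urandom(32)    # unique key embedded in the USB device
EXCHANGE_KEY = os.urandom(32)  # key held by the validating exchange

def utxo_digest(utxo_set):
    """Hash the whole UTXO set so any tampering changes the digest."""
    h = hashlib.sha256()
    for utxo in sorted(utxo_set):
        h.update(utxo)
    return h.digest()

def exchange_validate(utxo_set):
    """The exchange signs the set together with the device's unique key,
    binding the validated UTXOs to this particular USB device."""
    return hmac.new(EXCHANGE_KEY, utxo_digest(utxo_set) + DEVICE_KEY,
                    hashlib.sha256).digest()

def owner_confirm(exchange_sig, pin_a):
    """Adding 'pin A' confirms the validated set is the owner's to spend."""
    return hmac.new(pin_a.encode(), exchange_sig, hashlib.sha256).digest()

# Example: validate two dummy UTXOs and bind them to the owner's PIN.
utxos = [b"utxo-1", b"utxo-2"]
sig = exchange_validate(utxos)
token = owner_confirm(sig, "1234")
```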
Algorithm design: device X <=> device Y
sector 1, USB driver 1:
boot USB 1 and generate key X1
boot USB 2 and generate key Y1
'pin A' creates multi-signature XY1
.lib1:
the multi-signature is divided and control bits are added
('loop back sector 1'; add the sequence signature to sector 2; the first checkpoint bit is timed)
https://atmelcorporation.wordpress.com/2016/02/22/atmel-launches-the-industrys-first-hardware-interface-library-for-tls-stacks-used-in-iot-edge-node-apps/
sector 2, USB driver 2:
USB 1 generates key X2
USB 2 generates key Y2
check the unique key of each USB device, validating the UTXO set
use multi-signature 2
checkpoint value Z: the time bit ('Atmel time checkpoint') is checked and all is validated
if sequence Z contains a wrong bit, key generation is re-initialised and the peers re-commit value to the exchange
At this point one could loop the entire process and validate the information/value exchanged.
The value of this algorithm lies in the loop sequences of the circuit, where time ('time to hop') serves as the multi-factor authentication (MFA) and the peers are committed to the exchange of data.
If all checks out, the data is exchanged by driver sector1 and driver sector2; a sketch of the timed loop follows below.
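To illustrate the 'time to hop' idea, here is a hypothetical sketch: the Atmel chip is stubbed with a software HMAC, and the time bound and all names are my assumptions, not part of the original design.

```python
import hashlib
import hmac
import os
import time

# Hypothetical stub: a real design would query the Atmel crypto chip for a
# keyed checkpoint; here a software HMAC stands in for it.
CHIP_KEY = os.urandom(32)
MAX_LOOP_SECONDS = 0.5  # assumed bound; the design counts timed checkpoint bits

def chip_checkpoint(payload):
    """Stand-in for the crypto chip's keyed checkpoint over the payload."""
    return hmac.new(CHIP_KEY, payload, hashlib.sha256).digest()

def timed_loop(multisig_xy):
    """One validation loop: recompute the checkpoint and reject the exchange
    if it took longer than expected, which would suggest the USB device was
    swapped mid-exchange."""
    start = time.monotonic()
    checkpoint = chip_checkpoint(multisig_xy)
    elapsed = time.monotonic() - start
    if elapsed > MAX_LOOP_SECONDS:
        return False  # 'time to hop' check failed
    return hmac.compare_digest(checkpoint, chip_checkpoint(multisig_xy))
```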
Hardware.
I imagine this hardware device being solar powered, with a battery able to supply 900 mA at 5 V; an LED screen and three push buttons would suffice for entering any value.
### Example: USB bounder in pseudo-code
1# sector1 USB driver1 {
    start usb1;
    check 'pin A' against the UTXO set;
    if the entire set validates:
        sign multi-signature AX1;
}
2# sector1 USB driver2 {
    start usb2;
    check 'pin A' (or the PIN of usb2) against the UTXO set;
    if the entire set validates:
        sign multi-signature AY1;
}
3# .lib1 {
    generate AXAY1;
    confirm the loop time of driver 1 and driver 2 with AX1 and AY1 added (count bits = time);
    add crypto with the Atmel chip: 'blinded (unlock key = counted bit string/amount and sequence of bits)';
    add lib1 + the blinded key to the circuit;
    blind the unique signing key Z;
}
Steps 1/2/3 need one active loop with a time/bit check against the Atmel chip crypto (otherwise one could simply swap out the USB device).
4# sector2 usb driver2 {
    confirm the unique key from AX1 with the .lib1 clone;
    confirm the unique key from AY1 with the .lib1 clone;
    input the value with the push buttons;
    sign AXAY1 and remove the UTXO from the .lib1 sequence;
    sign a new loop with the Atmel chip's blinded crypto key;
}
5# validating string {
    .lib1 is valid;
    4# sector2 usb driver2 is valid;
    the value extracted from the clone = sector1 USB driver1 + sector1 USB driver2;
    the bit count changed by the expected value;
    the Atmel blinded key came from the onboard chip;
    release the UTXO from USB1 to USB2;
}
6# MFA {
    everything was performed within the expected bit count;
    if .lib1 + 4# + 5# = true:
        write the receiver's 'pin A' onto the exchanged UTXO set;
}
JohnnSebastian@protonmail.com
bitcoin wallet : 1261fKX1aEcEF7dgKyvkCgPwvcHNASwmnB
— Sebastian, John

Reproducible Research with End-to-end Machine Inference Using Deep Learning and Bayesian Statistics
https://doi.org/10.5281/zenodo.203086
Conventional statistical inference based on hypothesis testing and the *p*-value is fundamentally flawed. The general practice of data analysis involves too many post hoc decisions based on *p*-values, which unavoidably violates the assumptions of frequentist statistics and, worse, leads to [*p*-hacking](http://pss.sagepub.com/lookup/doi/10.1177/0956797611417632) and the ["garden of forking paths"](http://www.stat.columbia.edu/~gelman/research/unpublished/p_hacking.pdf). This is especially true for data that requires multiple preprocessing steps. For example, it has been observed that [virtually no two fMRI papers contain an identical analysis pipeline](http://www.sciencedirect.com/science/article/pii/S1053811912007057). Indeed, researchers face many arbitrary choices, such as what kind of smoothing should be used, or which algorithm and which parameter set should be applied for motion correction.
With the advances in deep learning and computational Bayesian methods, it is easier than ever to fit (very) large models and apply [Box's loop](http://www.annualreviews.org/eprint/7xbyci3nwAg5kEttvvjk/full/10.1146/annurev-statistics-022513-115657) to perform model criticism and inference. It is thus possible to apply an end-to-end inference framework, packaging the whole data analysis and statistical inference workflow into a [pipeline](http://scikit-learn.org/stable/modules/cross_validation.html). This allows researchers to quantitatively evaluate the human part of data analysis (e.g., different parameter settings) by directly modelling these arbitrary choices as hyperparameters using Bayesian inference. It is also possible to develop many pipelines with different models and parameters, and then evaluate these pipelines using [cross-validation and prediction](http://scikit-learn.org/stable/tutorial/statistical_inference/model_selection.html). Combined with an open data policy, this framework could greatly improve research reproducibility.
— Lao, Junpeng
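As a minimal illustration of the pipeline idea above (an editorial sketch, not the author's code): preprocessing choices become explicit hyperparameters searched by cross-validated prediction. The synthetic data and the particular scikit-learn estimators are my assumptions.

```python
from sklearn.datasets import make_classification
from sklearn.decomposition import PCA
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import GridSearchCV
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import StandardScaler

# Synthetic data stands in for a real dataset (e.g. preprocessed fMRI features).
X, y = make_classification(n_samples=200, n_features=50, random_state=0)

# The whole workflow lives in one pipeline, so each preprocessing decision
# is an explicit, tunable hyperparameter rather than a post hoc choice.
pipe = Pipeline([
    ("scale", StandardScaler()),
    ("reduce", PCA()),
    ("model", LogisticRegression(max_iter=1000)),
])

# "Arbitrary" analysis choices become part of a searched grid, evaluated by
# cross-validated prediction instead of p-values.
search = GridSearchCV(
    pipe,
    param_grid={
        "reduce__n_components": [5, 10, 20],
        "model__C": [0.1, 1.0, 10.0],
    },
    cv=5,
)
search.fit(X, y)
print(search.best_params_, search.best_score_)
```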
Applying Supervised Machine Learning to the Fight against Online Child Sexual Abuse
https://doi.org/10.5281/zenodo.167646
NGO reports and anecdotal evidence suggest that child abuse materials (CAMs) share certain characteristics to a large extent: they mostly take place in indoor settings, the victim's face or genitalia is visible, and there are few visual clues about the abuser(s).
For known CAMs, there are methods and projects to detect and remove them automatically, such as Microsoft's PhotoDNA and Interpol's Baseline project. However, the detection of new CAMs still relies heavily on outdated and inefficient methods such as reporting by ordinary users.
By thoroughly coding specific attributes of every known CAM, such as indoor/outdoor setting, on bed/couch/table, and visibility of face/breast/genitalia, supervised machine learning can be applied to the detection of new CAMs. As the number of coded known CAMs increases, the algorithm would become better at determining their characteristic features, owing to the similarities mentioned above, and since the databases of Interpol and NCMEC hold millions of known CAMs, the required intervention of human supervisors would decrease. The algorithm might also help detect new CAMs to some extent once all known CAMs have been included. Besides faster victim identification, a successful system would also remarkably decrease the rate at which new CAMs spread across the internet.
— Açar, Kemal Veli
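A schematic sketch of the supervised step described above (an editorial illustration, not the author's system): random binary vectors stand in for the hand-coded attributes, and the attribute count and classifier are my assumptions.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

# Synthetic stand-ins only: each row is one item coded with 12 binary
# attributes (indoor/outdoor, bed/couch/table, visibility flags, ...),
# and the label marks whether it matches the known databases.
rng = np.random.default_rng(0)
X = rng.integers(0, 2, size=(1000, 12))
y = rng.integers(0, 2, size=1000)

# A forest handles binary coded attributes well and exposes which coded
# features drive the decisions, aiding human supervisors.
clf = RandomForestClassifier(n_estimators=200, random_state=0)
print(cross_val_score(clf, X, y, cv=5).mean())
```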
Searching for Dyson spheres with Gaia
https://doi.org/10.5281/zenodo.163758
Assuming that Dyson spheres obscure optical/near-infrared light as grey absorbers, a partial Dyson sphere with a high covering fraction (fcov > 0.5) could reveal itself as an anomalously subluminous star with a parallax-based distance smaller than the spectroscopically inferred one. A large catalog search for objects of this type can currently be carried out by combining parallax distances from [Gaia DR1](https://arxiv.org/abs/1609.04303) with spectroscopic distances from [RAVE DR5](https://arxiv.org/abs/1609.03210). From the best-fitting stellar parameters provided by RAVE, the [PARSEC](http://stev.oapd.inaf.it/cgi-bin/cmd) isochrones on which the RAVE fits are based also provide the predicted [WISE](http://wise2.ipac.caltech.edu/docs/release/allsky/) fluxes for these stars. A comparison to the observed WISE flux may then reveal any infrared excess associated with waste-heat emission from the Dyson sphere at wavelengths < 22 micron. Most of the ~200,000 stars common to both data sets have distance errors too large to be useful for this analysis, but searches among subsets of objects with the highest-quality data and fits could be fruitful. Further cuts can be made to exclude objects which are very young (potentially obscured by debris disks), have left the main sequence (potentially subject to circumstellar dust), or for which the stellar parameters favoured by the RAVE and [RAVE-on](https://arxiv.org/abs/1609.02914) analyses are inconsistent.
— Zackrisson, Erik; Wehrhahn, Ansgar; Korn, Andreas
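To make the subluminosity criterion concrete (my own back-of-envelope sketch, not part of the original idea): blocking a fraction of the starlight inflates the spectroscopic distance relative to the parallax one.

```latex
% If a grey Dyson sphere blocks a fraction f_cov of the starlight, the
% observed flux at true distance d is
F_{\mathrm{obs}} = \frac{(1 - f_{\mathrm{cov}})\,L}{4\pi d^{2}}
% so a spectroscopic distance that assumes the unobscured luminosity L gives
d_{\mathrm{spec}} = \sqrt{\frac{L}{4\pi F_{\mathrm{obs}}}} = \frac{d}{\sqrt{1 - f_{\mathrm{cov}}}}
% e.g. f_cov = 0.5 gives d_spec ~ 1.4 d: the parallax distance is
% noticeably smaller than the spectroscopic one, as described above.
```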
The simple observation by Huygens that disproved Newton's rule for the extraordinary refraction of calcite
https://doi.org/10.5281/zenodo.162990
Huygens (1629–1695), in his <a href="http://www.gutenberg.org/files/14725/14725-h/14725-h.htm"><i>Treatise on Light</i></a> (1690), explained the double refraction of calcite on the correct hypothesis that the secondary waves for the “extraordinary” refraction are <em>spheroidal</em> rather than spherical. His theory explained, <i>inter alia</i>, his simple observation that when a calcite crystal is placed on a horizontal page of text, the ordinary image appears higher than the extraordinary image (Ch. V, <a href="http://www.gutenberg.org/files/14725/14725-h/14725-h.htm#Page_81">Art. 39</a>).
Newton, in Qu. 17 of the Latin edition of his <i>Opticks</i> (1706), or Qu. 25 (in <a href="http://www.newtonproject.sussex.ac.uk/view/texts/diplomatic/NATP00051">Bk. 3</a>) of later English editions, gave his own ‘rule’ for double refraction. Let a ray from an external point <i>L</i> strike one face of a calcite crystal at point <i>P</i>, and be refracted to point <i>O</i> on the opposite face by the <i>O</i>rdinary refraction, and to point <i>E</i> on the opposite face by the <i>E</i>xtraordinary refraction. According to Newton's rule, the displacement vector <i>OE</i> does not vary with the direction of the incident ray.
It is <a href="http://www.grputland.com/2016/10/observation-by-huygens-that-should-have-discredited-newtons-rule.html">easily shown</a> that if this rule were true, the two text images reported by Huygens would have been at the same height: since <i>O</i> and <i>E</i> both lie on the same face, a constant <i>OE</i> would merely shift the extraordinary image sideways relative to the ordinary one, leaving their apparent heights equal.
Newton's rule was disproven in 1788. That it had already been disproven by a single qualitative observation, before it was even published, seems to have escaped notice.
— Putland, Gavin

Carbon concentrations in phytoliths and carbon sequestration in soil
https://doi.org/10.5281/zenodo.154351
In recent years there has been much work on the possibility that carbon could be sequestered in phytoliths and so contribute to mitigating global warming.<sup>[1](http://onlinelibrary.wiley.com/doi/10.1111/j.1365-2486.2009.02118.x/full), [2](http://link.springer.com/article/10.1007/s11368-016-1527-x)</sup> There have been a number of calculations of how much carbon is sequestered globally. The crucial figure in these calculations is the assumed concentration of carbon in the phytoliths, the so-called phytolith-occluded carbon. These figures are derived from phytoliths that have been acid-digested or dry-ashed at 450-500 °C, and the values vary from less than 0.1% to 6% depending on the technique used. But are any of these measurements realistic? In native, unprocessed material, the lemma macrohair from *Phalaris canariensis* contained 40% silica, 55% polysaccharides and 5% proteins.<sup>[3](http://www.sciencedirect.com/science/article/pii/S0176161787800287)</sup> At maturity these hairs consist entirely of wall material. I know of no data for lumen deposits, which undoubtedly have a higher percentage of silica in the native state. Some of the carbon in both cell-wall and lumen phytoliths cannot be accessed by hot acid or dry ashing, and I suspect that more carbon is inaccessible to breakdown processes in soil than is estimated by our “occluded carbon” determinations. My question is whether these measurements are a good estimate of what is present in soil phytoliths.
— Hodson, Martin