Encrypted code: Ephaptic transmission

 
| autore3 =Giorgio Cruccu
}}
'''Abstract:''' In this chapter, we explore the cognitive process and methodology used to diagnose a complex neuromotor condition, Hemimasticatory Spasm, in a patient referred to as "Mary Poppins." Despite the apparent simplicity of the diagnosis, the complexity of biological systems and the limitations of traditional deterministic medical approaches led to a decade-long delay in identifying the correct condition. This case underscores the need for advanced diagnostic models capable of integrating multiple variables and probabilistic approaches.


==Introduction==
We introduce the Cognitive Neural Network (CNN), a clinician-guided model designed to support diagnosis by iteratively analyzing input data and generating context-based outputs. Unlike machine learning models, the CNN is not reliant on pre-trained data but instead adapts based on real-time cognitive processes initiated by the clinician, making it more suited for complex, real-world clinical environments. The CNN model assists the clinician in decrypting the Central Nervous System's (CNS) encrypted signals by guiding the diagnostic process through a series of iterative cognitive steps, ultimately leading to the identification of the patient's condition.
 
Through this model, the chapter demonstrates how integrating a dynamic neural network model with human clinical expertise can overcome the limitations of classical probability models, such as those anchored in verbal symptoms and deterministic approaches. The chapter highlights the importance of quantum probability in understanding complex biological systems and discusses how the CNN, guided by key elements such as the "<math>\tau</math> Coherence Demarcator," helps to differentiate between competing diagnostic hypotheses.
 
Using the Hemimasticatory Spasm case as an example, the CNN is presented as a tool that leverages both cognitive intuition and structured data analysis to produce a final, accurate diagnosis. This model lays the groundwork for broader applications in clinical practice, where cognitive processes are central to handling complex diagnostic challenges.
==Introduction==
In the chapter '[[1° Clinical case: Hemimasticatory spasm - en|1st Clinical case: Hemimasticatory spasm]]' we reached a conclusion immediately, bypassing the whole cognitive, clinical and scientific process that underlies the diagnostic definition; but it is not that simple, otherwise our poor patient Mary Poppins would not have had to wait 10 years for the correct diagnosis.<blockquote>It should be emphasized that this is not a question of negligence on the part of clinicians, but rather of the complexity of 'biological systems' and, above all, of a mindset still anchored to 'classical probability'. 'Classical probability' categorizes healthy and diseased phenotypes according to the symptoms and signs sampled by clinicians, instead of probing the 'State' of the system in its temporal evolution. This concept, anticipated in the chapters '[[Logic of medical language: Introduction to quantum-like probability in the masticatory system]]' and '[[Conclusions on the status quo in the logic of medical language regarding the masticatory system]]', has laid the foundations for a medical language that is more articulated and less deterministic, focused mainly on the 'State' of the 'Mesoscopic System', whose purpose is, essentially, to decrypt the machine-language message generated by the Central Nervous System. We will see this at work in the other clinical cases reported in the next Masticationpedia chapters.</blockquote>This model, which we propose under the term 'Cognitive Neural Network', abbreviated 'CNN', is a dynamic cognitive process in which the clinician interrogates the network for self-training. The 'CNN' is not 'Machine Learning': while the latter must be trained by the clinician, with statistical and prediction adjustments, the 'CNN' trains the clinician, or rather directs the clinician towards the diagnosis while being continually questioned according to human logic; hence the term 'Cognitive'.


In essence, the encrypted machine-language message sent out by the Central Nervous System during the 10 years of illness of our patient Mary Poppins was interpreted, through verbal language, as 'Orofacial Pain from Temporomandibular Disorders'. We have remarked several times, however, that human verbal language is distorted by vagueness and ambiguity; not being a formal language such as mathematical language, it can generate diagnostic errors. The machine-language message sent out by the Central Nervous System that must be searched for is not pain (pain belongs to verbal language) but the 'Anomaly of the System State' in which the organism found itself in that time period. Hence the shift from the semiotics of symptoms and clinical signs to '[[System logic|System Logic]]' which, through 'Systems Theory' models, quantifies the system's responses to incoming stimuli, even in healthy subjects.


This whole conceptual framework is replicated in the proposed 'CNN' model by dividing the process into incoming triggers (Input) and outgoing data (Output), which are then reiterated in a loop managed cognitively by the clinician until a single node useful for the definitive diagnosis is generated. The model basically breaks down as follows:


*'''Input:''' By incoming trigger we mean the cognitive process that the clinician implements as a function of the considerations derived from previous statements, as pointed out in the chapters concerning the 'Logic of medical language'. In our case, through the '<math>\tau</math> Coherence Demarcator', the neurological context was defined as the suitable one, instead of the dental context that pursued a diagnostic explanation based on TMDs.
**This trigger is of essential importance because it allows the clinician to set the 'Initialization command' of the network analysis, which will connect a large sample of data corresponding to the chosen trigger. To this essential 'Initialization command', which acts as an algorithmic decryption key, is added the final closing command, which is equally important because it depends on the intuition of the clinician, who will judge the decryption process to be finished.
**Figure 1 represents the structure of the 'CNN', where the difference from the most common neural network architectures can be noted: in those, the first stage is structured with a high number of input variables, whereas in our 'CNN' the first stage corresponds to a single node, namely the 'Initialization command' of the network analysis called '<math>\tau</math> Coherence Demarcator'. The subsequent loops of the network, which allow the clinician to terminate or to reiterate the analysis (1st loop open, 2nd loop open, ... nth loop open), are decisive for concluding the decryption process (Decrypted Code). This step will be explained in more detail later in the chapter.
[[File:Immagine 17-12-22 alle 11.34.jpeg|center|500x500px|'''Figure 1:''' Graphical representation of the 'CNN' proposed by Masticationpedia|thumb]]
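The Input/Output loop described above can be sketched in code. The following Python sketch is purely illustrative: the names (<code>run_cnn</code>, <code>query_network</code>, <code>clinician_accepts</code>) and the toy candidate-halving query are assumptions of this example, not part of any actual Masticationpedia implementation; the clinician's judgement is modelled here as a simple stopping predicate.

```python
# Hypothetical sketch of the 'CNN' loop: a single initialization node,
# then clinician-controlled iterations until one diagnostic node remains.
def run_cnn(tau_demarcator, query_network, clinician_accepts, max_loops=10):
    """Iterate clinician-guided queries until a closing command fires."""
    context = tau_demarcator                       # 'Initialization command'
    for loop in range(1, max_loops + 1):
        candidates = query_network(context, loop)  # network response for this loop
        if clinician_accepts(candidates):          # closing command (clinician's call)
            return candidates                      # 'Decrypted Code': final diagnosis
        context = candidates                       # reiterate with a narrowed context
    return None                                    # undecided after max_loops

# Toy demonstration (illustrative data): each loop halves the candidate set,
# and the clinician stops when a single node remains.
hypotheses = ["hemimasticatory spasm", "TMD", "neuropathic pain", "dystonia"]
result = run_cnn(
    tau_demarcator=hypotheses,
    query_network=lambda ctx, loop: ctx[: max(1, len(ctx) // 2)],
    clinician_accepts=lambda c: len(c) == 1,
)
```

In this sketch the 'Initialization command' is the context set by the <math>\tau</math> Coherence Demarcator and the closing command is the acceptance test; in reality both are cognitive acts of the clinician, not functions.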
<math>+2\sum_{\alpha_1<\alpha_2}\cos\theta_{\alpha_1\alpha_2}\sqrt{P(A=\alpha_1)P(B=\beta|A=\alpha_1)P(A=\alpha_2)P(B=\beta|A=\alpha_2)}</math><blockquote>Just as the lack of part of the binary code corrupts the representation of the formula, so an incomplete decryption of the machine language of the CNS is a source of vagueness and ambiguity in verbal language and, in context, of diagnostic error.</blockquote>
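The interference term above can be made concrete with toy numbers. The following Python sketch compares the classical law of total probability with its quantum-like counterpart for a binary context <math>A</math>; all probability values and the phase <math>\theta</math> are assumed purely for illustration, not clinical data.

```python
import math

# Assumed toy values (not clinical data)
theta = math.pi / 3                    # interference phase theta_{alpha1 alpha2}
pA = {1: 0.6, 2: 0.4}                  # P(A = alpha_i)
pB_given_A = {1: 0.7, 2: 0.2}          # P(B = beta | A = alpha_i)

# Classical law of total probability: sum_i P(A=alpha_i) P(B|A=alpha_i)
classical = sum(pA[a] * pB_given_A[a] for a in pA)

# Quantum-like correction: 2 cos(theta) sqrt(product of the four factors)
interference = 2 * math.cos(theta) * math.sqrt(
    pA[1] * pB_given_A[1] * pA[2] * pB_given_A[2]
)
quantum_like = classical + interference
```

With <math>\theta=\pi/3</math> the interference term is positive, so the quantum-like probability exceeds the classical value; with <math>\theta=\pi/2</math> it vanishes and the classical law is recovered.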


===Cognitive process===
----The heart of the 'CNN' model lies in the cognitive process of the clinician, who is at the helm; the network remains the compass that warns of going off course and/or suggests alternative routes, but decision-making responsibility always rests with the clinician (the human mind). As will become clearer by the end of the chapter, the synergy between the 'Neural network' and the clinician's 'Human cognitive process' is self-implementing: on the one hand, the clinician is trained, or better guided, by the neural network (database); on the other, the network is updated with the latest scientific-clinical event. In essence, the definitive diagnosis adds a further piece of information to the temporal knowledge base <math>Kb_t</math>. This model differs substantially from 'machine learning', as can be seen simply by comparing the two models in their structural configuration (Figures 1 and 3).
[[File:Joim12822-fig-0004-m.jpeg|alt=|left|thumb|200x200px|'''Figure 3:''' Graphic representation of an archetypal ANN, in which the first initialization stage contains five input nodes,<ref name=":1">G S Handelman, H K Kok, R V Chandra, A H Razavi, M J Lee, H Asadi. eDoctor: machine learning and the future of medicine. J Intern Med. 2018 Dec;284(6):603-619. doi: 10.1111/joim.12822. Epub 2018 Sep 3.</ref> whereas in the 'CNN' model the first stage is composed of only one node. See text.]]Figure 3 shows a typical neural network, also known as an artificial NN (ANN). Artificial NNs use multiple layers of calculations to mimic how the human brain interprets and draws conclusions from information.<ref name=":1" /> NNs are essentially mathematical models designed to handle complex and disparate information, and the algorithm's nomenclature comes from its use of synapse-like "nodes", as in the brain.<ref>Schwarzer G, Vach W, Schumacher M. On the misuses of artificial neural networks for prognostic and diagnostic classification in oncology. Stat Med 2000; 19: 541–61.</ref> The learning process of a NN can be supervised or unsupervised. A neural network learns in a supervised manner if the desired output is already targeted and introduced into the network through training data, while an unsupervised NN has no such pre-identified target outputs, the goal being to group similar units close together in certain areas of the range of values. The supervised module takes data (e.g., symptoms, risk factors, imaging and laboratory findings) for training on known outcomes and searches among combinations of variables to find the most predictive one. The NN assigns more or less weight to certain combinations of nodes to optimize the predictive performance of the trained model.<ref>Abdi H. A neural network primer. J Biol Syst 1994; 02: 247–81.</ref>
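The supervised weighting just described can be illustrated with a minimal single-node example. This Python sketch uses a classic perceptron-style update, chosen here only to show how known outcomes drive weight adjustment; the toy data and learning rate are assumptions of this example, not taken from the cited papers.

```python
# Minimal single-node supervised learner: known outcomes (labels) drive
# the adjustment of one weight per input feature, plus a bias term.
def train(samples, labels, lr=0.1, epochs=50):
    """Perceptron-style training on labelled samples."""
    w = [0.0] * len(samples[0])
    b = 0.0
    for _ in range(epochs):
        for x, y in zip(samples, labels):
            pred = 1 if sum(wi * xi for wi, xi in zip(w, x)) + b > 0 else 0
            err = y - pred                 # supervised signal: known outcome - prediction
            w = [wi + lr * err * xi for wi, xi in zip(w, x)]
            b += lr * err
    return w, b

# Toy training set (illustrative): outcome is positive only when both
# binary "findings" are present, an AND-like pattern.
samples = [(0, 0), (0, 1), (1, 0), (1, 1)]
labels = [0, 0, 0, 1]
w, b = train(samples, labels)
```

After training, the learned weights reproduce the AND-like labelling: the combination of both inputs has been weighted up, which is the essence of the "more or less weight to certain combinations of nodes" mechanism described above.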
But let us see in detail how a 'CNN' is built.


==Cognitive Neural Network==
In this section we explain the clinical process followed with the support of the 'CNN', tracing step by step the cognitive queries submitted to the network and the cognitive analysis performed on the data returned by the network. The map is shown in Figure 4, with links to the network responses that can be viewed for fuller documentation:

