Dissertations / Theses: 'Alpha-information' – Grafiati (2024)



Author: Grafiati

Published: 1 June 2024

Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles

Consult the top 24 dissertations / theses for your research on the topic 'Alpha-information.'

Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever it is available in the metadata.

Browse dissertations / theses on a wide variety of disciplines and organise your bibliography correctly.

1

Perlis, Michael Lloyd. "Alpha sleep and information processing, arousal and perception of sleep in fibromyalgia." Diss., The University of Arizona, 1994. http://hdl.handle.net/10150/186882.


Abstract:

This dissertation project was designed to examine the relationship between alpha sleep and information processing during sleep in persons with fibromyalgia. The study tested the following hypotheses: (1) persons with alpha sleep are more sensitive to external stimuli during sleep than non-alpha sleepers; (2) alpha sleepers are more likely to identify polysomnographically defined sleep as wakefulness than non-alpha sleepers; and (3) alpha sleepers are more likely to complain of shallow and non-restorative sleep than non-alpha sleepers. To assess the extent to which subjects manifested alpha sleep, subjects were allowed to sleep undisturbed for the first 60 minutes of the study. Quantitative analyses of alpha activity during this period were performed via visual assessment, signal detection, and power spectral techniques. After this period elapsed, two experimental tasks were conducted to test for information processing and memory during sleep. The first task was a test of both implicit and explicit memory for auditory stimuli presented during sleep. The second task was a test of short-term memory and of subjective perception of sleep during polysomnographically defined stages of sleep. It was found that alpha activity occurring during sleep in fibromyalgia patients is not associated with increased long-term memory for auditory events or with the myalgia symptoms of fibromyalgia, but is associated with enhanced short-term memory for stimuli presented during stage 2 sleep, the tendency to identify stage 2 sleep as wakefulness, an increased tendency to arouse in response to auditory stimuli, and the perception of shallow sleep.
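The power spectral side of such an analysis can be illustrated with a minimal sketch: the fraction of signal power in the alpha band (8-12 Hz). Everything below is hypothetical (sampling rate, synthetic signal, function names), and a naive DFT stands in for the study's actual signal-detection and spectral tools:

```python
import math
import cmath

def band_power(signal, fs, f_lo, f_hi):
    """Fraction of total (non-DC) spectral power in [f_lo, f_hi] Hz,
    computed with a naive DFT -- fine for short illustrative signals."""
    n = len(signal)
    spectrum = [abs(sum(signal[t] * cmath.exp(-2j * math.pi * k * t / n)
                        for t in range(n))) ** 2
                for k in range(n // 2)]
    total = sum(spectrum[1:])                      # skip the DC bin
    band = sum(p for k, p in enumerate(spectrum)
               if k > 0 and f_lo <= k * fs / n <= f_hi)
    return band / total

# Synthetic 2-second EEG-like trace: a 10 Hz "alpha" tone plus a weak 2 Hz drift.
fs = 128
sig = [math.sin(2 * math.pi * 10 * t / fs)
       + 0.3 * math.sin(2 * math.pi * 2 * t / fs)
       for t in range(2 * fs)]

alpha_fraction = band_power(sig, fs, 8.0, 12.0)    # dominated by the 10 Hz tone
```

In practice one would use Welch-style averaging on real EEG epochs; the point here is only the band-power quantity itself.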

2

Bruschi, David Edward. "To Alpha Centauri in a box and beyond : motion in Relativistic Quantum Information." Thesis, University of Nottingham, 2012. http://eprints.nottingham.ac.uk/12786/.


Abstract:

This work focuses on two main aspects of interest within the field of Relativistic Quantum Information. We first expand on the current knowledge of the effects of relativity on entanglement between global field modes. Within this aspect, we focus on two topics: we address and revise the single mode approximation commonly used in the literature, and we study the nonlocal correlations of charged bosonic field modes and the degradation of entanglement initially present in maximally entangled states as a function of acceleration, when one observer is accelerated. In the second part of this work we introduce, develop, and exploit a method for confining quantum fields within one (or two) cavities and analyzing the effects of the motion of one cavity on the entanglement initially present between cavity field modes. One cavity is always allowed to undergo arbitrary trajectories composed of segments of inertial motion and uniform acceleration. We investigate how entanglement is degraded, conserved, and created as a function of the parameters describing the motion, and we provide the analytical tools to understand how these effects occur. We conclude this work by analyzing the effects of the change of spatial topology on the nonlocal correlations present in the Hawking-Unruh radiation in the topological geon analogue of black hole spacetimes.

3

Palacios, Arnold Raul. "Role of GSK-3 alpha beta in B cell proliferation during germinal center formation." Thesis, Boston University, 2013. https://hdl.handle.net/2144/21229.


Abstract:

Thesis (M.A.) PLEASE NOTE: Boston University Libraries did not receive an Authorization To Manage form for this thesis or dissertation. It is therefore not openly accessible, though it may be available by request. If you are the author or principal advisor of this work and would like to request open access for it, please contact us at open-help@bu.edu. Thank you.
Glycogen Synthase Kinase-3αβ (GSK-3αβ) is an enzyme involved in cell cycle regulation, promoting the degradation of cyclin D1 and cyclin D3 in cells. Special emphasis is placed on its regulatory role in B cells, as there is evidence suggesting that this protein is inhibited during germinal center formation, where B cells undergo proliferation, somatic hypermutation, and class switch recombination. By inducing DNA recombination via the Cre/loxP recombination system and utilizing tamoxifen as a Cre activity inducer, B cells were cultured with 40LB cells to form induced germinal centers in vitro. Flow cytometry analysis suggests that in the absence of GSK-3αβ, B cells proliferate extensively in germinal centers and begin the process of class switch recombination. Although the results of this study are in accord with current theory, further experiments and research are needed to validate the conclusions set forth in this study.

4

Cheng, Wei. "What can information guess? Towards information leakage quantification in side-channel analysis." Electronic Thesis or Diss., Institut polytechnique de Paris, 2021. http://www.theses.fr/2021IPPAT044.


Abstract:

Cryptographic algorithms are nowadays prevalent in establishing secure connectivity in our digital society. Such computations handle sensitive information like encryption keys, which are usually very exposed during manipulation, resulting in a huge threat to the security of the sensitive information concealed in cryptographic components. In the field of embedded systems security, side-channel analysis is one of the most powerful techniques against cryptographic implementations. The main subject of this thesis is the measurable side-channel security of cryptographic implementations, particularly in the presence of random masking. Overall, this thesis consists of two topics. One is the leakage quantification of the most general form of masking equipped with linear codes, so-called code-based masking; the other is the exploration of applying more generic information measures in a context of side-channel analysis. The two topics are inherently connected in assessing and enhancing the practical security of cryptographic implementations. Regarding the former, we propose a unified coding-theoretic framework for measuring the information leakage in code-based masking. Specifically, our framework builds formal connections between coding properties and leakage metrics in side-channel analysis. Those formal connections enable us to push forward the quantitative evaluation of how the linear codes can affect the concrete security of all code-based masking schemes. Moreover, relying on our framework, we consolidate code-based masking by providing the optimal linear codes in the sense of maximizing the side-channel resistance of the corresponding masking scheme. Our framework is finally verified by attack-based evaluation, where the attacks utilize maximum-likelihood based distinguishers and are therefore optimal.
Regarding the latter, we propose to utilize a more general information-theoretic measure, namely alpha-information of order alpha, a generalization of (Shannon) mutual information, for assessing side-channel security. The new measure gives an upper bound on the success rate and a lower bound on the number of measurements. More importantly, with proper choices of alpha, alpha-information provides very tight bounds; in particular, when alpha approaches positive infinity, the bounds become exact. As a matter of fact, maximum-likelihood based distinguishers will converge to the bounds. Therefore, we demonstrate how the two worlds, information-theoretic measures (bounds) and maximum-likelihood based side-channel attacks, are seamlessly connected in side-channel analysis. In summary, our study in this thesis pushes forward the evaluation and consolidation of the side-channel security of cryptographic implementations. From a protection perspective, we provide a best-practice guideline for the application of code-based masking. From an evaluation perspective, the application of alpha-information enables practical evaluators and designers to have a more accurate (or even exact) estimation of the concrete side-channel security level of their cryptographic chips.
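Alpha-information measures build on Rényi's generalization of Shannon information. A minimal sketch of the order-alpha divergence, and of how large alpha approaches the worst-case quantity that maximum-likelihood distinguishers attain, is shown below with purely illustrative distributions (not from the thesis):

```python
import math

def renyi_divergence(p, q, alpha):
    """Rényi divergence of order alpha (in bits) between two discrete
    distributions -- a building block of alpha-information measures."""
    assert alpha > 0 and alpha != 1.0
    s = sum(pi ** alpha * qi ** (1.0 - alpha)
            for pi, qi in zip(p, q) if pi > 0)
    return math.log2(s) / (alpha - 1.0)

p = [0.7, 0.2, 0.1]          # e.g. an attacker's posterior over a key chunk
q = [1 / 3, 1 / 3, 1 / 3]    # uniform reference distribution

d2 = renyi_divergence(p, q, 2.0)
d100 = renyi_divergence(p, q, 100.0)
# As alpha -> infinity the divergence climbs toward log2(max_i p_i / q_i),
# mirroring how the alpha-information bounds become exact in that limit.
limit = math.log2(max(pi / qi for pi, qi in zip(p, q)))
```

The divergence is non-decreasing in alpha here, and at alpha = 100 it already sits within a few thousandths of a bit of the limiting value.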

5

Hedin, Stenmark Olof. "Strategy Mapping : The Intended Effects of an Investment in Information Systems - A Case Study on Alpha AB." Thesis, Internationella Handelshögskolan, Högskolan i Jönköping, IHH, Företagsekonomi, 2011. http://urn.kb.se/resolve?urn=urn:nbn:se:hj:diva-15193.


Abstract:

Problem: Deciding where to spend the information system investment budget with respect to strategic priorities is important for companies, but far from every company does it. There is no consensus in the literature on how information system investment decisions should be assessed. This case study is conducted at Alpha AB. The company consists of several departments that each require different kinds of information to support, improve, and facilitate their area of operation, and it is in the process of deciding where to spend its investment budget. Purpose: The purpose of this thesis is to identify the processes that will be improved by a new information system and to display how it affects the overall goal of Alpha AB, using the strategy map and the balanced scorecard. This will enable decision makers to make better decisions regarding information system investments. Theoretical framework: To fulfil the purpose, a review of how information systems improve companies has been carried out, and investment issues have been discussed. The strategy map is used to display the effects of the intended investment, and the balanced scorecard is introduced for performance measurement. Empirical findings: The empirical findings are collected from four high-level employees at Alpha AB. Through unstructured and semi-structured interviews, information about the company, its goals, customers, internal processes, and hopes for the new information system has been collected. Analysis: The empirical findings have been compiled on the strategy map with the intention of showing the causal relationships that the intended investment will have on the company. To quantify the targets on which the investment is supposed to cause its effects, the empirical findings have also been compiled onto a balanced scorecard. Conclusion: The strategy map and the balanced scorecard display the intended effects that the investment causes. The decision makers at Alpha AB can, with a holistic view, follow the causal relationships from the expectations of the investment and see which of them will be a priority to invest in.

6

Ferm, Johan. "Europa's Lyman-Alpha Shadow on Jupiter." Thesis, KTH, Skolan för elektroteknik och datavetenskap (EECS), 2020. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-285567.


Abstract:

Europa is one of the most interesting satellites in the solar system in the search for extra-terrestrial life, as it harbours an interior water ocean under its icy surface. Water vapour in Europa's atmosphere has been previously observed, suggesting water plume eruptions from the surface. These plumes could potentially originate from the subsurface ocean, and as such contain ocean constituents that can be examined in orbit. Two observations of Europa's far-ultraviolet shadow on Jupiter were made by the Hubble Space Telescope in 2018 and 2019. It was observed in Lyman-α (1216 Å), a spectral line of hydrogen. This study investigates the imaged Lyman-α shadow in search of potential plumes at the shadow limb. Examining the shadow instead of the moon itself is a new method of remotely studying the Europan atmosphere. Forward modelling is applied to create artificial images that are compared to the observations. Any anomalies around the shadow limb are then analysed and evaluated for their statistical significance. Two noteworthy outliers are found at the limb (one on each occasion), corresponding to H2O line-of-sight column densities of 3.07 × 10^17 cm^-2 and 4.72 × 10^16 cm^-2 for the 2018 and 2019 observations, respectively. They are not significant, however, as they lie within three standard deviations of the expected value (< 3σ). An upper limit on the column density detectable in the data is computed, yielding 6.71 × 10^16 cm^-2 (using only 2019 data due to a weak signal on the 2018 occasion). A constraint on the maximum possible H2O column density at Europa is thus provided. The new method is shown to be useful for the intended purpose and could potentially be applied to other icy moons.

7

Eng, Stefan. "Heuristisk profilbaserad optimering av instruktionscache i en online Just-In-Time kompilator." Thesis, Linköping University, Department of Computer and Information Science, 2004. http://urn.kb.se/resolve?urn=urn:nbn:se:liu:diva-2452.


Abstract:

This master's thesis examines the possibility of heuristically optimising instruction cache performance in a Just-In-Time (JIT) compiler.

Programs that do not fit inside the cache all at once may suffer from cache misses as a result of frequently executed code segments competing for the same cache lines. A new heuristic algorithm, LHCPA, was created to place frequently executed code segments so as to avoid cache conflicts between them, reducing the overall cache misses and the resulting performance bottlenecks. Set-associative caches are taken into consideration, not only direct-mapped caches.

In Ahead-Of-Time (AOT) compilers, the problem of frequent cache misses is often avoided by using call graphs derived from profiling together with more or less complex algorithms to estimate the performance of different placement approaches. This often results in heavy computation during compilation, which is not acceptable in a JIT compiler.

A case study is presented on an Alpha processor and a JIT compiler developed at Ericsson. The results of the case study show that cache performance can be improved using this technique, but also that many other factors influence cache performance, for example whether the cache is set-associative or not, and especially the size of the cache.
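The placement idea described above, keeping hot code segments from overloading the same cache sets, can be sketched as a greedy toy. This is an illustration of the general technique, not the LHCPA algorithm from the thesis; the cache parameters and segment names are assumptions:

```python
# Illustrative sketch: greedily pick a start address for each hot code segment
# so that the cache sets it touches are not already claimed by more placed
# segments than the associativity allows.

LINE = 64     # cache line size in bytes (assumed)
SETS = 128    # number of cache sets (assumed)
ASSOC = 2     # 2-way set-associative (assumed)

def sets_used(start, size):
    """Cache sets touched by a segment of `size` bytes placed at `start`."""
    first, last = start // LINE, (start + size - 1) // LINE
    return {ln % SETS for ln in range(first, last + 1)}

def place(segments):
    """Greedy placement over line-aligned offsets (hottest segments first)."""
    placed, pressure = {}, {}          # name -> start, set index -> count
    cursor = 0
    for name, size in segments:
        chosen = cursor
        for k in range(SETS):          # bounded search for a conflict-free slot
            start = cursor + k * LINE
            if all(pressure.get(s, 0) < ASSOC for s in sets_used(start, size)):
                chosen = start
                break
        for s in sets_used(chosen, size):
            pressure[s] = pressure.get(s, 0) + 1
        placed[name] = chosen
        cursor = chosen + size
    return placed

hot = [("interp_loop", 4096), ("gc_scan", 4096), ("dispatch", 2048)]
layout = place(hot)
```

With these toy parameters the three segments fit without sliding; shrinking `ASSOC` to 1 forces the search to skip ahead, which is the conflict-avoidance effect the thesis targets.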

8

Jazayeri, Jahangir. "Ly-α Dayglow on Uranus : Radiative Transfer Modelling." Thesis, KTH, Skolan för elektroteknik och datavetenskap (EECS), 2021. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-292992.


Abstract:

Uranus is one of the least explored planets in our solar system. Even though the Uranian Ly-α emission has been a subject of study for decades, there is no consensus on the sources' contributions to the total signal. This thesis aims to analyse the contribution from scattering of the solar flux to the Uranian Ly-α dayglow by solving the radiative transfer equation in a parameter study of the atmosphere. The sources are solar Ly-α resonant scattering and Rayleigh scattering, by atomic and molecular hydrogen respectively. The radiative transfer equation is solved using the Feautrier Method Program, a program written by Randall G. Gladstone. The program was adjusted to the Uranian atmosphere and run with different variations in parameters, including the atmospheric temperature and the particle densities of Ly-α scatterers and absorbers. A parameter study is performed to investigate the dependence of the Ly-α signal on these parameters. The results showed a significant Ly-α limb brightening, with a maximum intensity located around 400 km outside one planetary radius as seen from the disk center. The contribution to the Ly-α dayglow from Rayleigh scattering by H2 was calculated to be 160 R, whereas the contribution from resonant scattering by H was 550 R. One feature that prevents direct comparison with observed data is that some sources that contribute to the Uranian Ly-α signal are omitted in the simulations.

9

Akintola, Abayomi Rasheed. "User Adoption of Big Data Analytics in the Public Sector." Thesis, Linnéuniversitetet, Institutionen för informatik (IK), 2019. http://urn.kb.se/resolve?urn=urn:nbn:se:lnu:diva-86641.


Abstract:

The goal of this thesis was to investigate the factors that influence the adoption of big data analytics by public sector employees, based on an adapted Unified Theory of Acceptance and Use of Technology (UTAUT) model. A mixed method of survey and interviews was used to collect data from employees of a Canadian provincial government ministry. The results show that performance expectancy and facilitating conditions have significant positive effects on the adoption intention of big data analytics, while effort expectancy has a significant negative effect; social influence does not have a significant effect on adoption intention. In terms of moderating variables, the results show that gender moderates the effects of effort expectancy, social influence, and facilitating conditions; data experience moderates the effects of performance expectancy, effort expectancy, and facilitating conditions; and leadership moderates the effect of social influence. The moderation effects of age on performance expectancy and effort expectancy are significant only for employees in the 40 to 49 age group, while the moderation effect of age on social influence is significant for employees aged 40 and over. Based on the results, implications for public sector organizations planning to implement big data analytics were discussed and suggestions for further research were made. This research contributes to existing studies on the user adoption of big data analytics.

10

Carlsson, Fredrik, and Joey Öhman. "AlphaZero to Alpha Hero : A pre-study on Additional Tree Sampling within Self-Play Reinforcement Learning." Thesis, KTH, Skolan för elektroteknik och datavetenskap (EECS), 2019. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-259200.


Abstract:

In self-play reinforcement learning an agent plays games against itself and, with the help of hindsight and retrospection, improves its policy over time. Using this premise, AlphaZero famously managed to become the strongest known Go, Shogi, and Chess entity by training a deep neural network from data collected solely from self-play. AlphaZero couples this deep neural network with a Monte Carlo Tree Search algorithm that drastically improves the network's initial policy and state evaluation. When training, AlphaZero relies on the final outcome of the game for the generation of training labels. By altering the learning target to instead make use of the improved state evaluation acquired after the tree search, the creation of training labels for states exclusively visited by tree search becomes possible. We propose the extension of Additional Tree Sampling that exploits this change of learning target, and provide theoretical arguments and counterarguments for the validity of the approach. Further, an empirical analysis is performed on the game Connect Four, which yields results that justify the change in learning target. The altered learning target seems to have no negative impact on the final player strength nor on the behavior of the learning algorithm over time. Based on these positive results we encourage further research on Additional Tree Sampling in order to validate or reject the usefulness of this method.
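The change of learning target can be sketched with hypothetical data structures (not the authors' code): instead of labelling only the states on the played line with the final outcome z, every state visited by the search is labelled with its own post-search value estimate, which exists even for states never actually played:

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class Node:
    state: str
    v_mcts: float                  # value estimate after the tree search
    on_game_path: bool             # was this state actually played?
    children: List["Node"] = field(default_factory=list)

def outcome_labels(root, z):
    """AlphaZero-style targets: only states on the played line, labelled z."""
    out, node = [], root
    while node is not None:
        if node.on_game_path:
            out.append((node.state, z))
        node = next((c for c in node.children if c.on_game_path), None)
    return out

def tree_labels(root):
    """Additional-tree-sampling-style targets: every searched state,
    labelled with its own post-search value estimate."""
    out, stack = [], [root]
    while stack:
        n = stack.pop()
        out.append((n.state, n.v_mcts))
        stack.extend(n.children)
    return out

# Tiny search tree: s0 -> s1 was played; s2 and s3 were only searched.
root = Node("s0", 0.1, True, [Node("s1", 0.4, True, [Node("s3", -0.2, False)]),
                              Node("s2", -0.3, False)])
```

Here `outcome_labels(root, 1.0)` yields two training pairs while `tree_labels(root)` yields four, illustrating where the extra samples come from.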

11

Lehtilä, Leo. "Implementation and characterization of Silicon detectors for studies on neutron-induced nuclear reactions." Thesis, Uppsala universitet, Tillämpad kärnfysik, 2019. http://urn.kb.se/resolve?urn=urn:nbn:se:uu:diva-389466.


Abstract:

Energy resolution characteristics of silicon surface barrier detector signals amplified by different preamplifiers and spectroscopic amplifiers have been studied. The characterization has been done using alpha particles from an 241Am source and spontaneous fission fragments from two Cf sources. The alpha and spontaneous fission activities of the sources have been measured, and the isotopic compositions, ages, and initial activities of the two Cf sources have been calculated using the results from the activity measurements. 82.3% and 82.5% of the spontaneous fission activity of the two sources, respectively, is found to originate from 252Cf. Heavy ion detection properties of two Si detector setups have been determined by measuring spontaneous fission fragments from one of the Cf sources in coincidence. The mass distribution of fission fragments is derived from the pulse spectra of the coincidence measurements. The conditions for future time resolution measurements have been established. Inquiries on commercially available ultra-thin Si detectors have been made; the purpose is to upgrade detector telescopes to lower the energy threshold of ΔE-ΔE-E identification of particles from neutron-induced nuclear reactions. Three manufacturers of Si detectors with thickness 20-25 µm and active area around 450 mm2 have been listed together with the properties of the three offered detectors.
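Deriving a source's age from activity measurements rests on the standard exponential decay law. The sketch below uses the known 252Cf half-life (about 2.645 years) but otherwise hypothetical activity values; it is not the thesis's actual procedure, which also accounts for the other Cf isotopes:

```python
import math

CF252_HALF_LIFE_Y = 2.645   # 252Cf half-life in years

def age_from_activities(a_initial, a_current, half_life=CF252_HALF_LIFE_Y):
    """Elapsed time implied by exponential decay: A(t) = A0 * 2**(-t / T_half)."""
    return half_life * math.log2(a_initial / a_current)

def remaining_fraction(t_years, half_life=CF252_HALF_LIFE_Y):
    """Fraction of the initial activity left after t_years."""
    return 2.0 ** (-t_years / half_life)

# A source whose activity has dropped to one quarter of its initial value
# is two half-lives old:
age = age_from_activities(1.0, 0.25)   # -> 2 * 2.645 = 5.29 years
```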

12

Joret, Gwenaël. "Entropy and stability in graphs." Doctoral thesis, Universite Libre de Bruxelles, 2007. http://hdl.handle.net/2013/ULB-DIPOT:oai:dipot.ulb.ac.be:2013/210605.


Abstract:

Un stable (ou ensemble indépendant) est un ensemble de sommets qui sont deux à deux non adjacents. De nombreux résultats classiques en optimisation combinatoire portent sur le nombre de stabilité (défini comme la plus grande taille d'un stable), et les stables se classent certainement parmi les structures les plus simples et fondamentales en théorie des graphes.

La thèse est divisée en deux parties, toutes deux liées à la notion de stables dans un graphe. Dans la première partie, nous étudions un problème de coloration de graphes, c'est à dire de partition en stables, où le but est de minimiser l'entropie de la partition. C'est une variante du problème classique de minimiser le nombre de couleurs utilisées. Nous considérons aussi une généralisation du problème aux couvertures d'ensembles. Ces deux problèmes sont appelés respectivement minimum entropy coloring et minimum entropy set cover, et sont motivés par diverses applications en théorie de l'information et en bioinformatique. Nous obtenons entre autres une caractérisation précise de la complexité de minimum entropy set cover :le problème peut être approximé à une constante lg e (environ 1.44) près, et il est NP-difficile de faire strictement mieux. Des résultats analogues sont prouvés concernant la complexité de minimum entropy coloring.

In the second part of the thesis, we consider graphs whose stability number increases as soon as any edge is removed. These graphs are said to be "alpha-critical" and play an important role in many areas, such as extremal graph theory and polyhedral combinatorics. On the one hand, we revisit the theory of alpha-critical graphs, giving on this occasion new, simpler proofs of some central theorems. On the other hand, we study certain facets of the linear ordering polytope that can be seen as a generalization of the notion of an alpha-critical graph. We extend many results from the theory of alpha-critical graphs to this family of facets.


Doctorat en Sciences
info:eu-repo/semantics/nonPublished

13

Stachel, Richard D. "The impact of affective computing in raising awareness of Subjective Well-Being and its influence on adherence and Quality of Life: An experience among patients suffering from Alpha-1 Antitrypsin Deficiency-Associated COPD." Thesis, Robert Morris University, 2016. http://pqdtopen.proquest.com/#viewpdf?dispub=10148302.

Full text

APA, Harvard, Vancouver, ISO, and other styles

Abstract:

Nearly half of all Americans are living with a chronic condition, and one in four have Multiple Chronic Conditions (MCC). Chronic Obstructive Pulmonary Disease (COPD) is one such chronic disorder. Affecting 15 million Americans, COPD is the third leading cause of death in the United States, and treatment for COPD costs the healthcare system in excess of $32 billion annually. One significant factor leading to the high cost of care is non-adherence to medication and lifestyle recommendations. In an effort to keep chronic-disease sufferers healthier longer, thereby ameliorating the issue of avoidable costs, public health policy leaders, researchers, healthcare systems, and clinicians are attempting to discover ways to help individuals maintain appropriate adherence levels to medication prescriptions and lifestyle recommendations. In addition, previous research found connections between positivity and improved health and health outcomes. This study investigated one potential method of helping chronic-disease sufferers stay adherent and maintain their overall health. This research studied a mobile and on-line affective-computing tool that utilized Ecological Momentary Assessment (EMA) in an effort to raise the cognitive awareness of users’ subjective well-being and positivity. The study’s objective was to determine if use of this tool led to improvement in adherence, Subjective Well-Being (SWB), positivity, and overall Quality of Life (QoL). This study used an embedded mixed methods approach and involved 96 respondents diagnosed with Alpha-1 Antitrypsin Deficiency-Associated (AATD) COPD. Alpha-1 Antitrypsin Deficiency is a rare, genetic disease affecting approximately 100,000 Americans. The most significant complications for those suffering with Alpha-1 are lung or liver diseases. This study included only those diagnosed with lung disease. Participants used the affective-computing tool over a two-month period. 
The research measured their levels of positivity and quality of life prior to their use of the system and subsequently following it. This study also measured participants’ use of the affective-computing tool including frequency of response to push messages and response times. It then compared these variables for users who engaged with the system through email as opposed to those who participated by text messaging or Short Message Service (SMS).

Results indicated a small but insignificant increase in adherence rates, as well as improved but insignificant QoL scores, between the pre- and post-test periods. However, the analyses indicated a significant increase in subjective well-being scores between the two periods. They also revealed a 91.3% average compliance rate with the study push messages over the two-month period. While the research revealed faster compliance for those using text messaging, there was no significant difference in compliance rates between those answering by text messaging and those using email.

While the results indicated that the use of an EMA-associated system designed to raise awareness of SWB is one way of improving the overall well-being and health of chronically ill individuals, they more significantly revealed areas for further study among other disease states, over longer study periods, and with larger sample sizes.

14

Rosenquist, Emil. "Hur presterar ett artificiellt neuralt nätverk gentemot sökalgoritmen alpha-beta pruning i spelet Othello? : Jämförelse av ANN system och ABP system på spelet Othello." Thesis, Högskolan i Skövde, Institutionen för informationsteknologi, 2019. http://urn.kb.se/resolve?urn=urn:nbn:se:his:diva-17011.

Full text

APA, Harvard, Vancouver, ISO, and other styles

Abstract:

Deterministic turn-based two-player games are an area used in AI research to compare AI systems. This work focuses on comparing the techniques artificial neural network and alpha-beta pruning in the game Othello. The work investigated how these techniques perform in relation to computation time. The Othello positions are represented in an 8 x 8 matrix that the techniques use to find the optimal move. The systems were rated according to a defined method using Edax, an existing AI system for Othello. They were tested on 154 Othello games with 77 predetermined starting positions. The network was trained with learning data consisting of moves from professional Othello matches and from Edax. The results showed that the rating of the ABP systems grew linearly against exponential computation time, whereas the rating of the ANN systems was constant against linear computation time. The results of the ANN systems suggest that the training data is insufficient. Future work should use more and better training data.
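The alpha-beta pruning technique benchmarked in this thesis can be sketched in a few lines. The toy game tree below is illustrative, not an Othello position; a real engine would generate children from a board state and use an evaluation function at a depth cutoff:

```python
def alphabeta(node, alpha, beta, maximizing):
    """Plain alpha-beta search over an explicit game tree.

    A node is either a number (a leaf score) or a list of child nodes.
    Remaining siblings are pruned as soon as alpha >= beta.
    """
    if isinstance(node, (int, float)):  # leaf
        return node
    if maximizing:
        value = float("-inf")
        for child in node:
            value = max(value, alphabeta(child, alpha, beta, False))
            alpha = max(alpha, value)
            if alpha >= beta:
                break  # beta cutoff
        return value
    value = float("inf")
    for child in node:
        value = min(value, alphabeta(child, alpha, beta, True))
        beta = min(beta, value)
        if alpha >= beta:
            break  # alpha cutoff
    return value

# Small three-ply tree; its minimax value is 5.
tree = [[[3, 5], [6, 9]], [[1, 2], [0, -1]]]
print(alphabeta(tree, float("-inf"), float("inf"), True))  # -> 5
```

Pruning never changes the value returned at the root; it only skips branches that cannot influence it, which is why the thesis measures it purely against computation time.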

15

Öberg, Viktor. "EVOLUTIONARY AI IN BOARD GAMES : An evaluation of the performance of an evolutionary algorithm in two perfect information board games with low branching factor." Thesis, Högskolan i Skövde, Institutionen för informationsteknologi, 2015. http://urn.kb.se/resolve?urn=urn:nbn:se:his:diva-11175.

Full text

APA, Harvard, Vancouver, ISO, and other styles

Abstract:

It is well known that the branching factor of a computer based board game has an effect on how long a searching AI algorithm takes to search through the game tree of the game. Something that is not as known is that the branching factor may have an additional effect for certain types of AI algorithms. The aim of this work is to evaluate if the win rate of an evolutionary AI algorithm is affected by the branching factor of the board game it is applied to. To do that, an experiment is performed where an evolutionary algorithm known as “Genetic Minimax” is evaluated for the two low branching factor board games Othello and Gomoku (Gomoku is also known as 5 in a row). The performance here is defined as how many times the algorithm manages to win against another algorithm. The results from this experiment showed both some promising data, and some data which could not be as easily interpreted. For the game Othello the hypothesis about this particular evolutionary algorithm appears to be valid, while for the game Gomoku the results were somewhat inconclusive. For the game Othello the performance of the genetic minimax algorithm was comparable to the alpha-beta algorithm it played against up to and including depth 4 in the game tree. After that however, the performance started to decline more and more the deeper the algorithms searched. The branching factor of the game may be an indirect cause of this behaviour, due to the fact that as the depth increases, the search space increases proportionally to the branching factor. This increase in the search space due to the increased depth, in combination with the settings used by the genetic minimax algorithm, may have been the cause of the performance decline after that point.

16

Carlsson, Matthias. "Development and Characterization of Parallel-Plate Avalanche Counters for Nuclear Physics Experiments." Thesis, Uppsala universitet, Tillämpad kärnfysik, 2018. http://urn.kb.se/resolve?urn=urn:nbn:se:uu:diva-354818.

Full text

APA, Harvard, Vancouver, ISO, and other styles

Abstract:

Parallel-plate avalanche counters, PPACs, are commonly used to detect fission fragments. The PPAC detects them and marks, very accurately, the time of detection. Such measurements can be used to determine the neutron energy (via time-of-flight) to study neutron-induced fission. This project report provides a method that, together with the discussed improvements, allows the fabrication of good-quality PPAC detectors. Several PPACs are manufactured, with electrodes built from 0.9 µm thick mylar foils evaporated with a 40-80 nm thin layer of aluminum. The developed PPACs are characterized with well-known radioactive Cf and Am sources (the source characterization is also found in this report) and compared against each other. Additionally, the PPAC signal amplitude spectra are found to follow theoretical expectations with regard to angular dependence, gas pressure, and applied electrode voltage. At a specific applied electrode voltage and a range of gas pressures (3-9 mbar), the measured time resolutions are 2.24-1.38 ns, with a trend toward finer time resolution at higher gas pressure.
Parallel-plate avalanche counters, PPACs, are often used to detect fission fragments. The PPAC detects the fragments with very good time resolution, so PPAC detectors can be used to measure neutron energies (via the time-of-flight method), which are measured to study neutron-induced fission. This project and report describe a method, with suggested improvements, that enables the fabrication of good-quality PPAC detectors. During the project, several PPACs were built with electrodes made of 0.9 µm thin mylar evaporated with 40-80 nm of aluminum. The manufactured PPAC detectors are characterized with well-known radioactive Cf and Am sources (which are also characterized in this report). The detectors are then compared against each other and found to follow theoretical expectations with regard to angle, gas pressure, and applied electrode voltage. The results of the project, which answer several earlier questions and confirm certain assumptions, advance the outlook on and understanding of how PPACs work and what researchers can achieve with them.
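The time-of-flight relation underlying the neutron-energy measurements can be sketched as follows. This is the standard non-relativistic approximation, and the flight path and timing values are illustrative, not taken from the report:

```python
# Non-relativistic neutron energy from time-of-flight: E = 0.5 * m * (L/t)^2.
NEUTRON_MASS_KG = 1.674927e-27   # CODATA neutron mass
JOULE_PER_MEV = 1.602177e-13

def neutron_energy_mev(flight_path_m, time_of_flight_s):
    """Kinetic energy (MeV) of a neutron covering a flight path in a given time."""
    v = flight_path_m / time_of_flight_s
    return 0.5 * NEUTRON_MASS_KG * v * v / JOULE_PER_MEV

# Example: a neutron covering 5 m in 200 ns (illustrative numbers):
print(round(neutron_energy_mev(5.0, 200e-9), 2))
```

The nanosecond-scale time resolutions reported for the PPACs matter precisely because the energy scales with 1/t², so timing errors propagate doubled into the energy.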

17

Curado, Manuel. "Structural Similarity: Applications to Object Recognition and Clustering." Doctoral thesis, Universidad de Alicante, 2018. http://hdl.handle.net/10045/98110.

Full text

APA, Harvard, Vancouver, ISO, and other styles

Abstract:

In this thesis, we propose several developments in the context of Structural Similarity. We address both node (local) similarity and graph (global) similarity. Concerning node similarity, we focus on improving the diffusive process used to compute this similarity (e.g. Commute Times) by modifying or rewiring the structure of the graph (Graph Densification), although some advances in Laplacian-based ranking are also included in this document. Graph Densification is a particular case of what we call graph rewiring, i.e. a novel field (similar to image processing) where input graphs are rewired to be better conditioned for subsequent pattern recognition tasks (e.g. clustering). In the thesis, we contribute a scalable and effective method driven by Dirichlet processes. We propose both a completely unsupervised and a semi-supervised approach for Dirichlet densification. We also contribute new random walkers (Return Random Walks) that are useful structural filters as well as asymmetry detectors in directed brain networks used to make early predictions of Alzheimer's disease (AD). Graph similarity is addressed by designing structural information channels as a means of measuring the Mutual Information between graphs. To this end, we first embed the graphs by means of Commute Times. Commute-times embeddings have good properties for Delaunay triangulations (the typical representation for Graph Matching in computer vision). This means that these embeddings can act as encoders in the channel as well as decoders (since they are invertible). Consequently, structural noise can be modelled by the deformation introduced in one of the manifolds to fit the other one. This methodology leads to a highly discriminative similarity measure, since the Mutual Information is measured on the manifolds (vectorial domain) through copulas and bypass entropy estimators.
This is consistent with the methodology of decoupling the measurement of graph similarity into two steps: a) linearizing the Quadratic Assignment Problem (QAP) by means of the embedding trick, and b) measuring similarity in vector spaces. The QAP is also investigated in this thesis. More precisely, we analyze the behaviour of $m$-best Graph Matching methods. These methods usually start from a couple of best solutions and then locally expand the search space by excluding previously clamped variables. The next variable to clamp is usually selected randomly, but we show that this reduces performance when structural noise arises (outliers). Alternatively, we propose several heuristics for spanning the search space and evaluate all of them, showing that they are usually better than random selection. These heuristics are particularly interesting because they exploit the structure of the affinity matrix. Efficiency is improved as well. Concerning the application domains explored in this thesis, we focus on object recognition (graph similarity), clustering (rewiring), compression/decompression of graphs (links with Extremal Graph Theory), 3D shape simplification (sparsification), and early prediction of AD.
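The commute-time quantities the thesis builds its embeddings on can be computed directly from the graph Laplacian. A minimal sketch (ours, on a toy path graph, not the thesis's code) using the standard identity C(u,v) = vol(G) · (L⁺ᵤᵤ + L⁺ᵥᵥ − 2L⁺ᵤᵥ), where L⁺ is the Moore-Penrose pseudoinverse of the Laplacian:

```python
import numpy as np

def commute_times(adjacency):
    """Matrix of commute-time distances for an undirected graph.

    C(u, v) = vol(G) * (L+_uu + L+_vv - 2 * L+_uv), with L+ the
    pseudoinverse of the combinatorial Laplacian L = D - A.
    """
    A = np.asarray(adjacency, dtype=float)
    degrees = A.sum(axis=1)
    L = np.diag(degrees) - A
    Lp = np.linalg.pinv(L)
    vol = degrees.sum()
    d = np.diag(Lp)
    return vol * (d[:, None] + d[None, :] - 2 * Lp)

# Path graph 0-1-2: commuting between the endpoints takes longest.
A = [[0, 1, 0], [1, 0, 1], [0, 1, 0]]
C = commute_times(A)
print(np.round(C, 3))
```

Since commute time equals the graph volume times effective resistance, the endpoint pair of the path (two edges in series) comes out at exactly twice the adjacent-pair value, which is a quick sanity check on any implementation.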
Ministerio de Economía, Industria y Competitividad (Referencia TIN2012-32839 BES-2013-064482)

18

Wang, Chien-Chu, and 王千竹. "Distributed Information Fusion via Federated Alpha-Beta-Gamma Filter." Thesis, 2008. http://ndltd.ncl.edu.tw/handle/06062389336328517990.

Full text

APA, Harvard, Vancouver, ISO, and other styles

Abstract:

Master's thesis
Yu Da College of Business and Technology
Graduate Institute of Information Management
Academic year 96 (ROC era)
A federated alpha-beta-gamma filter is developed for use in multi-sensor systems tracking a maneuvering target over a certain area. A filter architecture consisting of sensors, local processors, and a global processor is employed to describe the distributed fusion problem. The sensor filtering algorithm, formulated in the Reference Cartesian Coordinate System, is presented for target tracking when the sensor measures range, bearing, and elevation angle in the Spherical Coordinate System. Each local processor uses a decoupling technique to develop the tracking index, obtaining the alpha-beta-gamma filter gain and the corresponding covariance formulations, which are recursively computed in the Line-of-Sight Cartesian Coordinate System and then transformed for use in the Reference Cartesian Coordinate System. Common process-noise correlations are handled by a factor selected as a conservative matrix upper bound. The global processor combines local processor outputs via a weighted least squares estimator. The resulting filter has a computational advantage over the traditional maximum likelihood estimator. The results of computer simulations are presented to compare the performance of the proposed filter, the traditional maximum likelihood estimator, and the covariance matching method. Compared with the sensor-level reference values, the Averaged Root Mean Square Errors (ARMSE) of position, velocity, and acceleration were found to improve by about 77.08%, 60.10%, and 32.31%, respectively. Also, the performance indexes of position, velocity, and acceleration were found to be larger with the local processor than with the global processor, by about 47.91%, 30.33%, and 12.71%, respectively.
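A one-dimensional version of the alpha-beta-gamma recursion can be sketched as follows. This is the generic fixed-gain (g-h-k) form, not the coupled multi-sensor, multi-coordinate filter of the thesis, and the gains and measurements are illustrative:

```python
def abg_filter(measurements, dt, alpha, beta, gamma):
    """One-dimensional alpha-beta-gamma (g-h-k) tracking filter.

    Predicts position/velocity/acceleration with a constant-acceleration
    model, then corrects each state by a fixed gain times the residual.
    """
    x = v = a = 0.0
    estimates = []
    for z in measurements:
        # Predict one step ahead under constant acceleration.
        x_pred = x + v * dt + 0.5 * a * dt * dt
        v_pred = v + a * dt
        # Correct with the measurement residual.
        r = z - x_pred
        x = x_pred + alpha * r
        v = v_pred + (beta / dt) * r
        a = a + (2.0 * gamma / (dt * dt)) * r
        estimates.append(x)
    return estimates

# Track a target moving at a constant 2 m/s, sampled once per second:
zs = [2.0 * t for t in range(1, 11)]
est = abg_filter(zs, dt=1.0, alpha=0.5, beta=0.4, gamma=0.1)
print(est[-1])
```

In the federated architecture described above, each local processor runs a recursion of this kind (with gains derived from the tracking index), and the global processor fuses the local outputs.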

19

Pan, Yen-Shao, and 潘彥卲. "Gaining alpha from index market by using the implied information of stock options." Thesis, 2016. http://ndltd.ncl.edu.tw/handle/07334526595928653252.

Full text

APA, Harvard, Vancouver, ISO, and other styles

Abstract:

Master's thesis
National Taiwan University
Graduate Institute of Finance
Academic year 104 (ROC era)
This article presents a way to obtain indicators that extract useful information, via an intuitive function, to predict the index market. We use these indicators to construct strategies that earn good returns. Because positions change infrequently, our strategies enjoy low transaction costs. The higher profitability and lower ruin risk make large stock positions safer for insurers than before, so we expect insurers to be interested in our strategies.

20

Lee, Pei-sang, and 李佩桑. "A Multi-Factor Alpha Model Constructed Using Multi-lag-period Information— with Application in the Taiwan Market." Thesis, 2013. http://ndltd.ncl.edu.tw/handle/jw93dz.

Full text

APA, Harvard, Vancouver, ISO, and other styles

Abstract:

Master's thesis
National Sun Yat-sen University
Department of Finance
Academic year 101 (ROC era)
The main objective of this study is to generate values by combining the current and prior values of descriptors to improve the performance of a portfolio constructed based on the standard alpha model of Hsu et al. (2011). The Polynomial Distributed Lag Model, a time-series model, is adopted to detect the optimal lag length of each company in our research. After measuring an "adequate" lag length for each descriptor, we use the approach of exponential smoothing to combine the current and multi-lag-period descriptors. Instead of using the subjective method applied in Hsu et al. (2011), our study calculates some statistics to filter the valid descriptors. The empirical results suggest that the new values of the monthly and weekly frequency descriptors should substitute for the original ones, especially those within the Value factor. When compared with a portfolio constructed using the raw descriptor values, the IR of the portfolio with the new values of monthly and weekly descriptors increases from 0.203 to 0.612. Although its tracking error rises slightly, by 0.28%, this portfolio still meets the requirement of an enhanced index fund, namely a tracking error below 3%.
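The exponential-smoothing combination of current and lagged descriptor values can be sketched as follows. The decay constant and the book-to-price numbers are illustrative; the thesis's exact weighting scheme, fitted per descriptor from the detected lag length, may differ:

```python
def smooth_descriptor(history, decay):
    """Exponentially weighted combination of a descriptor's history.

    `history` lists values newest first; weights decay geometrically
    and are renormalized so the result is a proper weighted average.
    """
    weights = [(1 - decay) * decay ** k for k in range(len(history))]
    total = sum(weights)
    return sum(w * v for w, v in zip(weights, history)) / total

# A descriptor observed this month and over the two prior months:
print(smooth_descriptor([0.8, 0.6, 0.5], decay=0.5))
```

The optimal length of `history` is exactly what the Polynomial Distributed Lag Model step is estimating; the smoothing then collapses those lags into a single combined value per descriptor.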

21

Yang, Jen-Hung, and 楊仁宏. "Perspectives from Different Levels of Management Teams on Business and Information Systems Strategies Alignment – a Case of Alpha Networks Inc." Thesis, 2007. http://ndltd.ncl.edu.tw/handle/13176005472686671754.

Full text

APA, Harvard, Vancouver, ISO, and other styles

Abstract:

Master's thesis
National Chiao Tung University
Information Management Group, Executive Master's Program, College of Management
Academic year 95 (ROC era)
When formulating their business strategies, companies are influenced by both internal and external environments. As a result of many significant advances and changes in the field of information technology and networking, the role of an enterprise's information system has changed from being a traditional computing tool to an important factor that influences the enterprise's strategic development. Therefore, the strategic alignment between an organization's business strategy and its information systems is an important topic. Although strategic alignment has been studied extensively for more than two decades, most research has focused on the strategic alignment between business strategy and information systems (Hambrick, 1981; Sabherwal and Chan, 1997; Luftman, 1993). For example, in 1993, IBM Systems Journal published a special series on the strategic alignment between business strategy and information systems. By comparison, few works have studied the relationship between business strategy and IS alignment from the perspective of different levels of management. This thesis takes Alpha Networks Inc as the subject of a case study. We use questionnaires and interviews to analyze the perspectives of thirty-three managers from different departments on the alignment between the company's business strategy and its IS strategy. The findings show that the higher an executive's level, the greater the alignment perceived between the company's business and IS strategies.

22

Narang, Pooja. "Computational pathway for bracketing native like tertiary structures from sequence and secondary structural information of small alpha helical globular proteins." Thesis, 2005. http://localhost:8080/iit/handle/2074/3641.

Full text

APA, Harvard, Vancouver, ISO, and other styles

23

Fry, John, J.-P. Serbera, and R.J.Wilson. "Managing performance expectations in association football." 2021. http://hdl.handle.net/10454/18586.

Full text

APA, Harvard, Vancouver, ISO, and other styles

Abstract:

Motivated by excessive managerial pressure and sackings, together with associated questions over the inefficient use of scarce resources, we explore realistic performance expectations in association football. Our aim is to improve management quality by accounting for information asymmetry. Results highlight uncertainty caused both by football's low-scoring nature and the intensity of the competition. At a deeper level we show that fans and journalists are prone to under-estimate uncertainties associated with individual matches. Further, we quantify reasonable expectations in the face of unevenly distributed resources. In line with the statactivist approach we call for more rounded assessments to be made once the underlying uncertainties are adequately accounted for. Managing fan expectations is probably impossible though the potential for constructive dialogue remains.
The full-text of this article will be released for public view at the end of the publisher embargo on 5th Jan 2023.

24

Paradinovic, Ivana. "Comparison of the performance of Islamic, Sri and green mutual funds." Master's thesis, 2017. http://hdl.handle.net/10362/25469.

Full text

APA, Harvard, Vancouver, ISO, and other styles

Abstract:

This paper measures and compares the performance of Islamic, SRI, and Green mutual equity funds worldwide over the period from 1 January 2001 to 31 December 2015. The sample consists of 611 mutual equity funds, and their performance was assessed using traditional risk-adjusted measures, namely the Sharpe ratio, Modified Sharpe ratio, Adjusted Sharpe ratio, Treynor measure, Information ratio, and Jensen's alpha. The main findings show that Green mutual equity funds, on average, outperform both SRI and Islamic mutual equity funds over the entire observed period. SRI and Islamic mutual equity funds show similar performance, with a slight outperformance of SRI mutual funds for the majority of measures. Omitting the observations from the 2007-2008 financial crisis and the 2001 dot-com crisis from the sample period reduces the differences in performance between SRI and Islamic mutual equity funds; the Green mutual equity funds still remain the best performing. While the differences are economically significant, as presented in this thesis, they are not statistically significant, as can be seen from the t-test results.
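Two of the risk-adjusted measures used here can be sketched directly. These are simplified per-period versions with illustrative return series; the thesis may use annualized or otherwise adjusted variants:

```python
from statistics import mean, pstdev

def sharpe_ratio(returns, risk_free):
    """Mean excess return per unit of excess-return volatility."""
    excess = [r - risk_free for r in returns]
    return mean(excess) / pstdev(excess)

def jensens_alpha(returns, market_returns, risk_free, beta):
    """Average return above the CAPM-predicted return for the fund's beta."""
    expected = risk_free + beta * (mean(market_returns) - risk_free)
    return mean(returns) - expected

# Illustrative monthly returns for a fund and its benchmark:
fund = [0.04, 0.01, 0.03, 0.02]
market = [0.03, 0.00, 0.02, 0.01]
print(sharpe_ratio(fund, 0.01))
print(jensens_alpha(fund, market, 0.01, beta=1.0))
```

With a beta of 1, Jensen's alpha reduces to the fund's average return minus the market's, which makes the toy numbers easy to check by hand.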

You might also be interested in the bibliographies on the topic 'Alpha-information' for other source types:

Journal articles
