Journal articles on the topic 'Large-Page'


Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles


Consult the top 50 journal articles for your research on the topic 'Large-Page.'

Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate a bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever it is available in the metadata.

Browse journal articles in a wide variety of disciplines and organise your bibliography correctly.

1

Kessler, R. E., and Mark D. Hill. "Page placement algorithms for large real-indexed caches." ACM Transactions on Computer Systems 10, no. 4 (November 1992): 338–59. http://dx.doi.org/10.1145/138873.138876.

2

Spencer, S., and L. Cram. "Large-scale magnetic fields in spiral galaxies [page missing]." Australian Journal of Physics 46, no. 1 (1993): 195. http://dx.doi.org/10.1071/ph930195.

3

Cha, G. H. "The Segment-Page Indexing Method for Large Multidimensional Queries." Computer Journal 48, no. 1 (January 1, 2005): 101–14. http://dx.doi.org/10.1093/comjnl/bxh057.

4

Matera, Robert, Katalin V. Horvath, Hari Nair, Ernst J. Schaefer, and Bela F. Asztalos. "HDL Particle Measurement: Comparison of 5 Methods." Clinical Chemistry 64, no. 3 (March 1, 2018): 492–500. http://dx.doi.org/10.1373/clinchem.2017.277632.

Abstract:
Abstract BACKGROUND HDL cell cholesterol efflux capacity has been documented as superior to HDL cholesterol (HDL-C) in predicting cardiovascular disease risk. HDL functions relate to its composition. Compositional assays are easier to perform and standardize than functional tests and are more practical for routine testing. Our goal was to compare measurements of HDL particles by 5 different separation methods. METHODS HDL subfractions were measured in 98 samples using vertical auto profiling (VAP), ion mobility (IM), nuclear magnetic resonance (NMR), native 2-dimensional gel electrophoresis (2D-PAGE), and pre-β1-ELISA. VAP measured cholesterol in large HDL2 and small HDL3; IM measured particle number directly in large, intermediate, and small HDL particles; NMR measured lipid signals in large, medium, and small HDL; 2D-PAGE measured apolipoprotein (apo) A-I in large (α1), medium (α2), small (α3–4), and pre-β1 HDL particles; and ELISA measured apoA-I in pre-β1-HDL. The data were normalized and compared using Passing–Bablok, Lin concordance, and Bland–Altman plot analyses. RESULTS With decreasing HDL-C concentration, NMR measured a gradually lower percentage of large HDL, compared with IM, VAP, and 2D-PAGE. In the lowest HDL-C tertile, NMR measured 8% of large HDL, compared with IM, 22%; VAP, 20%; and 2D-PAGE, 18%. There was strong discordance between 2D-PAGE and NMR in measuring medium HDL (R2 = 0.356; rc = 0.042) and small HDL (R2 = 0.376; rc = 0.040). The 2D-PAGE assay measured a significantly higher apoA-I concentration in pre-β1-HDL than the pre-β1-ELISA (9.8 vs 1.6 mg/dL; R2 = 0.246; rc = 0.130). CONCLUSIONS NMR agreed poorly with the other methods in measuring large HDL, particularly in low HDL-C individuals. Similarly, there was strong discordance in pre-β1-HDL measurements between the ELISA and 2D-PAGE assays.
5

田, 郸郸. "Large Scale Web Page Classification Algorithm Based on Spectral Hashing." Software Engineering and Applications 05, no. 01 (2016): 65–74. http://dx.doi.org/10.12677/sea.2016.51008.

6

Lei, Xiang Xin, Shun Liang Cao, Shao Yin Huang, and Jian Guo Yang. "PTList: Mining XML Data Stream Using Paging Schema." Advanced Materials Research 403-408 (November 2011): 1888–91. http://dx.doi.org/10.4028/www.scientific.net/amr.403-408.1888.

Abstract:
To handle continuously growing XML data streams and large XML documents, we present PTList, a method for mining frequent subtrees from XML using a paging schema. PTList pages the XML data stream, manages cross-page nodes and frequent candidate subtrees that grow across pages, mines frequent subtrees page by page, selects frequent subtrees according to the page minimum support, and prunes branches based on a decaying factor. PTList mines XML data streams within a bounded support error, improves memory utilization, and speeds up the mining process.
7

Kinoshita, Eiji, Emiko Kinoshita-Kikuta, and Tohru Koike. "Separation and detection of large phosphoproteins using Phos-tag SDS-PAGE." Nature Protocols 4, no. 10 (September 24, 2009): 1513–21. http://dx.doi.org/10.1038/nprot.2009.154.

8

Pokharel, Bhesh Raj. "Large Hadron Collider: The Quest of 'God Particles'." Himalayan Physics 1 (July 28, 2011): 96. http://dx.doi.org/10.3126/hj.v1i0.5192.

9

Yoshizawa, Y., and T. Arai. "Adaptive storage control for page frame supply in large scale computer systems." ACM SIGMETRICS Performance Evaluation Review 16, no. 1 (May 1988): 235–43. http://dx.doi.org/10.1145/1007771.55622.

10

Murakami, Ken-Ichiro, Takaaki Matsumoto, Masahiro Kurata, and Masayoshi Nakao. "A scheduler based on the demand page-stealing for large-scale programs." Systems and Computers in Japan 17, no. 8 (1986): 31–40. http://dx.doi.org/10.1002/scj.4690170804.

11

OSKIN, MARK, DIANA KEEN, JUSTIN HENSLEY, LUCIAN-VLAD LITA, and FREDERIC T. CHONG. "OPERATING SYSTEMS TECHNIQUES FOR PARALLEL COMPUTATION IN INTELLIGENT MEMORY." Parallel Processing Letters 12, no. 03n04 (September 2002): 311–26. http://dx.doi.org/10.1142/s0129626402001014.

Abstract:
Advances in DRAM density have led to several proposals to perform computation in memory [1] [2] [3]. Active Pages is a page-based model of intelligent memory that can exploit large amounts of parallel computation in data-intensive applications. With a simple VLIW processor embedded near each page on DRAM, Active Page memory systems achieve up to 1000X speedups over conventional memory systems [4]. Active Pages are specifically designed to support virtualized hardware resources. In this study, we examine operating system techniques that allow Active Page memories to share, or multiplex, embedded VLIW processors across multiple physical Active Pages. We explore the trade-off between individual page-processor performance and page-level multiplexing. We find that hardware costs of computational logic can be reduced from 31% of DRAM chip area to 12%, through multiplexing, without significant loss in performance. Furthermore, manufacturing defects that disable up to 50% of the page processors can be tolerated through efficient resource allocation and associative multiplexing.
12

Bai, Xiao Jun, and Fu Dan Wu. "Research on Web Page Staticize Technology in E-Commerce System." Applied Mechanics and Materials 427-429 (September 2013): 2179–83. http://dx.doi.org/10.4028/www.scientific.net/amm.427-429.2179.

Abstract:
Dynamic web page technology is widely used in web development, but in a large e-commerce system, frequently accessing the database and generating pages dynamically places a heavy workload on the server and can even bring it down. In this paper, the author introduces web page staticization techniques: based on the design of an e-shop system, the paper puts forward principles, strategies, and methods for staticizing dynamic pages and verifies the effectiveness of this solution by experiments.
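As a toy illustration of the staticizing idea (my own minimal sketch, not the paper's implementation; the directory name, product fields, and rendering function are hypothetical), a dynamic product page is rendered once from data, written to disk as plain HTML, and rebuilt only when the underlying record changes:

```python
import os
import time

STATIC_DIR = "static_pages"  # hypothetical output directory

def render_product_page(product):
    # A real system would use a template engine; a plain string stands in here.
    return (
        "<html><body>"
        f"<h1>{product['name']}</h1>"
        f"<p>Price: {product['price']}</p>"
        "</body></html>"
    )

def staticize(product):
    """Render the dynamic page once and save it as a static HTML file."""
    os.makedirs(STATIC_DIR, exist_ok=True)
    path = os.path.join(STATIC_DIR, f"product_{product['id']}.html")
    with open(path, "w", encoding="utf-8") as f:
        f.write(render_product_page(product))
    return path

def get_page(product, updated_at):
    """Serve the cached static file unless the product changed after it was built."""
    path = os.path.join(STATIC_DIR, f"product_{product['id']}.html")
    if not os.path.exists(path) or os.path.getmtime(path) < updated_at:
        path = staticize(product)          # rebuild only when stale
    with open(path, encoding="utf-8") as f:
        return f.read()

if __name__ == "__main__":
    item = {"id": 1, "name": "Widget", "price": 9.99}
    print(get_page(item, updated_at=time.time() - 60))  # built on first call, cached afterwards
```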
13

Moyer, Margaret B., and John C. Moyer. "Ensuring That Practice Makes Perfect: Implications for Children with Learning Disabilities." Arithmetic Teacher 33, no. 1 (September 1985): 40–42. http://dx.doi.org/10.5951/at.33.1.0040.

Abstract:
Many teachers use the adage “practice makes perfect” to justify large quantities of drill-and-practice activities. Textbook authors use similar reasoning when they provide page after page of the same kind of problems. In the process, the quantitative aspect of practice is overemphasized while the quality of the practice activities suffers.
14

Arase, Yuki, Takahiro Hara, Toshiaki Uemukai, and Shojiro Nishio. "Annotation and Auto-Scrolling for Web Page Overview in Mobile Web Browsing." International Journal of Handheld Computing Research 1, no. 4 (October 2010): 63–80. http://dx.doi.org/10.4018/jhcr.2010100104.

Abstract:
Due to advances in mobile phones, mobile Web browsing has become increasingly popular. However, the small screens and poor input capabilities of mobile phones prevent users from comfortably browsing Web pages designed for desktop PCs. One serious problem in mobile Web browsing is that users often get lost in a Web page: they can view only a small portion of the page at a time and cannot grasp the overall page structure to decide in which direction the content they are interested in is located. An effective technique for solving this problem is to present an overview of the page. Prior studies adopted the conventional style of overview, a scaled-down image of the page, but this is not sufficient because users cannot see the details of the contents. Therefore, in this paper, the authors present annotations on a Web page together with a function that automatically scrolls the page. Results of a user experiment show that the annotations are informative for users who want to find content in a large Web page.
15

Cardenas, Benjamin T., Gary Kocurek, David Mohrig, Travis Swanson, Cory M. Hughes, and Sarah C. Brothers. "Preservation of Autogenic Processes and Allogenic Forcings in Set-Scale Aeolian Architecture II: The Scour-and-Fill Dominated Jurassic Page Sandstone, Arizona, U.S.A." Journal of Sedimentary Research 89, no. 8 (August 27, 2019): 741–60. http://dx.doi.org/10.2110/jsr.2019.41.

Abstract:
Abstract The stratigraphic architecture of aeolian sandstones is thought to record signals originating from both autogenic dune behavior and allogenic environmental boundary conditions within which the dune field evolves. Mapping of outcrop-scale surfaces and sets of cross-strata between these surfaces for the Jurassic Page Sandstone near Page, Arizona, USA, demonstrates that the stratigraphic signature of autogenic behavior is captured by variable scour depths and subsequent fillings, whereas the dominant signatures of allogenic boundary conditions are associated with antecedent surface topography and variable water-table elevations. At the study area, the Page Sandstone ranges from 55 to 65 m thick and is separated from the underlying Navajo Sandstone by the J-2 regional unconformity with meters of relief. Thin, climbing sets of cross-strata of the basal Page representing early dune-field accumulations fill J-2 depressions. In contrast, the overlying lower and middle Page consist of cross-strata ranging from less than 1 to 15 meters thick (average 2.44 m), and packaged between outcrop-scale bounding surfaces, though parts of the lower Page are bounded from beneath by the J-2. These bounding surfaces have been previously correlated to highstand deposits of the adjacent Carmel sea and at this site possess up to 13 meters of erosional relief produced by dune scour. Notably absent in packages of cross-strata bounded by these outcrop-scale surfaces are strata of early dune-field accumulations, any interdune deposits, and climbing-dune strata. Instead, these packages preserve a scour-and-fill architecture created by large dunes migrating in a dry, mature, dune field undergoing negligible bed aggradation. Any record of early phases of dune-field construction for the lower and middle Page are interpreted to have been cannibalized by the deepest scours of later, large dunes. Interpretations are independently supported by the relatively large coefficients of variation (cv) in middle Page set thicknesses (cv = 0.90), which are consistent with set production by successive deepest trough scours, the relatively low coefficient of variation for the depression-filling basal Page and lower Page sets consistent with a significant component of bed aggradation in J-2 depressions (cv = 0.64 and 0.49), and the fit of set thickness distributions to established theory. Numerical modeling presented here and more completely in the companion paper demonstrates how this cannibalization of early-phase stratigraphy is an expected outcome of autogenic dune-growth processes, and that early-phase strata can be preserved within antecedent depressions. Relative rise of the inland water table from basin subsidence and changing Carmel sea level forced preservation of 5–6 stacked packages composed of scour-and-fill architecture. Without these allogenic forcings, the Page would be little more than an erosional surface.
16

Dai, Xin Ye, Xiang Ji, Lei Tang, Hai Yan Wang, and Qun Cui. "Adsorption Drying Characteristics of Angelica dahurica and Mathematical Modeling on Large Scale." Advanced Materials Research 807-809 (September 2013): 1960–63. http://dx.doi.org/10.4028/www.scientific.net/amr.807-809.1960.

Abstract:
The low-temperature adsorption drying characteristics of thin-layer Angelica dahurica were evaluated in a large-scale dryer. The effects of drying-medium temperature, relative humidity of the drying medium, and wind velocity on the drying characteristics were investigated. The Page model, fitted by a nonlinear regression method, described the drying characteristics of Angelica dahurica well under different drying conditions. The work revealed the correlation between moisture content and drying process parameters. The drying coefficient and exponent in the Page model can be expressed as functions of the temperature, relative humidity, and wind velocity of the drying air.
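For context, the Page thin-layer drying model mentioned in this abstract is usually written in the following standard form (general drying-literature notation, not taken from this paper), where MR is the moisture ratio, M_t, M_0 and M_e are the moisture contents at time t, initially and at equilibrium, k is the drying coefficient and n is the Page exponent:

```latex
\mathrm{MR} = \frac{M_t - M_e}{M_0 - M_e} = \exp\!\left(-k\,t^{\,n}\right)
```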
17

Jilin, Chen, Zhao Min, Guo Zhonghua, Qiu Weijiang, Chen Yong, and Wang Weixi. "The Application of Douglas-Peucker Algorithm in Collaborative System for Power Grid Operation Mode Calculation." MATEC Web of Conferences 175 (2018): 03041. http://dx.doi.org/10.1051/matecconf/201817503041.

Abstract:
The collaborative system for power grid operation mode calculation implements data management for dispatching departments at multiple levels operating in parallel in different places, together with joint operation mode calculation based on the calculated power grid operation mode data. Transient stability analysis monitors component electrical variation curves, which are transmitted over the network to a web page. When the number of users and curves grows large enough, the overall simulation time becomes long and the curves stall because of limited transmission bandwidth and the refresh rate of the web page. In this paper, the Douglas-Peucker algorithm is used in the system to reduce the delay caused by transmitting large amounts of data: the volume of transmitted data is reduced, the overall simulation time for displaying curves on the web page is decreased, and the curves refresh smoothly.
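For reference, a minimal sketch of the classic Douglas-Peucker polyline simplification in its textbook form (not the paper's code; the sample curve and tolerance are made up): points whose perpendicular distance from the chord of a segment stays below a tolerance are discarded recursively.

```python
import math

def _perp_distance(p, a, b):
    """Perpendicular distance from point p to the line through a and b."""
    (px, py), (ax, ay), (bx, by) = p, a, b
    dx, dy = bx - ax, by - ay
    if dx == 0 and dy == 0:                # degenerate segment
        return math.hypot(px - ax, py - ay)
    return abs(dy * px - dx * py + bx * ay - by * ax) / math.hypot(dx, dy)

def douglas_peucker(points, epsilon):
    """Simplify a polyline, keeping points farther than epsilon from the chord."""
    if len(points) < 3:
        return list(points)
    # Find the interior point farthest from the chord joining the endpoints.
    idx, dmax = 0, 0.0
    for i in range(1, len(points) - 1):
        d = _perp_distance(points[i], points[0], points[-1])
        if d > dmax:
            idx, dmax = i, d
    if dmax <= epsilon:                    # everything is close: keep only the endpoints
        return [points[0], points[-1]]
    left = douglas_peucker(points[: idx + 1], epsilon)
    right = douglas_peucker(points[idx:], epsilon)
    return left[:-1] + right               # drop the duplicated split point

if __name__ == "__main__":
    curve = [(0, 0), (1, 0.1), (2, -0.1), (3, 5), (4, 6), (5, 7), (6, 8.1), (7, 9)]
    print(douglas_peucker(curve, epsilon=1.0))
```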
18

Chung, Tae-Sun, Dong-Joo Park, and Jongik Kim. "An Efficient Flash Translation Layer for Large Block NAND Flash Devices." Journal of Circuits, Systems and Computers 24, no. 09 (August 27, 2015): 1550138. http://dx.doi.org/10.1142/s0218126615501388.

Abstract:
Recently, flash memory is widely used as a non-volatile storage for embedded applications such as smart phones, MP3 players, digital cameras and so on. The software layer called flash translation layer (FTL) becomes more important since it is a key factor in the overall flash memory system performance. Many researchers have proposed FTL algorithms for small block flash memory in which the size of a physical page of flash memory is equivalent to the size of a data sector of the file system. However, major flash vendors have now produced large block flash memory in which the size of a physical page is larger than the file system's data sector size. Since large block flash memory has new features, designing FTL algorithms specialized to large block flash memory is a challenging issue. In this paper, we provide an efficient FTL named LSTAFF* for large block flash memory. LSTAFF* is designed to achieve better performance by using characteristics of large block flash memory and to provide safety by abiding by restrictions of large block flash memory. Experimental results show that LSTAFF* outperforms existing algorithms on a large block flash memory.
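As a rough, hypothetical illustration of why mapping granularity matters in large-block flash (a generic page-mapping toy, not LSTAFF* and not any vendor's FTL): several file-system sectors share one physical page, writes go out of place to a fresh page, and the logical-to-physical map is updated.

```python
SECTORS_PER_PAGE = 4        # large-block flash: one physical page holds several FS sectors
PAGES_PER_BLOCK = 64

class ToyPageMappingFTL:
    """Toy page-mapping FTL: logical sector -> (logical page, offset), out-of-place writes."""
    def __init__(self, num_blocks):
        self.map = {}                       # logical page number -> physical page number
        self.pages = {}                     # physical page number -> list of sector payloads
        self.next_free = 0
        self.capacity = num_blocks * PAGES_PER_BLOCK

    def write_sector(self, lsn, data):
        lpn, offset = divmod(lsn, SECTORS_PER_PAGE)
        old = self.pages.get(self.map.get(lpn), [None] * SECTORS_PER_PAGE)
        new_page = list(old)                # read-modify-write of the whole page
        new_page[offset] = data
        if self.next_free >= self.capacity:
            raise RuntimeError("toy model: garbage collection not implemented")
        ppn = self.next_free                # flash pages cannot be overwritten in place
        self.next_free += 1
        self.pages[ppn] = new_page
        self.map[lpn] = ppn                 # remap the logical page to the fresh physical page

    def read_sector(self, lsn):
        lpn, offset = divmod(lsn, SECTORS_PER_PAGE)
        return self.pages[self.map[lpn]][offset]

ftl = ToyPageMappingFTL(num_blocks=2)
ftl.write_sector(0, b"a"); ftl.write_sector(1, b"b"); ftl.write_sector(0, b"A")
print(ftl.read_sector(0), ftl.read_sector(1))   # b'A' b'b'
```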
19

Chen, Meng, Cao Wei Chen, Long Shan Wang, Ya Ping Lu, and Lei Zhang. "Research on Remote Optimization System for Key Parts of Large Scale Plastic Mold." Advanced Materials Research 462 (February 2012): 904–8. http://dx.doi.org/10.4028/www.scientific.net/amr.462.904.

Abstract:
A remote finite element optimization system for key parts of large-scale plastic molds is described. The functions of the system are realized using ASP.NET, a SQL Server database, and ANSYS optimum design. The geometry, material, load, and optimization parameters are entered on a web page. The system can query material properties of the mold, such as modulus of elasticity and Poisson's ratio, from a material parameters query system. The key parts of the large-scale plastic mold are modeled parametrically with the ANSYS Parametric Design Language (APDL), and the parametric modeling code is encapsulated on the server. The remote server reads these parameters, builds the parametric model, and runs the optimization, and the optimum result can be browsed on the web page. The system lowers the amount of ANSYS knowledge the designer needs.
20

Liu, Wen Tao. "Web Page Data Collection Based on Multithread." Applied Mechanics and Materials 347-350 (August 2013): 2575–79. http://dx.doi.org/10.4028/www.scientific.net/amm.347-350.2575.

Abstract:
Web data collection is the process in which a crawler collects the semi-structured, large-scale, and redundant data on the web, including web content, web structure, and web usage; it is often used for information extraction, information retrieval, search engines, and web data mining. In this paper, the principle of web data collection is introduced and related topics are discussed, such as page download, character encoding, update strategy, and static versus dynamic pages. Multithreading technology is described and a multithread mode for web data collection is proposed. Web data collection with multiple threads achieves better resource utilization, better average response time, and better overall performance.
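A minimal sketch of the multithreaded collection mode described here, using only the Python standard library (a generic illustration, not the paper's system; the URL list is a placeholder): a thread pool downloads several pages concurrently.

```python
from concurrent.futures import ThreadPoolExecutor, as_completed
from urllib.request import urlopen

URLS = [
    "https://example.com/",
    "https://example.org/",
    "https://example.net/",
]  # placeholder URLs

def fetch(url, timeout=10):
    """Download one page and return (url, size in bytes)."""
    with urlopen(url, timeout=timeout) as resp:
        body = resp.read()
    return url, len(body)

def collect(urls, workers=4):
    """Fetch pages concurrently; I/O-bound downloads benefit from threads."""
    results = []
    with ThreadPoolExecutor(max_workers=workers) as pool:
        futures = {pool.submit(fetch, u): u for u in urls}
        for fut in as_completed(futures):
            try:
                results.append(fut.result())
            except Exception as exc:        # failed downloads are recorded, not fatal
                results.append((futures[fut], exc))
    return results

if __name__ == "__main__":
    for url, outcome in collect(URLS):
        print(url, outcome)
```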
21

Wishart, M. J., G. Groblewski, B. J. Goke, A. C. Wagner, and J. A. Williams. "Secretagogue regulation of pancreatic acinar cell protein phosphorylation shown by large-scale 2D-PAGE." American Journal of Physiology-Gastrointestinal and Liver Physiology 267, no. 4 (October 1, 1994): G676—G686. http://dx.doi.org/10.1152/ajpgi.1994.267.4.g676.

Abstract:
High-resolution large-scale two-dimensional polyacrylamide gel electrophoresis (2D-PAGE) combined with computer-assisted image analysis was used to construct a database of secretagogue/second messenger-induced phosphoprotein modifications in intact rat pancreatic acinar cells. Isolated acini were labeled with 32Pi, exposed to hormones and other test agents, and subjected to large-scale 2D-PAGE and autoradiography. This procedure resolved 500 phosphoproteins in pancreatic acinar whole cell lysates, approximately 90% of which were localized in the soluble fraction of centrifuged samples. Soluble proteins were further characterized as to heat and acid stability. Cholecystokinin (CCK), carbachol, and bombesin altered the phosphorylation state of about 27 proteins with both increases and decreases observed. Subsets of proteins were phosphorylated in response to phorbol ester 12-O-tetradecanoylphorbol 13-acetate (TPA), calcium ionophore A-23187, and adenosine 3',5'-cyclic monophosphate (cAMP) analogue 8-bromo-cAMP. One of these proteins was identified as the myristoylated, alanine-rich, C-kinase substrate (MARCKS) protein by immunoprecipitation. The time course and dose response of phosphorylation changes due to CCK showed considerable variation between proteins, although a temporal hierarchy of phosphorylation events was clearly exhibited. Particularly striking was the rapid dephosphorylation within 30 s of a 19-kDa soluble protein to a minimum of 20 +/- 1% of control. Increased phosphorylation of the MARCKS and other TPA-regulated proteins suggests that CCK, carbachol, bombesin, and the CCK partial agonist, JMV-180, all activate protein kinase C in intact acini.
22

Van Klinken, B. J., J. Dekker, H. A. Buller, C. de Bolos, and A. W. Einerhand. "Biosynthesis of mucins (MUC2-6) along the longitudinal axis of the human gastrointestinal tract." American Journal of Physiology-Gastrointestinal and Liver Physiology 273, no. 2 (August 1, 1997): G296—G302. http://dx.doi.org/10.1152/ajpgi.1997.273.2.g296.

Abstract:
Little is known about the biosynthesis of mucin molecules in humans. Our aim was to examine the mucin biosynthesis (MUC2-6) along the longitudinal axis of the healthy human gastrointestinal tract. Biopsies of human stomach and small and large intestine were metabolically labeled with 35S-labeled amino acids, [35S]sulfate, or[3H]galactose, immunoprecipitated with antibodies against MUC2-6, and analyzed by reducing sodium dodecyl sulfate-polyacrylamide gel electrophoresis (SDS-PAGE), MUC5AC [apparent molecular weight (M(r)) 500,000] and MUC6 (apparent M(r) 400,000) were detected in the stomach but not in the small or large intestine, MUC3 (apparent M(r) 550,000) was detected in duodenum and jejunum, MUC2 (apparent M(r)600,000) was detected throughout the small and large intestine, and MUC4 (apparent M(r) > 900,000) was detected predominantly in the large intestine. Interestingly, some individuals displayed double bands of MUC2 and MUC3 precursors, suggesting allelic variation within the respective genes. Between small and large intestine mature secreted MUC2 showed differences in mobility on SDS-PAGE, suggesting differences in glycosylation. Each of the MUC2, MUC3, MUC4, MUC5AC, and MUC6 precursors could be distinguished electrophoretically, and each showed region-specific expression along the gastrointestinal tract.
23

Sun, Lei, Yan Zeng, and Hong Mei Xing. "Real-Time Bidding Based on MooTools without Refreshing Page." Applied Mechanics and Materials 496-500 (January 2014): 2038–41. http://dx.doi.org/10.4028/www.scientific.net/amm.496-500.2038.

Abstract:
Bidding is the key link in the railway's purchasing of large quantities of goods and materials. To meet the requirements of procurement bidding, both sides are required to complete the processes of bidding, consultation, and evaluation online and synchronously within a set time. This paper introduces real-time bidding based on MooTools without page refreshes.
24

Zhang, Zuping, Jing Zhao, and Xiping Yan. "A Web Page Clustering Method Based on Formal Concept Analysis." Information 9, no. 9 (September 6, 2018): 228. http://dx.doi.org/10.3390/info9090228.

Abstract:
Web page clustering is an important technology for sorting network resources. By extraction and clustering based on the similarity of the Web page, a large amount of information on a Web page can be organized effectively. In this paper, after describing the extraction of Web feature words, calculation methods for the weighting of feature words are studied deeply. Taking Web pages as objects and Web feature words as attributes, a formal context is constructed for using formal concept analysis. An algorithm for constructing a concept lattice based on cross data links was proposed and was successfully applied. This method can be used to cluster the Web pages using the concept lattice hierarchy. Experimental results indicate that the proposed algorithm is better than previous competitors with regard to time consumption and the clustering effect.
25

Ahmad Sabri, Ily Amalina, and Mustafa Man. "Improving Performance of DOM in Semi-structured Data Extraction using WEIDJ Model." Indonesian Journal of Electrical Engineering and Computer Science 9, no. 3 (March 1, 2018): 752. http://dx.doi.org/10.11591/ijeecs.v9.i3.pp752-763.

Abstract:
Web data extraction is the process of extracting the information a user requires from web pages. The information consists of semi-structured data rather than data in a structured format, and the extraction works on web documents in HTML. Nowadays, most people use web data extractors because the large amount of information involved makes manual extraction slow and complicated. In this paper we present WEIDJ, an approach for extracting images from the web whose goal is to harvest images as objects from template-based HTML pages. WEIDJ (Web Extraction of Images using DOM (Document Object Model) and JSON (JavaScript Object Notation)) applies the DOM to build the structure and uses JSON as the programming environment. The extraction process takes both a web address and an extraction structure as input. WEIDJ then splits the DOM tree into small subtrees and applies a search algorithm over visual blocks in each web page to find images. Our approach focuses on three levels of extraction: a single web page, multiple web pages, and the whole web page. Extensive experiments on several biodiversity web pages compare the time performance of image extraction using DOM, JSON, and WEIDJ for a single web page. The experimental results indicate that, with our model, WEIDJ image extraction can be done quickly and effectively.
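To make the per-page traversal concrete, here is a minimal, hypothetical sketch (not the WEIDJ implementation): the Python standard-library HTML parser walks a page's tags and collects image sources, which is the kind of DOM-level step such an extractor builds on.

```python
from html.parser import HTMLParser

class ImageCollector(HTMLParser):
    """Collect src attributes of <img> tags while walking the tag stream."""
    def __init__(self):
        super().__init__()
        self.images = []

    def handle_starttag(self, tag, attrs):
        if tag == "img":
            src = dict(attrs).get("src")
            if src:
                self.images.append(src)

def extract_images(html_text):
    parser = ImageCollector()
    parser.feed(html_text)
    return parser.images

if __name__ == "__main__":
    sample = '<html><body><img src="a.jpg"><div><img src="b.png" alt="x"></div></body></html>'
    print(extract_images(sample))   # ['a.jpg', 'b.png']
```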
26

Kou, Gang, and Chunwei Lou. "Multiple factor hierarchical clustering algorithm for large scale web page and search engine clickstream data." Annals of Operations Research 197, no. 1 (February 14, 2010): 123–34. http://dx.doi.org/10.1007/s10479-010-0704-3.

27

Kuťka Hlozáková, Timea, Edita Gregová, Svetlana Šliková, Zdenka Gálová, Milan Chňapek, and Janka Drábeková. "Determination of HMW – GS in wheat using SDS – PAGE and Lab-on-chip methods." Potravinarstvo Slovak Journal of Food Sciences 13, no. 1 (June 28, 2019): 477–81. http://dx.doi.org/10.5219/995.

Abstract:
SDS-PAGE is widely used to determine the amounts of the different gluten protein types. However, this method is time-consuming, especially at early stages of wheat breeding, when large number of samples needs to be analyzed. On the other hand, LoC (Lab-on-Chip) technique has the potential for a fast, reliable, and automatable analysis of proteins. Benefits and limitations of Lab-on-Chip method over SDS-PAGE method in gluten proteins evaluation were explored in order to determine in which way LoC method should be improved in order to make its results more compliant with the results of SDS-PAGE. Chip electrophoresis provides a very good reproducibility of HMW-GS patterns. Moreover this approach is much faster than the conventional SDS-PAGE methods requiring several hours for an analysis. Another advantage over traditional gel electrophoresis is lower sample and reagent volume requirements, as well as specialized protein standards for accurate reproducibility and quantification. In the present study, we identified novel complex allele located at the locus Glu-1B.
28

Ma, Jichun, and Di Xia. "The use of blue native PAGE in the evaluation of membrane protein aggregation states for crystallization." Journal of Applied Crystallography 41, no. 6 (November 11, 2008): 1150–60. http://dx.doi.org/10.1107/s0021889808033797.

Abstract:
Crystallization has long been one of the bottlenecks in obtaining structural information at atomic resolution for membrane proteins. This is largely due to difficulties in obtaining high-quality protein samples. One frequently used indicator of protein quality for successful crystallization is the monodispersity of proteins in solution, which is conventionally obtained by size exclusion chromatography (SEC) or by dynamic light scattering (DLS). Although useful in evaluating the quality of soluble proteins, these methods are not always applicable to membrane proteins either because of the interference from detergent micelles or because of the requirement for large sample quantities. Here, the use of blue native polyacrylamide gel electrophoresis (BN–PAGE) to assess aggregation states of membrane protein samples is reported. A strong correlation is demonstrated between the monodispersity measured by BN–PAGE and the propensity for crystallization of a number of soluble and membrane protein complexes. Moreover, it is shown that there is a direct correspondence between the oligomeric states of proteins as measured by BN–PAGE and those obtained from their crystalline forms. When applied to a membrane protein with unknown structure, BN–PAGE was found to be useful and efficient for selecting well behaved proteins from various constructs and in screening detergents. Comparisons of BN–PAGE with DLS and SEC are provided.
29

Riefi, Daifi Afrila, Teuku Yuliar Arif, and Syahrial. "Evaluasi Pengaruh Parameter TIM Berdasarkan Multirate Terhadap Konsumsi Energi Jaringan IEEE 802.11ah." Jurnal RESTI (Rekayasa Sistem dan Teknologi Informasi) 5, no. 4 (August 20, 2021): 713–20. http://dx.doi.org/10.29207/resti.v5i4.3224.

Abstract:
IEEE 802.11ah is a WLAN standard that can potentially be used for IoT networking, providing longer transmission range than WPAN and LPWAN. The IEEE 802.11ah MAC layer introduces a TIM segmentation scheme that manages large numbers of stations (STAs) effectively and makes energy consumption efficient. STAs are organized in a hierarchical structure that allows TIM segmentation to reduce the length of the beacon frame containing the TIM. Without segmentation in a network with many STAs, the TIM becomes longer and requires all STAs, including those with no downlink data, to wake up to receive the TIM beacon. This research evaluates and analyzes the optimal TIM parameters, namely Page Period, Page Slice Length, and Page Slice Count, with respect to IEEE 802.11ah energy efficiency under multirate operation, using an NS-3 simulator implementation of IEEE 802.11ah. The STA experiments show that Non-TIM is optimal only for sleep duration, while TIM is optimal for energy consumption and packet delay. The experiments on the number of STAs per slot for different Page Slice Lengths show that sleep duration and energy consumption are optimal depending on the number of STAs per slot and the data rate used, while the optimal packet delay varies for each Page Slice Length.
30

Radovanovic, Milos, and Mirjana Ivanovic. "Document representations for classification of short web-page descriptions." Yugoslav Journal of Operations Research 18, no. 1 (2008): 123–38. http://dx.doi.org/10.2298/yjor0801123r.

Abstract:
Motivated by applying Text Categorization to classification of Web search results, this paper describes an extensive experimental study of the impact of bag-of-words document representations on the performance of five major classifiers - Naïve Bayes, SVM, Voted Perceptron, kNN and C4.5. The texts, representing short Web-page descriptions sorted into a large hierarchy of topics, are taken from the dmoz Open Directory Web-page ontology, and classifiers are trained to automatically determine the topics which may be relevant to a previously unseen Web-page. Different transformations of input data: stemming, normalization, logtf and idf, together with dimensionality reduction, are found to have a statistically significant improving or degrading effect on classification performance measured by classical metrics - accuracy, precision, recall, F1 and F2. The emphasis of the study is not on determining the best document representation which corresponds to each classifier, but rather on describing the effects of every individual transformation on classification, together with their mutual relationships.
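As background for the bag-of-words representations compared in the study, a small generic sketch of tf-idf weighting in plain Python (my own toy corpus and code, not the study's pipeline):

```python
import math
from collections import Counter

docs = [
    "large page tables reduce tlb misses",
    "web page classification with bag of words",
    "bag of words document representation for classification",
]  # toy corpus

def tokenize(text):
    return text.lower().split()

def tf_idf(corpus):
    """Return one dict per document mapping term -> tf-idf weight."""
    tokenized = [tokenize(d) for d in corpus]
    n_docs = len(tokenized)
    df = Counter(term for doc in tokenized for term in set(doc))  # document frequency
    vectors = []
    for doc in tokenized:
        tf = Counter(doc)
        vectors.append({
            term: (count / len(doc)) * math.log(n_docs / df[term])
            for term, count in tf.items()
        })
    return vectors

for vec in tf_idf(docs):
    print(sorted(vec.items(), key=lambda kv: -kv[1])[:3])   # top-weighted terms per document
```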
31

Matagne, A., B. Joris, and J. M. Frère. "Anomalous behaviour of a protein during SDS/PAGE corrected by chemical modification of carboxylic groups." Biochemical Journal 280, no. 2 (December 1, 1991): 553–56. http://dx.doi.org/10.1042/bj2800553.

Abstract:
The 29,000-Mr Actinomadura R39 beta-lactamase exhibited a remarkably low electrophoretic mobility on SDS/PAGE, yielding an Mr value almost twice that computed from the corresponding gene sequence. We showed that chemical modification of the carboxylic groups of glutamic acid and aspartic acid residues restored a normal electrophoretic mobility and that the anomalous behaviour of that protein on SDS/PAGE was due to its very large negative charge at neutral pH. We also compared the behaviour of the same enzyme on gel filtration in the presence of SDS with those of other class A beta-lactamases (Mr approx. 30,000). These experiments suggested that the very low electrophoretic mobility of the Actinomadura R39 beta-lactamase upon SDS/PAGE was more probably due to a low degree of SDS binding rather than to an unusual shape of the SDS-protein complex.
32

Manning-Miller, Carmen L., and James Crook. "Newspaper Promotions and Coverage of Literacy." Journalism Quarterly 70, no. 1 (March 1993): 118–25. http://dx.doi.org/10.1177/107769909307000113.

Abstract:
A content analysis of six large U.S. newspapers during the late 1980s shows that their coverage of literacy varied widely, with most coverage getting inside rather than front-page placement. Almost all of the literacy coverage was staff generated.
33

Jiang, Ya Qing, Jin Zhang, Zi Jian Yang, Xin Xin Liu, and Rui Lu. "Research on Page Format Streaming Partition in Digital Readings of Comics." Applied Mechanics and Materials 392 (September 2013): 824–29. http://dx.doi.org/10.4028/www.scientific.net/amm.392.824.

Abstract:
In this paper, we propose a method for partitioning comic pages and rearranging the resulting partial pictures, which enables portable devices with small screens to display large-format comic pages. After being divided into single-picture segments, large pages are rearranged in a way appropriate for viewing on a small screen. The key step in the segmentation process is to segment large-format comic pages by connected-region (unicom domain) detection. The method is then examined on examples of common comics, showing that the algorithms are effective for comics with integral frames and nearly rectangular partial pictures. We also discuss the picture sequence-restructuring problem using an empirical prediction method. Experiments analyzed with this method validate the feasibility of the proposed approach.
34

Gioia, Tania, Anna Galinski, Henning Lenz, Carmen Müller, Jonas Lentz, Kathrin Heinz, Christoph Briese, et al. "GrowScreen-PaGe, a non-invasive, high-throughput phenotyping system based on germination paper to quantify crop phenotypic diversity and plasticity of root traits under varying nutrient supply." Functional Plant Biology 44, no. 1 (2017): 76. http://dx.doi.org/10.1071/fp16128.

Abstract:
New techniques and approaches have been developed for root phenotyping recently; however, rapid and repeatable non-invasive root phenotyping remains challenging. Here, we present GrowScreen-PaGe, a non-invasive, high-throughput phenotyping system (4 plants min–1) based on flat germination paper. GrowScreen-PaGe allows the acquisition of time series of the developing root systems of 500 plants, thereby enabling to quantify short-term variations in root system. The choice of germination paper was found to be crucial and paper ☓ root interaction should be considered when comparing data from different studies on germination paper. The system is suitable for phenotyping dicot and monocot plant species. The potential of the system for high-throughput phenotyping was shown by investigating phenotypic diversity of root traits in a collection of 180 rapeseed accessions and of 52 barley genotypes grown under control and nutrient-starved conditions. Most traits showed a large variation linked to both genotype and treatment. In general, root length traits contributed more than shape and branching related traits in separating the genotypes. Overall, results showed that GrowScreen-PaGe will be a powerful resource to investigate root systems and root plasticity of large sets of plants and to explore the molecular and genetic root traits of various species including for crop improvement programs.
35

Oakley, Kay, Mary Witt, and Robert L. Geneve. "Large Trees for Kentucky Landscapes—An Interactive Extension Publication Available on the World Wide Web." HortScience 32, no. 3 (June 1997): 541A—541. http://dx.doi.org/10.21273/hortsci.32.3.541a.

Abstract:
An interactive computer version of a traditional Extension educational publication was developed for delivery over the Internet. Large Trees for Kentucky Landscapes is a 40-page publication describing suggested species adapted to Kentucky conditions. It is illustrated with numerous color photographs. This type of Extension publication has a limited distribution because it is relatively expensive to publish. The digital version of this publication allows for inclusion of additional information and illustrations. It was designed to be interactive with the user selecting the species and the information about that species from a screen menu. The user also has the option to print a one page informational sheet on that species. The initial audience for this digital version of the publication is the county Extension agent and Division of Forestry personnel, but it may also be useful at retail horticultural outlets.
36

Hayaty, Mardhiya, and Dwi Meylasari. "Implementasi Website Berbasis Search Engine Optimization (SEO) Sebagai Media Promosi." Jurnal Informatika 5, no. 2 (September 15, 2018): 295–300. http://dx.doi.org/10.31311/ji.v5i2.4027.

Abstract:
The advancement of information technology is growing rapidly in various fields of life. The Internet, as part of information and communication technology, has a very large effect and influence. A website is not only a medium of information but also supports a company's business processes; however, selling through a website is not effective enough if it is not supported by a good promotional strategy. SEO (Search Engine Optimization) is a promotional technique that uses search engine optimization so that a website ranks at the top, or on the first page, of a search engine's results. This research was conducted on a website to which SEO techniques had not yet been applied. On-page SEO methods were then applied, such as keyword optimization in the title tag, content, meta keywords, and meta description, and sharing to social media; at this stage several tests were also carried out as a benchmark for the success of the SEO techniques. The results show that applying SEO techniques improved the website's SERP (Search Engine Results Page) position in search engines: the site was indexed by Google on the second page in the second month and on the first page of Google search results in less than three months. Keywords: Website, SEO, SEO on Page, SERP
37

Hayaty, Mardhiya, and Dwi Meylasari. "Implementasi Website Berbasis Search Engine Optimization (SEO) Sebagai Media Promosi." Jurnal Informatika 5, no. 2 (September 15, 2018): 295–300. http://dx.doi.org/10.31294/ji.v5i2.4027.

Abstract:
The advancement of information technology is growing rapidly in various fields of life. The Internet, as part of information and communication technology, has a very large effect and influence. A website is not only a medium of information but also supports a company's business processes; however, selling through a website is not effective enough if it is not supported by a good promotional strategy. SEO (Search Engine Optimization) is a promotional technique that uses search engine optimization so that a website ranks at the top, or on the first page, of a search engine's results. This research was conducted on a website to which SEO techniques had not yet been applied. On-page SEO methods were then applied, such as keyword optimization in the title tag, content, meta keywords, and meta description, and sharing to social media; at this stage several tests were also carried out as a benchmark for the success of the SEO techniques. The results show that applying SEO techniques improved the website's SERP (Search Engine Results Page) position in search engines: the site was indexed by Google on the second page in the second month and on the first page of Google search results in less than three months. Keywords: Website, SEO, SEO on Page, SERP
38

Bukhori, Saiful. "Analisis Perbandingan Fitur Search Engine." INFORMAL: Informatics Journal 3, no. 1 (February 25, 2019): 17. http://dx.doi.org/10.19184/isj.v3i1.9850.

Abstract:
Search engines are used on the web as tools for information retrieval. The web is a large warehouse of heterogeneous and unstructured data, so a search engine is needed to filter out the information relevant to a user. Search engines usually consist of a page repository, an indexing module, a query module, and a ranking module. A search engine does not work alone; a web browser supports its work and makes it more effective. A browser is software run on a user's computer that displays web documents or information taken from a web server [1], and it is the type of intermediary the user uses most often. This paper analyzes three search engines, Google, Yahoo, and Bing, based on their features. These features include web search, image search, video search, news search, route search, book search, changing search settings, displaying the number of views, shopping, and a language translator. Google stands as the best search engine among those compared; it works using the PageRank algorithm. PageRank is a numerical value that determines the importance of a web page by taking its backlinks into account.
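For reference, the PageRank value mentioned here is commonly defined by the following recurrence (the standard formulation, not quoted from this article), where d is the damping factor, N the number of pages, M(p_i) the set of pages linking to p_i, and L(p_j) the number of outbound links of p_j:

```latex
\mathrm{PR}(p_i) = \frac{1-d}{N} + d \sum_{p_j \in M(p_i)} \frac{\mathrm{PR}(p_j)}{L(p_j)}
```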
39

Boiangiu, Costin-Anton, Ovidiu-Alexandru Dinu, Cornel Popescu, Nicolae Constantin, and Cătălin Petrescu. "Voting-Based Document Image Skew Detection." Applied Sciences 10, no. 7 (March 25, 2020): 2236. http://dx.doi.org/10.3390/app10072236.

Abstract:
Optical Character Recognition (OCR) is an indispensable tool for technology users nowadays, as our natural language is presented through text. We live under the need of having information at hand in every circumstance and, at the same time, having machines understand visual content and thus enable the user to be able to search through large quantities of text. To detect textual information and page layout in an image page, the latter must be properly oriented. This is the problem of the so-called document deskew, i.e., finding the skew angle and rotating by its opposite. This paper presents an original approach which combines various algorithms that solve the skew detection problem, with the purpose of always having at least one to compensate for the others’ shortcomings, so that any type of input document can be processed with good precision and solid confidence in the output result. The tests performed proved that the proposed solution is very robust and accurate, thus being suitable for large scale digitization projects.
40

Khafendi, Khafendi. "Kajian Pemindahan Penumpukan Peti Kemas (Overbrengen) Ke Tempat Penampungan Sementara Di Pelabuhan Tanjung Priok." Warta Penelitian Perhubungan 22, no. 6 (June 30, 2010): 643–56. http://dx.doi.org/10.25104/warlit.v22i6.1099.

Abstract:
The Port of Tanjung Priok, including the Jakarta International Container Terminal (JICT) and the Koja Container Terminal (TPK Koja), is still overshadowed by potential container congestion, especially for import containers in the yard, because, among other things, cargo owners are slow to take their goods out and the container yard and supporting transport facilities are limited. Because containers and goods, especially import containers, leave the yard slowly, the capacity of the container yard is not proportional to the volume of imported goods coming in, so the Yard Occupancy Ratio (YOR) cannot be maintained at a safe level, and importers are required to move containers out of the yard to another location (overbrengen). Congestion in the yard is worst around major holidays such as Eid, because many basic goods arrive before the holidays and large carrier vehicles (trucks) are forbidden to operate on the main roads before and after the holiday (H-7 and H+). The research was carried out with a comparative method, comparing yard accumulation on peak holiday days and on normal days. To overcome the build-up, several measures should be taken, among others transferring containers from the yard to temporary storage elsewhere (overbrengen) and increasing container stacking facilities at the port. Keywords: container yard, overbrengen
41

Ndofor-Foleng, HM, OG Iloghalu, MO Onodugo, and AG Ezekwe. "Genetic diversity between large white and nigerian indigenous breed of swine using polyacrylamide gel electrophoresis (PAGE)." Agro-Science 13, no. 3 (June 19, 2015): 30. http://dx.doi.org/10.4314/as.v13i3.5.

42

Anthony, R. J., H. V. Speechley, V. LeBrunn, M. Phillip, and O. J. Corrado. "14 DEVELOPMENT OF A DEMENTIA INTRANET PAGE TO RAISE AWARENESS OF DEMENTIA WITHIN A LARGE TEACHING HOSPITAL." Age and Ageing 44, suppl 2 (September 2015): ii4.1—ii4. http://dx.doi.org/10.1093/ageing/afv106.14.

43

Wang, Hua, Feiping Nie, and Heng Huang. "Large-Scale Cross-Language Web Page Classification via Dual Knowledge Transfer Using Fast Nonnegative Matrix Trifactorization." ACM Transactions on Knowledge Discovery from Data 10, no. 1 (July 27, 2015): 1–29. http://dx.doi.org/10.1145/2710021.

44

Kravchenko, Andrey. "Large-scale holistic approach to Web block classification: assembling the jigsaws of a Web page puzzle." World Wide Web 22, no. 5 (September 12, 2018): 1999–2015. http://dx.doi.org/10.1007/s11280-018-0634-6.

45

Kline, R. J., D. N. Leonard, A. D. Batchelor, and P. E. Russell. "Scanning Probe Microscopy: Internet Resource Development and Integration into Undergraduate Curriculum." Microscopy and Microanalysis 3, S2 (August 1997): 1279–80. http://dx.doi.org/10.1017/s1431927600013283.

Abstract:
The Internet has become a very valuable educational resource: it allows a person to reach a very large, diverse audience across the world with ease. With NSF Combined Research-Curriculum Development (CRCD) funding, we have begun to use the Internet as an educational and technical resource for people wanting to learn about Scanning Probe Microscopy (SPM). We have set up a web page with information for people at all levels of SPM knowledge. We are actively combining SPM research and education into the materials science and engineering undergraduate curriculum. We also use the web page as a way to publish our findings to help other universities integrate SPM into their curricula. The URL is http://spm.aif.ncsu.edu. The web page is divided into seven main components, each with a specific intended audience and purpose. We have designed some components for people who have never heard of SPM and others for people who run SPM labs.
46

Patel, Chandrakant D., and Jayesh M. Patel. "Influence of GUJarati STEmmeR in Supervised Learning of Web Page Categorization." International Journal of Intelligent Systems and Applications 13, no. 3 (June 8, 2021): 23–34. http://dx.doi.org/10.5815/ijisa.2021.03.03.

Abstract:
With the large quantity of information offered online, it is essential to retrieve correct information for a user query, and a large amount of data is available in digital form in multiple languages. Various approaches aim to increase the effectiveness of online information retrieval, but the standard approach to retrieving information for a user query is to scan the documents in the corpus word by word for the given query. This approach is very time-intensive and may miss several related documents that are equally important. To avoid these issues, stemming has been used extensively in many Information Retrieval Systems (IRS) to increase retrieval accuracy across languages. This paper addresses stemming for Web Page Categorization in the Gujarati language, deriving stem words using the GUJSTER algorithm [1]. The GUJSTER algorithm is based on morphological rules used to derive the root or stem word from inflected words of the same class. In particular, we consider the influence of the extracted stem or root words on the quality of web page classification using supervised machine learning algorithms. This research work focuses on the analysis of Web Page Categorization (WPC) for the Gujarati language and investigates the influence of a stemming algorithm in a WPC application for Gujarati, with accuracy improving from 63% to 98% using supervised machine learning models and a standard 80% training / 20% testing split.
47

Rohde, Palle Duun, Izel Fourie Sørensen, and Peter Sørensen. "qgg: an R package for large-scale quantitative genetic analyses." Bioinformatics 36, no. 8 (December 27, 2019): 2614–15. http://dx.doi.org/10.1093/bioinformatics/btz955.

Abstract:
Summary: Here, we present the R package qgg, which provides an environment for large-scale genetic analyses of quantitative traits and diseases. The qgg package provides an infrastructure for efficient processing of large-scale genetic data and functions for estimating genetic parameters, and performing single and multiple marker association analyses and genomic-based predictions of phenotypes. Availability and implementation: The qgg package is freely available. For the latest updates, user guides and example scripts, consult the main page http://psoerensen.github.io/qgg. The current release is available from CRAN (https://CRAN.R-project.org/package=qgg) for all major operating systems. Supplementary information: Supplementary data are available at Bioinformatics online.
48

Reason, D. C. "A murine hybridoma with large cytoplasmic inclusions of kappa light chains." Journal of Experimental Medicine 165, no. 2 (February 1, 1987): 578–83. http://dx.doi.org/10.1084/jem.165.2.578.

Abstract:
A murine hybridoma cell line has been established that consistently forms large cytoplasmic inclusions. These structures bind antibody specific for mouse kappa L chain when stained in situ. SDS-PAGE analysis of isolated inclusion bodies produce a single protein band of approximately 26,000 Mr that reacts with anti-kappa antibody when transferred to nitrocellulose. No carbohydrate was detected in association with the purified protein. These data are consistent with the intracellular retention and deposition of complete kappa L chain protein.
49

Sens, Alexander. "Hedylus (4 and 5 Gow–Page) and Callimachean Poetics." Mnemosyne 68, no. 1 (January 20, 2015): 40–52. http://dx.doi.org/10.1163/1568525x-12301478.

Abstract:
The phrase λεπτὸν καί τι μελιχρὸν ἔπος in Hedylus 5 Gow–Page has been read as engaging with Callimachean esthetic language, though its precise significance has been debated. This paper argues that Hedylus’ engagement with Callimachean esthetic imagery and language is best understood by juxtaposing Hedylus 4 and 5 Gow–Page. The structure of the former, on a gold rhyton dedicated to Arsinoe Zephyritis, pointedly treats two Egyptian deities—one miniature, the other colossal—in language evocative of poetic composition, and does so in a way that effaces the bright oppositions between large and small in the prologue to Callimachus’ Aetia. At the same time, the poem identifies sounds made by wine with sounds made by water, and thus sheds light on Hedylus’ treatment of wine as a source of poetic inspiration in both epigrams. Far from being a rebuttal of Callimachean values, these poems appropriate and adapt his esthetic imagery and language to the genre of epigram.
50

BEAUMONT, NICHOLAS. "FITTING A TABLE TO A PAGE USING NONLINEAR OPTIMIZATION." Asia-Pacific Journal of Operational Research 21, no. 02 (June 2004): 259–70. http://dx.doi.org/10.1142/s0217595904000230.

Abstract:
It is sometimes difficult to fit a large table comprising several rows and columns onto a page. The usual tactic is to manually adjust column widths, abbreviate some text, and/or change some cells' font sizes until the table fits onto a page. We show that it is possible to express the problem of adjusting column widths so as to minimize the height of a table as an optimization problem with nonlinear constraints. Five test problems were routinely solved using a free software package. We stress that the solutions are approximate because the model imperfectly simulates how many lines of a cell of a table will be required to contain a segment of text, but they appear to provide good approximations in difficult cases. The scant literature is summarized; the formulation and solution techniques outlined; examples are described; and differences between theoretical and actual answers explained. It would be possible to incorporate the calculations in word processing and typesetting packages such as Word and TeX.
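As a rough schematic of the kind of formulation the abstract describes (my own sketch under simplifying assumptions, not the author's exact model): choose column widths w_j to minimize total table height, where each row's height is driven by how many text lines its tallest cell needs at the chosen widths.

```latex
\min_{w_1,\dots,w_n}\; \sum_{i} h_i
\quad \text{s.t.} \quad
h_i \;\ge\; \ell \left\lceil \frac{c_{ij}}{w_j} \right\rceil \;\; \forall i,j,
\qquad \sum_{j} w_j \le W,
\qquad w_j \ge w_{\min}
```

Here c_ij stands for the amount of text in cell (i, j), ℓ for the line height, W for the usable page width, and w_min for a minimum column width; the ceiling term is what makes the constraints nonlinear and non-smooth, which matches the paper's remark that such a model only approximates how many lines a cell will actually require.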