Journal information
Robotics & Machine Learning Daily News
Publisher: NewsRx

    Data on Machine Learning Reported by Researchers at University of Groningen (Floppity: Enabling Self-consistent Exoplanet Atmospheric Retrievals With Machine Learning)

    pp. 1-2
    Abstract: Fresh data on Machine Learning are presented in a new report. According to news reporting originating in Groningen, Netherlands, by NewsRx journalists, research stated, “Interpreting the observations of exoplanet atmospheres to constrain physical and chemical properties is typically done using Bayesian retrieval techniques. Since these methods require many model computations, a compromise must be made between the model's complexity and its run time.” Funders for this research include the Center for Information Technology of the University of Groningen, the European Union (EU), and the Science & Technology Facilities Council (STFC).
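
    The speed-up alluded to above is commonly obtained by replacing the expensive forward model with a fast learned surrogate that is then queried inside the Bayesian sampler. The sketch below (assuming scikit-learn and NumPy) illustrates that general idea only; the toy "forward model", the parameter ranges, and the surrogate settings are assumptions for illustration and are not taken from the Floppity paper.

```python
# Minimal sketch of an ML-accelerated retrieval: precompute forward-model
# spectra once, train a fast surrogate, then use it in the likelihood.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)

def slow_forward_model(params):
    """Toy stand-in for an expensive radiative-transfer model:
    maps (temperature, log-abundance) to a 50-point 'spectrum'."""
    temp, log_x = params
    wav = np.linspace(1.0, 5.0, 50)
    return temp * 1e-3 * np.exp(-((wav - 2.0 - log_x) ** 2))

# Training grid of forward-model evaluations (done once, offline).
theta = rng.uniform([500.0, -4.0], [2500.0, -1.0], size=(2000, 2))
spectra = np.array([slow_forward_model(t) for t in theta])

# Fast surrogate; at retrieval time it replaces the slow model.
surrogate = MLPRegressor(hidden_layer_sizes=(64, 64), max_iter=2000, random_state=0)
surrogate.fit(theta, spectra)

# Likelihood evaluation with the surrogate (e.g. inside MCMC or nested sampling).
observed = slow_forward_model([1200.0, -2.5]) + rng.normal(0, 1e-3, 50)

def log_likelihood(params, sigma=1e-3):
    model = surrogate.predict(np.atleast_2d(params))[0]
    return -0.5 * np.sum(((observed - model) / sigma) ** 2)

print(log_likelihood([1200.0, -2.5]), log_likelihood([2000.0, -1.5]))
```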

    Researcher at Financial University under the Government of the Russian Federation Describes Research in Machine Learning (Detection of internal security incidents in cyber-physical systems)

    p. 2
    Abstract: Researchers detail new data in artificial intelligence. According to news reporting out of the Financial University under the Government of the Russian Federation by NewsRx editors, research stated, “This paper addresses the issue of internal security breaches in cyber-physical systems, framing it as an anomaly detection problem within the framework of machine learning models.” Our news journalists obtained a quote from the research from the Financial University under the Government of the Russian Federation: “The use of the powerful mathematical apparatus embedded in the structure of machine learning models, including models based on artificial neural networks, allows building an autonomous system for detecting internal security breaches with minimal reliance on expert assessments. The determination of user abnormality is made on the basis of average data on log entries of actions in the system identified as abnormal, as well as on statistical data on the number of such entries for each user.”
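
    Detecting insiders from per-user statistics over flagged log entries, as described above, is often handled with an unsupervised outlier detector. The following sketch (assuming scikit-learn) is a generic illustration of that approach, not the authors' model; the per-user features and the synthetic data are invented for the example.

```python
# Minimal sketch: unsupervised insider detection from per-user log statistics.
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(1)

# Hypothetical per-user features:
# [mean anomaly score of log entries, number of abnormal entries, total entries]
normal_users = np.column_stack([
    rng.normal(0.1, 0.02, 200),
    rng.poisson(2, 200),
    rng.poisson(500, 200),
])
insiders = np.column_stack([
    rng.normal(0.6, 0.10, 5),
    rng.poisson(40, 5),
    rng.poisson(480, 5),
])
X = np.vstack([normal_users, insiders])

# Unsupervised detector: no expert-labelled incidents are needed for training.
detector = IsolationForest(contamination=0.03, random_state=0).fit(X)
flags = detector.predict(X)          # -1 = flagged as a potential insider
print("flagged users:", np.where(flags == -1)[0])
```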

    CVR College of Engineering Researchers Have Provided New Data on Support Vector Machines (Enhancing Sentiment Analysis Accuracy by Optimizing Hyperparameters of SVM and Logistic Regression Models)

    pp. 2-3
    Abstract: Research findings on support vector machines are discussed in a new report. According to news originating from CVR College of Engineering by NewsRx editors, the research stated, “The analysis of sentiments expressed on Twitter is a widely practiced application of Natural Language Processing (NLP) and Artificial Intelligence (AI).” The news editors obtained a quote from the research from CVR College of Engineering: “This process involves examining tweets to determine the emotional tone conveyed within the message. AI-based approaches are employed in Twitter sentiment analysis, typically following these steps: Data Collection, Data Preprocessing, and Sentiment Analysis, where AI techniques like Support Vector Machines (SVM) and Logistic Regression are utilized to categorize tweets into positive, negative, or neutral sentiments. Twitter data is a valuable source of information, serving diverse purposes such as real-time updates, user feedback, brand monitoring, market research, digital marketing, and political analysis. The Twitter API (Application Programming Interface) provides developers with tools and functionalities to access and interact with Twitter data, including tweets, user profiles, and timelines, enabling a wide range of applications and services. However, Twitter sentiment analysis presents challenges such as handling sarcasm, irony, and colloquial language, and coping with the sheer volume and rapid flow of Twitter data.”
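
    The hyperparameter optimization named in the article title is typically a grid search over settings such as the SVM's C and kernel and the logistic regression's C. The sketch below (assuming scikit-learn) shows that workflow on a tiny invented tweet set; the data, grids, and cross-validation setup are illustrative assumptions, not the paper's configuration.

```python
# Minimal sketch: tune SVM and logistic regression sentiment classifiers
# with TF-IDF features and a cross-validated grid search.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import GridSearchCV
from sklearn.pipeline import Pipeline
from sklearn.svm import SVC

tweets = ["love this phone", "worst service ever", "great update",
          "totally broken again", "very happy with the results",
          "never buying this again", "amazing support team",
          "this app keeps crashing"]
labels = [1, 0, 1, 0, 1, 0, 1, 0]   # 1 = positive, 0 = negative

for name, clf, grid in [
    ("SVM", SVC(), {"clf__C": [0.1, 1, 10], "clf__kernel": ["linear", "rbf"]}),
    ("LogReg", LogisticRegression(max_iter=1000), {"clf__C": [0.1, 1, 10]}),
]:
    pipe = Pipeline([("tfidf", TfidfVectorizer()), ("clf", clf)])
    search = GridSearchCV(pipe, grid, cv=2)   # cv=2 only because the toy set is tiny
    search.fit(tweets, labels)
    print(name, search.best_params_, round(search.best_score_, 2))
```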

    Research Institutes of Sweden (RISE) Reports Findings in Machine Learning (Interpretable machine learning for predicting the fate and transport of pentachlorophenol in groundwater)

    pp. 3-4
    Abstract: New research on Machine Learning is the subject of a report. According to news reporting out of Goteborg, Sweden, by NewsRx editors, research stated, “Pentachlorophenol (PCP) is a commonly found recalcitrant and toxic groundwater contaminant that resists degradation, bioaccumulates, and has a potential for long-range environmental transport. Taking proper actions to deal with the pollutant accounting for the life cycle consequences requires a better understanding of its behavior in the subsurface.” Our news journalists obtained a quote from the research from the Research Institutes of Sweden (RISE), “We recognize the huge potential for enhancing decision-making at contaminated groundwater sites with the arrival of machine learning (ML) techniques in environmental applications. We used ML to enhance the understanding of the dynamics of PCP transport properties in the subsurface, and to determine key hydrochemical and hydrogeological drivers affecting its transport and fate. We demonstrate how this complementary knowledge, provided by data-driven methods, may enable a more targeted planning of monitoring and remediation at two highly contaminated Swedish groundwater sites, where the method was validated. We evaluated 6 interpretable ML methods, 3 linear regressors and 3 non-linear (i.e., tree-based) regressors, to predict PCP concentration in the groundwater. The modeling results indicate that simple linear ML models were useful in the prediction of observations for datasets without any missing values, while tree-based regressors were more suitable for datasets containing missing values. Considering that missing values are common in datasets collected during contaminated site investigations, this could be of significant importance for contaminated site planners and managers, ultimately reducing site investigation and monitoring costs. Furthermore, we interpreted the proposed models using the SHAP (SHapley Additive exPlanations) approach to decipher the importance of different drivers in the prediction and simulation of critical hydrogeochemical variables. Among these, the sum of chlorophenols is of highest significance in the analyses. Setting that aside from the model, tetrachlorophenols, dissolved organic carbon, and conductivity were found to be of highest importance.”
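
    The interpretation step described in the quote, fitting a tree-based regressor and ranking drivers with SHAP, follows a standard pattern. The sketch below (assuming scikit-learn, pandas, and the shap package) is a generic illustration of that pattern; the synthetic data, feature names, and target relationship are assumptions and do not reproduce the study's datasets or results.

```python
# Minimal sketch: tree-based regression on hydrochemical predictors,
# interpreted with SHAP feature attributions.
import numpy as np
import pandas as pd
import shap
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(2)
n = 300
X = pd.DataFrame({
    "sum_chlorophenols": rng.lognormal(1.0, 0.5, n),
    "tetrachlorophenols": rng.lognormal(0.5, 0.5, n),
    "dissolved_organic_carbon": rng.normal(5.0, 1.5, n),
    "conductivity": rng.normal(40.0, 10.0, n),
})
# Toy target loosely tied to the predictors (stand-in for measured PCP).
y = 0.6 * X["sum_chlorophenols"] + 0.2 * X["dissolved_organic_carbon"] + rng.normal(0, 0.3, n)

model = RandomForestRegressor(n_estimators=200, random_state=0).fit(X, y)

# SHAP values quantify each feature's contribution to every prediction;
# averaging their magnitudes gives a global importance ranking.
explainer = shap.TreeExplainer(model)
shap_values = explainer.shap_values(X)
importance = np.abs(shap_values).mean(axis=0)
for name, score in sorted(zip(X.columns, importance), key=lambda t: -t[1]):
    print(f"{name:28s} {score:.3f}")
```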

    New Robotics Study Findings Have Been Reported from Nanjing University of Science and Technology (Robust Adaptive Flexible Prescribed Performance Tracking and Vibration Control for Rigid-Flexible Coupled Robotic Systems With Input Quantization)

    pp. 4-5
    Abstract: Research findings on Robotics are discussed in a new report. According to news reporting out of Jiangsu, People's Republic of China, by NewsRx editors, research stated, “With the increasing demands for more flexibility, lighter weight, and larger working space of industrial robotic systems in many fields, rigid-flexible coupled robotic systems are attracting more attention. In this work, the desired angular tracking and vibration suppression issues are investigated for rigid-flexible coupled robotic systems (RFCRSs) in the presence of input quantization.” Financial supporters for this research include the National Natural Science Foundation of China (NSFC) and an international science and technology innovation cooperation key project.

    St Thomas' Hospital Reports Findings in Inflammatory Bowel Disease (Implementation of a robotic surgical practice in inflammatory bowel disease)

    pp. 5-6
    Abstract: New research on Digestive System Diseases and Conditions - Inflammatory Bowel Disease is the subject of a report. According to news reporting from London, United Kingdom, by NewsRx journalists, research stated, “Robotics adoption has increased in colorectal surgery. While there are well-established advantages and standardised techniques for cancer patients, the use of robotic surgery in inflammatory bowel disease (IBD) has not been studied yet.” The news correspondents obtained a quote from the research from St Thomas' Hospital, “To evaluate the feasibility and safety of robotic surgery for IBD patients. Data were prospectively collected on IBD patients undergoing robotic resection at Guy's and St Thomas' Hospital. All resections were performed by a single colorectal surgeon specialised in IBD, utilising the DaVinci platform. From July 2021 to January 2023, 59 robotic IBD cases were performed, 14 ulcerative colitis (UC) and 45 Crohn's disease (CD). Average age: 35 years for CD patients and 33 years for UC patients. Average body mass index (BMI): 23 for CD and 26.9 for UC patients. In total, we performed 31 ileo-caecal resections (ICR) with primary anastomosis (18 Kono-S anastomosis, 6 mechanical anastomosis and 7 ileo-colostomy), of which 4 had multivisceral resections (large bowel, bladder, ovary). Furthermore, 14 subtotal colectomy (1 emergency), 8 proctectomy, 3 panproctocolectomy and 3 ileoanal J pouch. 18 of the 45 patients (45.0%) with Crohn's disease had ongoing fistulating disease to other parts of the GI tract (small or large bowel). ICRs were performed using three different port positions, depending on the anatomy established prior to surgery with magnetic resonance imaging (MRI). One patient had conversion to open surgery due to anaesthetic problems and one patient required re-operation to refashion a stoma. 98.0% of cases were completed robotically. Median length of hospital stay (LOS) was 7 days for CD and 7 days for UC cases, including LOS in patients on pre-operative parenteral nutrition. Robotic colorectal techniques can be safely used for patients with IBD, even with fistulating disease.”

    Data on Artificial Intelligence Reported by H. Lockman and Colleagues (The potential role of artificial intelligence-assisted chest X-ray imaging in detecting early-stage lung cancer in the community-a proposed algorithm for lung cancer ...)

    pp. 6-7
    Abstract: New research on Artificial Intelligence is the subject of a report. According to news reporting from Selangor, Malaysia, by NewsRx journalists, research stated, “The poor prognosis of lung cancer has been largely attributed to the fact that most patients present with advanced-stage disease. Although low-dose computed tomography (LDCT) is presently considered the optimal imaging modality for lung cancer screening, its use has been hampered by cost and accessibility.” The news correspondents obtained a quote from the research, “One possible approach to facilitate lung cancer screening is to implement a risk-stratification step with chest radiography, given its ease of access and affordability. Furthermore, implementation of artificial intelligence (AI) in chest radiography is expected to improve the detection of indeterminate pulmonary nodules, which may represent early lung cancer. This consensus statement was formulated by a panel of five experts comprising primary care and specialist doctors. A lung cancer screening algorithm was proposed for implementation locally. In an earlier pilot project collaboration, AI-assisted chest radiography had been incorporated into lung cancer screening in the community. Preliminary experience in the pilot project suggests that the system is easy to use, affordable and scalable. Drawing from experience with the pilot project, a standardised lung cancer screening algorithm using AI in Malaysia was proposed. Requirements for such a screening programme, expected outcomes and limitations of AI-assisted chest radiography were also discussed. The combined strategy of AI-assisted chest radiography and complementary LDCT imaging has great potential for detecting early-stage lung cancer in a timely manner, irrespective of risk status.”

    Reports on Artificial Intelligence Findings from Zhejiang Gongshang University Provide New Insights (Artificial Intelligence Application, Global Value Chain Reconstruction and Enterprise Knowledge Power Relationship: Grounded Theory Research ...)

    pp. 7-8
    Abstract: Investigators publish a new report on artificial intelligence. According to news reporting originating from Zhejiang Gongshang University by NewsRx correspondents, research stated, “In the process of the application of artificial intelligence affecting the formation of enterprise knowledge power, there are many factors that can influence it.” The news editors obtained a quote from the research from Zhejiang Gongshang University: “However, there is no mature theoretical framework for the specific important influencing factors. The TongKun Group in Zhejiang province is a 'future intelligence factory' that uses artificial intelligence and other advanced technologies to realize global value chain reconstruction and enterprise knowledge power. The study is based on grounded theory, and the research data are collected from the Tongkun company. Following the research process of grounded theory, the original interview data undergo open coding, axial coding, selective coding and theoretical saturation testing. Through this coding, the data are summarized into five main categories: enterprise artificial intelligence technology foundation, enterprise artificial intelligence technology usage, enterprise artificial intelligence technology results (output), global value chain reconstruction, and enterprise knowledge power.”

    Zhejiang University School of Medicine Reports Findings in Congenital Heart Disease [A Patient Similarity Network (CHDmap) to Predict Outcomes After Congenital Heart Surgery: Development and Validation Study]

    pp. 8-9
    Abstract: New research on Congenital Diseases and Conditions - Congenital Heart Disease is the subject of a report. According to news reporting out of Hangzhou, People's Republic of China, by NewsRx editors, research stated, “Although evidence-based medicine proposes personalized care that considers the best evidence, it still fails to address personal treatment in many real clinical scenarios where the complexity of the situation makes none of the available evidence applicable. 'Medicine-based evidence' (MBE), in which big data and machine learning techniques are embraced to derive treatment responses from appropriately matched patients in real-world clinical practice, was proposed. However, many challenges remain in translating this conceptual framework into practice.”
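
    The "matched patients" idea behind medicine-based evidence can be illustrated with a simple nearest-neighbour estimate: an outcome for a new case is inferred from the most similar historical cases. The sketch below (assuming scikit-learn) is a generic k-NN illustration of that idea, not the CHDmap network itself; the features, data, and outcome rule are invented for the example.

```python
# Minimal sketch: predict an outcome for a new patient from its nearest
# neighbours in a standardised feature space.
import numpy as np
from sklearn.neighbors import KNeighborsClassifier
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(3)

# Hypothetical pre-operative features: [age_months, weight_kg, oxygen_saturation]
X = np.column_stack([
    rng.uniform(1, 120, 400),
    rng.uniform(3, 30, 400),
    rng.uniform(70, 100, 400),
])
# Toy outcome: complications more likely at low saturation (illustrative only).
y = (X[:, 2] + rng.normal(0, 5, 400) < 85).astype(int)

scaler = StandardScaler().fit(X)
model = KNeighborsClassifier(n_neighbors=15).fit(scaler.transform(X), y)

# Risk estimate for a new case is the complication rate among its 15 matches.
new_patient = scaler.transform([[24, 10, 78]])
print("estimated complication risk:", model.predict_proba(new_patient)[0, 1])
```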

    New Intelligent Systems Findings Reported from National University of Defense Technology (Disentangled Variational Auto-Encoder Enhanced By Counterfactual Data for Debiasing Recommendation)

    pp. 9-10
    Abstract: Investigators publish a new report on Machine Learning - Intelligent Systems. According to news originating from Changsha, People's Republic of China, by NewsRx correspondents, research stated, “Recommender systems always suffer from various recommendation biases, seriously hindering their development. In this light, a series of debiasing methods have been proposed for recommender systems, especially for the two most common biases, i.e., popularity bias and amplified subjective bias.” Financial support for this research came from the Independent Innovation Science Foundation project of the National University of Defense Technology. Our news journalists obtained a quote from the research from the National University of Defense Technology, “However, existing debiasing methods usually concentrate on correcting a single bias. Such single-functionality debiasing neglects the bias-coupling issue in which the recommended items are collectively attributed to multiple biases. Besides, previous work cannot tackle the lack of supervised signals brought by sparse data, which has become commonplace in recommender systems. In this work, we introduce a disentangled debiasing variational auto-encoder framework (DB-VAE) to address the single-functionality issue, as well as a counterfactual data enhancement method to mitigate the adverse effect of data sparsity. Specifically, DB-VAE first extracts two types of extreme items, each affected by only a single bias, based on collider theory; these are, respectively, employed to learn the latent representation of the corresponding biases, thereby realizing the bias decoupling. In this way, an exact unbiased user representation can be learned from these decoupled bias representations. Furthermore, the data generation module employs Pearl's framework to produce massive counterfactual data to help fully train the model, making up for the lacking supervised signals due to the sparse data. Extensive experiments on three real-world data sets demonstrate the effectiveness of our proposed model. Specifically, our model outperforms the best baseline by 19.5% in terms of Recall@20 and 9.5% in terms of NDCG@100 in the best scenario.”
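
    The disentangling idea above, separate latent blocks inside one variational auto-encoder so that different factors can be represented independently, can be sketched compactly. The following (assuming PyTorch) is a generic two-block VAE recommender skeleton under that assumption, not the DB-VAE model from the paper; the layer sizes, KL weight, and toy implicit-feedback data are invented for the example.

```python
# Minimal sketch: a VAE over user-item interaction vectors with two
# disentangled latent blocks feeding a shared decoder.
import torch
import torch.nn as nn
import torch.nn.functional as F

class TwoBlockVAE(nn.Module):
    def __init__(self, n_items, hidden=64, z_dim=16):
        super().__init__()
        self.enc = nn.Linear(n_items, hidden)
        # two latent blocks, each with its own mean/log-variance head
        self.heads = nn.ModuleList([nn.Linear(hidden, 2 * z_dim) for _ in range(2)])
        self.dec = nn.Linear(2 * z_dim, n_items)

    def forward(self, x):
        h = torch.tanh(self.enc(F.normalize(x, dim=-1)))
        zs, stats = [], []
        for head in self.heads:
            mu, logvar = head(h).chunk(2, dim=-1)
            zs.append(mu + torch.randn_like(mu) * torch.exp(0.5 * logvar))
            stats.append((mu, logvar))
        return self.dec(torch.cat(zs, dim=-1)), stats

def loss_fn(x, logits, stats, beta=0.2):
    # multinomial reconstruction term plus a KL term per latent block
    recon = -(F.log_softmax(logits, dim=-1) * x).sum(-1).mean()
    kl = sum((-0.5 * (1 + lv - mu.pow(2) - lv.exp()).sum(-1)).mean() for mu, lv in stats)
    return recon + beta * kl

# Toy training loop on random implicit-feedback data.
torch.manual_seed(0)
x = (torch.rand(256, 100) < 0.05).float()   # 256 users, 100 items
model = TwoBlockVAE(n_items=100)
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
for _ in range(50):
    logits, stats = model(x)
    loss = loss_fn(x, logits, stats)
    opt.zero_grad(); loss.backward(); opt.step()
print("final loss:", float(loss))
```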