When:
May 23 – 24, 2019
Where:
The event will be held in the École Polytechnique de Montréal Pavillon Principal, tentatively room C-631. Montréal is the largest city in the Canadian province of Québec and the second-largest city in Canada. It is the second-largest primarily French-speaking city in the world, after Paris. The official language of Montréal is French, as defined by the city's charter, yet most of its citizens are bilingual and it is always possible to study, work, and enjoy Montréal in English.
How to reach the meeting
2500, Chemin de Polytechnique
Montreal, Quebec, Canada
H3T 1J4
(Civic address)
[Picture of the Pavillon Principal]
Wi-Fi Connection
Polytechnique Montreal is part of the eduroam (education roaming) consortium; eduroam is the secure, world-wide roaming access service developed for the international research and education community. eduroam allows students, researchers and staff from participating institutions to obtain Internet connectivity across campus and when visiting other participating institutions by simply opening their laptop.
If you are a faculty member, a student, or otherwise have access to eduroam, please make sure you activate your eduroam account at your institution; this will give you access to Wi-Fi at Polytechnique Montréal and thus at PLOW.
Travel, Accommodation and VISAs
PLOW organizers do not provide any support for travel, accommodation, or visas. Some international visitors will need a Temporary Resident Visa (TRV) to visit Canada. For the latest official information on Canadian visas, and to see whether you need one, please visit:
http://www.cic.gc.ca/english/visit/visas.asp
See also the Citizenship and Immigration Canada (CIC) website:
http://www.cic.gc.ca/english/index.asp
How to Get the TRV?
Information on visiting Canada:
http://www.cic.gc.ca/english/visit/index.asp
Visiting Canada – Important information for visa exempt travellers (including US Citizens):
http://www.cic.gc.ca/english/visit/visa-exempt.asp
IMPORTANT: no invitation letter or visa support letter will be provided by PLOW organizers. Should you need accommodation, you may consider the many hotels in downtown Montréal (40 min by metro) or the Université de Montréal studio accommodations, located within walking distance of Polytechnique Montréal.

Abstract: Anomaly detection plays an important role in the management of modern large-scale distributed systems. Logs, which record system runtime information, are widely used for anomaly detection. However, unsupervised anomaly detection algorithms face challenges in complex systems, which generate vast amounts of multivariate time series data. Timely anomaly detection minimizes system downtime and plays a vital role in incident management for such systems. To address these challenges, a method called Multi-Scale Convolutional Recurrent Encoder-Decoder (MSCRED) has been applied to detect anomalies in CN PTC system logs. MSCRED performs anomaly detection and diagnosis directly on multivariate time series data. It builds multi-scale signature matrices that capture system status at different levels across time steps, uses a convolutional encoder to capture inter-sensor correlations, and uses a Convolutional Long Short-Term Memory (ConvLSTM) network with attention mechanisms to capture temporal patterns.
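As a rough illustration of the signature-matrix idea mentioned in the abstract (a sketch, not the speaker's code): for each window length w, the entry (i, j) of a signature matrix summarizes the pairwise correlation between sensors i and j over the last w time steps, and stacking matrices for several window lengths gives the multi-scale input that MSCRED's convolutional encoder consumes. The window lengths below are illustrative, not values from the talk.

```python
import numpy as np

def signature_matrix(X, w):
    """Pairwise inner-product matrix over the last w time steps.

    X: array of shape (T, n) -- T time steps, n sensors.
    Returns an (n, n) matrix whose (i, j) entry summarizes the
    correlation between sensors i and j inside the window.
    """
    W = X[-w:]              # last w observations, shape (w, n)
    return W.T @ W / w      # (n, n) signature matrix

def multi_scale_signatures(X, windows=(10, 30, 60)):
    # One signature matrix per window length -> multi-scale "channels"
    # that a convolutional encoder can process like an image.
    return np.stack([signature_matrix(X, w) for w in windows])

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))        # 200 steps, 5 sensors
S = multi_scale_signatures(X)
print(S.shape)                        # (3, 5, 5): 3 scales, 5x5 matrices
```

Each signature matrix is symmetric by construction, and an anomaly that changes the relationship between two sensors shows up as a shift in the corresponding matrix entries across consecutive windows.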
Abstract: Language models such as RoBERTa, CodeBERT, and GraphCodeBERT have received much attention in the past three years for various Software Engineering tasks. Though these models achieve state-of-the-art performance on many SE tasks, such as code summarization, they often need to be fully fine-tuned for the downstream task. Is there a better way to fine-tune these models that requires training fewer parameters? Can we impose new information on current models without pre-training them again? How do these models perform for different programming languages, especially low-resource ones with less training data available? How can we use the knowledge learned from other programming languages to improve performance on low-resource languages? This talk will review a series of experiments and our contributions toward answering these questions.