<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:atom="http://www.w3.org/2005/Atom" xmlns:dc="http://purl.org/dc/elements/1.1/" version="2.0" xml:lang="ja">
	<channel>
		<title>HASCA2026</title>
		<link>http://hasca2026.hasc.jp/</link>
		<atom:link href="http://hasca2026.hasc.jp/rss2.xml" rel="self" type="application/rss+xml" />
		<description></description>
		<language>ja</language>
		<copyright>Copyright (C) 2026 HASCA2026 All rights reserved.</copyright>
		<lastBuildDate>Mon, 11 May 2026 19:10:44 +0900</lastBuildDate>
		<generator>a-blog cms</generator>
		<docs>http://blogs.law.harvard.edu/tech/rss</docs>
		<item>
			<dc:creator>kawaguti</dc:creator>
			<title>Welcome to HASCA2026</title>
			<link>http://hasca2026.hasc.jp/index.html</link>
			<description><![CDATA[
			<div class="newsTextBox">
			
				
				
				<h2 id="h754">Welcome to HASCA2026 Web site!</h2>
				

				
			
				
				
				<p>HASCA2026 is the 14th International Workshop on Human Activity Sensing Corpus and Applications. The workshop will be held in conjunction with <a href="https://www.ubicomp.org/ubicomp-iswc-2026/" target="_blank">UbiComp/ISWC2026</a>.</p>

<p><strong>Important Dates</strong><br>
Submission Deadline: June 15 *tentative<br>
Acceptance Notification: July 16 *tentative<br>
Camera-ready: July 31 *tentative<br>
Workshop: Oct 11 or 12 *tentative<br></p>

<p>For SHL Challenge and WEAR Challenge, please check each challenge's conditions as dates may differ.</p>

				

				
			
				
				
				<h2 id="h756">Challenges</h2>
				

				
			
				
				
				<p >The following challenges will be held in conjunction with HASCA 2026.<br />
Please refer to each challenge website for details, including rules and deadlines.<br />
<br />
Sussex-Huawei Locomotion (SHL) Challenge<br />
<a href="http://www.shl-dataset.org/challenge-2026/" target="_blank">http://www.shl-dataset.org/challenge-2026/</a><br />
<br />
WEAR Dataset Challenge<br />
<a href="https://mariusbock.github.io/wear/challenge.html" target="_blank">https://mariusbock.github.io/wear/challenge.html</a></p>
				

				
			
				
				
				<h2 id="h758">Abstract</h2>
				

				
			
				
				
				<p>The recognition of complex and subtle human behaviors from wearable sensors will enable next-generation human-oriented computing in scenarios of high societal value (e.g., dementia care). This will require large-scale human activity corpora and improved methods to recognize activities and the context in which they occur. This workshop deals with the challenges of designing reproducible experimental setups, running large-scale dataset collection campaigns, designing activity and context recognition methods that are robust and adaptive, and evaluating systems in the real world. We wish to reflect on future methods, such as lifelong learning approaches that allow open-ended activity recognition.</p>

<p>The objective of this workshop is to share experiences among current researchers around the challenges of real-world activity recognition, the role of datasets and tools, and breakthrough approaches towards open-ended contextual intelligence. We expect relevant contributions to this workshop from the following domains, among others:</p>

				

				
			
				
				
				<h2 id="h760">Data collection / Corpus construction</h2>
				

				
			
				
				
				<p>Experiences or reports from data collection and/or corpus construction projects, such as papers describing the formats, styles, or methodologies for data collection. Crowd-sourced data collection and participatory sensing could also be included in this topic.</p>

				

				
			
				
				
				<h2 id="h762">Effectiveness of Data / Data Centric Research</h2>
				

				
			
				
				
				<p>There is a field of research based on collected corpora, which is called “Data Centric Research”. We also solicit reports on experiences of using large-scale human activity sensing corpora. By applying machine learning to large-scale corpora, there will be a large space for improving the performance of recognition results.</p>

				

				
			
				
				
				<h2 id="h764">Tools and Algorithms for Activity Recognition</h2>
				

				
			
				
				
				<p>If we have appropriate and suitable tools for the management of sensor data, activity recognition researchers can focus more on their research themes. However, tools and algorithms developed for sharing among the research community are not much appreciated. In this workshop, we solicit development reports of tools and algorithms for advancing the community.</p>

				

				
			
				
				
				<h2 id="h766">Real World Application and Experiences</h2>
				

				
			
				
				
				<p>Activity recognition "in the lab" usually works well. However, this is often not the case in the real world. In this workshop, we also solicit experiences from real-world applications. There is a huge gap between the "lab environment" and the "real-world environment". A large-scale human activity sensing corpus will help to overcome this gap.</p>

				

				
			
				
				
				<h2 id="h768">Sensing Devices and Systems</h2>
				

				
			
				
				
				<p>Data collection is not performed only with "off-the-shelf" sensors. There is a need to develop special devices to obtain certain kinds of information. There is also a research area concerning the development and evaluation of systems and technologies for data collection.</p>

				

				
			
				
				
				<h2 id="h770">Mobile experience sampling, experience sampling strategies</h2>
				

				
			
				
				
				<p >Advances in experience sampling approaches, for instance intelligently querying the user or using novel devices (e.g. smartwatches), are likely to play an important role in providing user-contributed annotations of their own activities.</p>
				

				
			
				
				
				<h2 id="h772">Unsupervised pattern discovery</h2>
				

				
			
				
				
				<p >Discovering meaningful repeating patterns in sensor data can be fundamental in informing other elements of a system generating an activity corpus, such as inquiring the user or triggering annotation crowd-sourcing.</p>
				

				
			
				
				
				<h2 id="h774">Dataset acquisition and annotation through crowd-sourcing, web-mining</h2>
				

				
			
				
				
				<p >A wide abundance of sensor data is potentially within reach, with users instrumented with their mobile phones and other wearables. Capitalizing on crowd-sourcing to create larger datasets in a cost-effective manner may be critical to open-ended activity recognition. Online datasets could also be used to bootstrap recognition models.</p>
				

				
			
				
				
				<h2 id="h776">Transfer learning, semi-supervised learning, lifelong learning</h2>
				

				
			
				
				
				<p >The ability to translate recognition models across modalities or to use minimal supervision would allow datasets to be reused across domains and reduce the costs of acquiring annotations.</p>
				

				

				<br class="clearHidden" />
			</div>
			]]></description>
			<guid isPermaLink="true">http://hasca2026.hasc.jp/index.html</guid>
			<pubDate>Mon, 11 May 2026 17:26:53 +0900</pubDate>
		</item>
		<item>
			<dc:creator>kawaguti</dc:creator>
			<title>Call for Contributions</title>
			<link>http://hasca2026.hasc.jp/cfp/index.html</link>
			<description><![CDATA[
			<div class="newsTextBox">
			
				
				
				<p >We are pleased to announce that the HASCA (Human Activity Sensing Corpus and Applications) Workshop will take place as part of <a href="https://www.ubicomp.org/ubicomp-iswc-2026/" target="_blank">UbiComp 2026</a>.<br />
HASCA is one of the largest workshops at UbiComp and has been held for over 14 years.</p>
				

				
			
				
				
				<h2 id="h731">Dates</h2>
				

				
			
				
				
				<p >Submission Deadline: June 15 *tentative<br />
Acceptance Notification: July 16 *tentative<br />
Camera-ready: July 31 *tentative<br />
Workshop: Oct 11 or 12 *tentative<br />
<br />
For SHL Challenge and WEAR challenge, please check each challenge's conditions as dates may differ.</p>
				

				
			
				
				
				<h2 id="h733">SUMMARY</h2>
				

				
			
				
				
				<p >The objective of this workshop is to share experiences among<br />
researchers about the current challenges of real-world activity<br />
recognition with newly developed datasets and tools, breaking<br />
through towards open-ended contextual intelligence.<br />
<br />
This workshop discusses the challenges of designing reproducible<br />
experimental setups, the large-scale dataset collection campaigns, the<br />
activity and context recognition methods that are robust and adaptive,<br />
and evaluating systems in the real world.<br />
<br />
As a special topic of this year we will reflect on the challenges to<br />
recognize situations, events and/or activities among the statically<br />
predefined pools and beyond - which is the current state of the art -<br />
and instead we will adopt an "open-ended view" on activity and context<br />
awareness. This may result in combinations of the automatic discovery<br />
of relevant patterns in sensor data, the experience sampling and<br />
wearable technologies to unobtrusively discover the semantic meaning<br />
of such patterns, the crowd-sourcing of dataset acquisition and<br />
annotation, and new "open-ended" human activity modeling techniques.</p>
				

				
			
				
				
				<h2 id="h735">CALL FOR CONTRIBUTIONS</h2>
				

				
			
				
				
				<p ><strong>- *Data collection*, *Corpus construction*.</strong><br />
Experiences or reports from data collection and/or corpus construction<br />
projects, including papers which describe the formats, styles and/or<br />
methodologies for data collection. Crowd-sourced data collection and<br />
participatory sensing could also be included in this topic.<br />
<br />
<strong>- *Effectiveness of Data*, *Data Centric Research*.</strong><br />
There is a field of research based on collected corpora, which is<br />
so-called "data centric research". We also call for experiences of<br />
using large-scale human activity sensing corpora. Using large-scale<br />
corpora with machine learning analysis, there will be a large<br />
space for improving the performance of recognition results.<br />
<br />
<strong>- *Tools and Algorithms for Activity Recognition*.</strong><br />
If we have appropriate tools for the management of sensor data,<br />
activity recognition researchers can focus more on their actual<br />
research themes. However, the developed tools and algorithms are often<br />
not shared among the research community. In this workshop, we solicit<br />
reports on developed tools and algorithms for advancing the<br />
community.<br />
<br />
<strong>- *Real World Application and Experiences*.</strong><br />
Activity recognition "in the lab" usually works well. However, it does<br />
not scale well with real world data. In this workshop, we also solicit<br />
experiences from real-world applications. There is a huge gap<br />
between "lab" and "real world" environments. Large-scale human<br />
activity sensing corpora will help to overcome this gap.<br />
<br />
<strong>- *Sensing Devices and Systems*.</strong><br />
Data collection is not only performed by the "off-the-shelf" sensors<br />
but also the newly developed sensors which supply information which<br />
has not been investigated. There is also a research area about the<br />
development of new platforms for data collection or evaluation<br />
tools for collected data.<br />
<br />
In light of this year's special emphasis on open-ended contextual<br />
awareness, we wish to cover these topics as well:<br />
<br />
<strong>- *Mobile Experience Sampling*, *Experience Sampling Strategies*.</strong><br />
Advances in experience sampling approaches, for instance intelligent<br />
user querying or the use of novel devices (e.g. smartwatches), are<br />
likely to play an important role in providing user-contributed<br />
annotations of their own activities.<br />
<br />
<strong>- *Unsupervised Pattern Discovery*.</strong><br />
Discovering meaningful patterns in sensor data in an unsupervised<br />
manner can be needed to inform other elements of the system, for<br />
instance by inquiring the user or by triggering annotation via<br />
crowd-sourcing.<br />
<br />
<strong>- *Dataset Acquisition and Annotation*, *Crowd-Sourcing*, *Web-Mining*.</strong><br />
A wide abundance of sensor data is potentially within the reach of<br />
users instrumented with their mobile phones and other<br />
wearables. Capitalizing on crowd-sourcing to create larger datasets in<br />
a cost-effective manner may be critical to open-ended activity<br />
recognition. Many online datasets are also available and could be used<br />
to bootstrap recognition models.<br />
<br />
<strong>- *Transfer Learning*, *Semi-Supervised Learning*, *Lifelong Learning*.</strong><br />
The ability to translate recognition models across modalities or to<br />
use minimal forms of supervision would allow datasets to be reused in<br />
a wider range of domains and reduce the costs of acquiring annotations.<br />
<br />
<strong>- *Deep Learning*.</strong><br />
Together with the big success of deep learning in other AI domains, deep<br />
learning models are gradually playing an important role in activity<br />
recognition as well.</p>
				

				
			
				
				
				<h2 id="h737">AREAS OF INTEREST</h2>
				

				
			
				
				
				<ul >
<li>Human Activity Sensing Corpus</li>
<li>Large Scale Data Collection</li>
<li>Data Validation</li>
<li>Data Tagging / Labeling</li>
<li>Efficient Data Collection</li>
<li>Data Mining from Corpus</li>
<li>Automatic Segmentation</li>
<li>Performance Evaluation</li>
<li>Man-machine Interaction</li>
<li>Noise Robustness</li>
<li>Unsupervised Machine Learning</li>
<li>Sensor Data Fusion</li>
<li>Tools for Human Activity Corpus/Sensing</li>
<li>Participatory Sensing</li>
<li>Feature Extraction and Selection</li>
<li>Context Awareness</li>
<li>Pedestrian Navigation</li>
<li>Social Activities Analysis/Detection</li>
<li>Compressive Sensing</li>
<li>Sensing Devices</li>
<li>Lifelog Systems</li>
<li>Route Recognition/Detection</li>
<li>Wearable Application</li>
<li>Gait Analysis</li>
<li>Health-care Monitoring/Recommendation</li>
<li>Daily-life Worker Support</li>
<li>Deep Learning</li>
</ul>
				

				
			
				
				
				<h2 id="h739">FORMAT & TEMPLATE</h2>
				

				
			
				
				
				<p ><b>The paper must be at most 6 pages in the 2-column format. References do not count toward the page limit, but all text and figures/tables must be within the first 6 pages.</b> Due to capacity reasons, some papers may be accepted as poster presentations during the workshop (not UbiComp/ISWC poster sessions) instead of oral presentations. We also plan to open submissions to papers rejected from the ISWC Note/Brief track.<br />
<br />
ACM requires UbiComp/ISWC 2026 workshop submissions to use the double-column template. Please note that the template for submission is in double-column format, while the template for publication (camera-ready) is in single-column format.<br />
Please carefully read <a href="https://www.ubicomp.org/ubicomp-iswc-2026/formatting/" target="_blank">Ubicomp website about the template</a>.<br />
<br />
<b>Submissions do not need to be anonymous</b>.<br />
All submissions will be peer reviewed, taking into account their contribution to the topic of the workshop.<br />
The accepted papers will be published in the UbiComp/ISWC 2026 adjunct proceedings, which will be included in the ACM Digital Library.<br />
</p>
				

				
			
				
				
				<h2 id="h741">SUBMISSION</h2>
				

				
			
				
				
				<p >Please submit your papers via <a href="https://new.precisionconference.com/submissions" target="_blank" rel="noopener noreferrer">https://new.precisionconference.com/submissions</a><br />
Make a new submission as follows:</p>
				

				
			
				
				
				<ol >
<li>Society: SIGCHI</li>
<li>Conference/Journal: UbiComp/ISWC 2026</li>
<li>Track: UbiComp/ISWC 2026 14th Workshop on HASCA</li>
<li>Press the "Go" button</li>
</ol>
				

				
			
				
				
				<h2 id="h747">SPECIAL SESSION</h2>
				

				
			
				
				
				<p >This year, the following challenges will be held with HASCA.<br />
<br />
Sussex-Huawei Locomotion (SHL) Challenge<br />
<a href="http://www.shl-dataset.org/challenges/" target="_blank">http://www.shl-dataset.org/challenges/</a><br />
<br />
WEAR Dataset Challenge<br />
<a href="https://mariusbock.github.io/wear/challenge.html" target="_blank">https://mariusbock.github.io/wear/challenge.html</a></p>
				

				
			
				
				
				<h2 id="h749">CONTACT<br />
hasca-organizer[at]ml.hasc.jp</h2>
				

				

				<br class="clearHidden" />
			</div>
			]]></description>
			<category>cfp</category>
			<guid isPermaLink="true">http://hasca2026.hasc.jp/cfp/index.html</guid>
			<pubDate>Mon, 11 May 2026 17:26:53 +0900</pubDate>
		</item>
		<item>
			<dc:creator>kawaguti</dc:creator>
			<title>Organizers &amp; Committee</title>
			<link>http://hasca2026.hasc.jp/pc/index.html</link>
			<description><![CDATA[
			<div class="newsTextBox">
			
				
				
				<h2 id="h724">ORGANIZERS</h2>
				

				
			
				
				
				<ul >
<li>Kazuya MURAO (Ritsumeikan University, Japan)</li>
<li>Yu ENOKIBORI (Nagoya University, Japan)</li>
<li>Hristijan GJORESKI (Ss. Cyril and Methodius University, N. Macedonia)</li>
<li>Paula LAGO (Concordia University, Canada)</li>
<li>Tsuyoshi OKITA (Kyushu Institute of Technology, Japan)</li>
<li>Pekka SIIRTOLA (University of Oulu, Finland)</li>
<li>Kei HIROI (Kyoto University, Japan)</li>
<li>Philipp M. SCHOLL (University of Freiburg, Germany)</li>
<li>Mathias CILIBERTO (University of Sussex, UK)</li>
<li>Kenta URANO (Nagoya University, Japan)</li>
<li>Marius Bock (University of Siegen, Germany)</li>
</ul>
				

				
			
				
				
				<h2 id="h726">ADVISORY BOARD</h2>
				

				
			
				
				
				<ul >
<li>Nobuo Kawaguchi (Nagoya University, Japan)</li>
<li>Nobuhiko Nishio (Ritsumeikan University, Japan)</li>
<li>Daniel Roggen (University of Sussex, UK)</li>
<li>Sozo Inoue (Kyushu Institute of Technology, Japan)</li>
<li>Susanna Pirttikangas (University of Oulu, Finland)</li>
<li>Kristof van Laerhoven (University of Freiburg, Germany)</li>
</ul>
				

				

				<br class="clearHidden" />
			</div>
			]]></description>
			<category>pc</category>
			<guid isPermaLink="true">http://hasca2026.hasc.jp/pc/index.html</guid>
			<pubDate>Mon, 11 May 2026 17:26:53 +0900</pubDate>
		</item>
	</channel>
</rss>