[hpc-announce] FLLM2024 CFP (Hybrid Event and Co-Sponsored by IEEE): The 2nd International Conference on Foundation and Large Language Models, 26-29 November, 2024 | Dubai, UAE
Gizem Varkonyi
emergingtechnetwork.publicity at gmail.com
Tue Jun 4 05:45:17 CDT 2024
[Apologies if you got multiple copies of this invitation]
The 2nd International Conference on Foundation and Large Language Models
(FLLM2024)
Hybrid Event
https://fllm2024.fllm-conference.org/index.php
26-29 November, 2024 | Dubai, UAE
Technically Co-Sponsored by IEEE UAE Section
*FLLM 2024 CFP:*
With the emergence of foundation models (FMs) and large language models (LLMs) that are trained on massive amounts of data and adaptable to a wide range of downstream applications, artificial intelligence is undergoing a paradigm shift. BERT, T5, ChatGPT, GPT-4, Falcon 180B, Codex, DALL-E, Whisper, and CLIP now serve as the foundation for new applications ranging from computer vision to protein sequence analysis and from speech recognition to coding, whereas earlier models typically had to be trained from scratch for each new task. The capacity to experiment with, examine, and understand the capabilities and potential of next-generation FMs is critical to undertaking this research and guiding its path. Nevertheless, these models remain largely inaccessible: the resources required to train them are highly concentrated in industry, and even the assets (data, code) required to replicate their training are frequently withheld because of their commercial value. At the moment, mostly large technology companies such as OpenAI, Google, Facebook, and Baidu can afford to build FMs and LLMs. Despite the widely publicized use of FMs and LLMs, we still lack a comprehensive understanding of how they operate, why they underperform, and what they are ultimately capable of, owing to their emergent qualities. To address these problems, we believe that much of the critical research on FMs and LLMs will require extensive multidisciplinary collaboration, given their fundamentally sociotechnical nature.
The International Conference on Foundation and Large Language Models (FLLM) addresses the architectures, applications, challenges, approaches, and future directions of FMs and LLMs. We invite the submission of original papers on all topics related to FLLMs, with special interest in, but not limited to:
- *Architectures and Systems*
- Transformers and Attention
- Bidirectional Encoding
- Autoregressive Models
- Massive GPU Systems
- Prompt Engineering
- Multimodal LLMs
- Fine-tuning
- *Challenges*
- Hallucination
- Cost of Creation and Training
- Energy and Sustainability Issues
- Integration
- Safety and Trustworthiness
- Interpretability
- Fairness
- Social Impact
- *Future Directions*
- Generative AI
- Explainability and Explainable AI
- Retrieval Augmented Generation (RAG)
- Federated Learning for FLLM
- Large Language Models Fine-Tuning on Graphs
- Data Augmentation
- *Natural Language Processing Applications*
- Generation
- Summarization
- Rewrite
- Search
- Question Answering
- Language Comprehension and Complex Reasoning
- Clustering and Classification
- *Applications*
- Natural Language Processing
- Communication Systems
- Security and Privacy
- Image Processing and Computer Vision
- Life Sciences
- Financial Systems
*Submission Guidelines and Proceedings*
Manuscripts should be prepared in 10-point font using the IEEE 8.5" x 11" two-column format. All papers should be in PDF format and submitted electronically via the Paper Submission Link. A full paper may be up to 8 pages, including all figures, tables, and references, and must follow the IEEE paper format. Submitted papers must present original, unpublished research that is not under consideration for any other conference or journal. Papers not following these guidelines may be rejected without review, and submissions that are received after the deadline, exceed the length limit, or are not appropriately structured may likewise not be considered. Authors may contact the Program Chair for further information or clarification. All submissions are peer-reviewed by at least three reviewers. Accepted papers will appear in the FLLM proceedings, be published by the IEEE Computer Society Conference Publishing Services, and be submitted to IEEE Xplore for inclusion. Please include up to 7 keywords, the complete postal and email address, and the fax and phone numbers of the corresponding author. Authors of accepted papers are expected to present their work at the conference. Submitted papers that are deemed of good quality but cannot be accepted as regular papers may be accepted as short papers.
*Important Dates:*
- *Paper submission deadline: June 30, 2024*
- Notification of acceptance: September 15, 2024
- Camera-ready Submission: October 10, 2024
*Contact:*
Please send any inquiries about FLLM to: <emergingtechnetwork at gmail.com>
info at fllm-conference.org