We are proud to announce that four members of MIND LAB have triumphed in the prestigious IEEE CIS FLAME Technical Challenge 2024. The team, comprising Dr. WU Xingyu, Mr. ZHOU Yu, Mr. CHEN Xianwei, and Dr. WU Jibin, and led by Prof. TAN Kay Chen, demonstrated exceptional innovation and expertise to secure the First Prize, which carries an award of US$15,000 (approx. HK$117,000).
The IEEE CIS FLAME Technical Challenge 2024 encourages innovative combinations of large language models (LLMs) with computational intelligence techniques, aiming to explore novel applications across domains. The competition invited teams from diverse organizations to showcase their ideas, with innovation emphasized as the primary criterion; submissions were rigorously evaluated on computational efficiency, innovativeness, ethical considerations, and practical relevance.
The team’s winning entry featured a novel approach to resource-efficient LLM development. Leveraging a multi-objective evolutionary algorithm, they proposed an architecture-level merging technique for creating cross-lingual LLMs. Unlike traditional methods that require vast training data and computational power, their approach merges existing model architectures to produce new LLMs optimized for multilingual tasks. The evaluation committee was particularly impressed by the team’s innovative methodology, which tackled complex challenges such as search-space explosion, layer alignment, and fairness among parent models. Their framework formalizes model merging as a bi-level multi-objective optimization problem spanning both parameter and architecture spaces, balancing performance against resource constraints.
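To make the bi-level structure concrete, below is a minimal, self-contained sketch of such a search: an outer evolutionary loop explores the architecture space (which parent model supplies each layer) under Pareto selection over two objectives, while an inner optimizer tunes per-layer merge weights in the parameter space. This is an illustrative toy, not the team's implementation; the layer count, the hill-climbing inner step, and the `evaluate` objectives are placeholder assumptions introduced for this example.

```python
# Toy sketch of bi-level multi-objective evolutionary model merging.
# All objectives and search-space sizes are hypothetical placeholders.
import random

N_LAYERS = 12    # layers per parent model (assumed for illustration)
N_PARENTS = 2    # number of parent LLMs being merged (assumed)
POP_SIZE = 16
GENERATIONS = 25

def evaluate(arch, weights):
    """Placeholder objectives: a real system would benchmark the merged
    LLM on multilingual tasks and measure its compute/memory footprint."""
    performance = sum(w if a == 0 else 1.0 - w
                      for a, w in zip(arch, weights)) / N_LAYERS
    cost = sum(weights) / N_LAYERS   # stand-in for resource cost
    return performance, cost         # maximize performance, minimize cost

def inner_optimize(arch, steps=15):
    """Lower level: tune per-layer merge weights for a fixed architecture
    with a simple hill climber (placeholder for a real optimizer)."""
    weights = [0.5] * N_LAYERS
    best_perf, _ = evaluate(arch, weights)
    for _ in range(steps):
        cand = [min(1.0, max(0.0, w + random.gauss(0.0, 0.1)))
                for w in weights]
        perf, _ = evaluate(arch, cand)
        if perf > best_perf:
            weights, best_perf = cand, perf
    return weights

def dominates(f, g):
    """Pareto dominance for (performance, cost) pairs."""
    return f[0] >= g[0] and f[1] <= g[1] and f != g

def mutate(arch):
    """Upper-level move: reassign one layer to a different parent."""
    child = arch[:]
    child[random.randrange(N_LAYERS)] = random.randrange(N_PARENTS)
    return child

# Upper level: evolve architecture genomes (which parent supplies each layer).
population = [[random.randrange(N_PARENTS) for _ in range(N_LAYERS)]
              for _ in range(POP_SIZE)]
for _ in range(GENERATIONS):
    combined = population + [mutate(a) for a in population]
    scored = [(a, evaluate(a, inner_optimize(a))) for a in combined]
    # Environmental selection: keep non-dominated candidates, refill randomly.
    front = [a for a, f in scored
             if not any(dominates(g, f) for _, g in scored)]
    while len(front) < POP_SIZE:
        front.append([random.randrange(N_PARENTS) for _ in range(N_LAYERS)])
    population = front[:POP_SIZE]

print("Example non-dominated architecture genome:", population[0])
```

In a real pipeline, `evaluate` would score a merged LLM on multilingual benchmarks and measure its actual resource footprint; the Pareto-based selection matters because it preserves the performance-versus-cost trade-off as a set of candidate models rather than collapsing it into a single score.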
By lowering the steep computational barriers to LLM development, the team’s work empowers researchers with limited resources to innovate in the field. Their achievement underscores the potential of combining computational intelligence with LLMs and marks a significant milestone in LLM research and development.