Works

Here are some works of mine 📚

Publications & Manuscripts

Data Continuity Matters: Improving Sequence Modeling with Lipschitz Regularizer

International Conference on Learning Representations (ICLR), Spotlight (2023)

Eric Qu, Xufang Luo, Dongsheng Li

We showed empirically and proved theoretically that different sequence models make different assumptions about the continuity of the input sequence. To exploit this property, we designed a regularizer that alters the continuity of the input sequence and demonstrated its effectiveness on various sequence models.

[Link] [Code]
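
A minimal sketch of one way such a continuity regularizer could be implemented, assuming it penalizes step-to-step changes in the embedded input sequence; the helper name and the exact penalty are illustrative assumptions, not the paper's definition:

```python
import torch

def lipschitz_regularizer(embedded_seq: torch.Tensor, weight: float = 0.1) -> torch.Tensor:
    """Discrete Lipschitz-style penalty on a sequence of embeddings.

    embedded_seq: (batch, time, dim) tensor, e.g. the embeddings fed to a
    sequence model. Penalizing adjacent-step differences encourages the
    sequence seen by the model to be smoother (more continuous).
    """
    diffs = embedded_seq[:, 1:, :] - embedded_seq[:, :-1, :]  # adjacent-step differences
    return weight * diffs.norm(dim=-1).mean()

# Hypothetical use inside a training step:
# loss = task_loss + lipschitz_regularizer(embeddings, weight=0.1)
```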

CNN Kernels Can Be the Best Shapelets

Preprint (2023)

Eric Qu, Yansen Wang, Xufang Luo, Wenqiang He, Dongsheng Li

[Preprint] [Code]

Hyperbolic Convolution via Kernel Point Aggregation

arXiv:2306.08862 (2023)

Eric Qu, Dongmian Zou

We construct a novel hyperbolic convolution operation (HKConv), which first correlates trainable local hyperbolic features with fixed hyperbolic kernel points, then aggregates the output features within a local neighborhood. HKConv is equivariant to permutations of the input and invariant to parallel transport of a local neighborhood.

[Preprint] [Code] [Poster]
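
For intuition, here is a Euclidean KPConv-style analogue of the kernel-point aggregation pattern. It is only a sketch and omits the hyperbolic machinery (Lorentzian distances, features on the manifold, parallel transport) that HKConv actually uses:

```python
import torch

def kernel_point_aggregation(center, neighbors, feats, kernel_points, weights):
    """Euclidean sketch of kernel-point aggregation (KPConv-style).

    center:        (d,)             position of the center point
    neighbors:     (n, d)           positions of neighboring points
    feats:         (n, c_in)        features of the neighbors
    kernel_points: (k, d)           fixed kernel point positions relative to the center
    weights:       (k, c_in, c_out) trainable weights, one matrix per kernel point

    Each neighbor's feature is weighted by its proximity to every kernel
    point, then the weighted features are summed into one output feature.
    """
    rel = neighbors - center                      # (n, d) relative positions
    dist = torch.cdist(rel, kernel_points)        # (n, k) neighbor-to-kernel distances
    influence = torch.clamp(1.0 - dist, min=0.0)  # linear influence, zero beyond radius 1
    out = torch.einsum('nk,nc,kco->o', influence, feats, weights)
    return out                                    # (c_out,) aggregated feature
```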

Autoencoding Hyperbolic Representation for Adversarial Generation

arXiv:2201.12825 (2022)

Eric Qu, Dongmian Zou

We propose HAEGAN, a generative model in hyperbolic space that is capable of generating complex data. We introduce several operations and layers specifically designed to ensure numerical stability. HAEGAN outperforms the state of the art on structure-related metrics in the molecular generation experiment.

[Preprint] [Code] [Slides]

Lorentz Direct Concatenation for Stable Training in Hyperbolic Neural Networks

NeurIPS NeurReps Workshop, Poster (2022)

Eric Qu, Dongmian Zou

We discuss the Lorentz direct concatenation, an operation proposed in the HAEGAN paper. Compared with concatenating in the tangent space, our method is numerically more stable and better preserves hyperbolic distances.

[Link] [Poster]
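
A small sketch of the idea, assuming the operation concatenates the space-like components and recomputes the time-like component so the result stays on the hyperboloid; see the paper for the exact definition and curvature convention:

```python
import torch

def lorentz_direct_concat(points, curvature: float = 1.0):
    """Sketch of direct concatenation in the Lorentz model.

    points: list of 1-D tensors (x_t, x_s) lying on the hyperboloid
            -x_t^2 + ||x_s||^2 = -1 / curvature.
    The space-like parts are concatenated and the time-like component is
    recomputed so the result satisfies the same constraint, avoiding the
    round trip through the tangent space.
    """
    space = torch.cat([p[1:] for p in points])               # concatenated space-like parts
    time = torch.sqrt(1.0 / curvature + space.pow(2).sum())  # recomputed time-like component
    return torch.cat([time.unsqueeze(0), space])
```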

Quantifying Nanoparticle Assembly States in a Polymer Matrix through Deep Learning

Macromolecules, 54 (7), 3034–3040 (2021)

Eric Qu, Andrew Matthew Jimenez, Sanat K. Kumar, Kai Zhang

We develop and apply a deep-learning-based image analysis method to quantify the distribution of spherical nanoparticles (NPs) in a polymer matrix directly from their real-space TEM images.

[Link] [Code] [Dataset] [News]

In-situ AFM tracking of Nanoparticle Diffusion in Semicrystalline Polymers

ACS Macro Lett., 11 (6), 818–824 (2022)

Kamlesh Bornani, Nico Mendez, Abdullah S. Altorbaq, Alejandro Muller, Max Yueqian Lin, Eric Qu, Kai Zhang, Sanat K. Kumar, Linda S. Schadler

We design a model for detecting and tracking the drift of nanoparticles in TEM videos.

[Link]

Projects (* indicates equal contribution)

Solving Sticky Hard Sphere Packing Problem through Deep Learning

Eric Qu, Kai Zhang, Dongmian Zou

Sticky hard sphere packing is a challenging problem in physics. In our method, we first map the packing state to a graph and use a modified Graph Isomorphism Network (GIN) to identify valid packings with high accuracy. Then, a Monte Carlo Tree Search is trained to generate new packings, with the reward based on the GIN.
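
A minimal sketch of the graph construction and a single GIN update, under the assumption that spheres in "sticky" contact (center distance close to one diameter) are connected by an edge; the project's actual graph features and network are likely richer:

```python
import numpy as np

def packing_to_graph(positions: np.ndarray, diameter: float = 1.0, tol: float = 1e-3):
    """Map a sticky-hard-sphere packing to a contact graph.

    positions: (n, 3) array of sphere centers. Two spheres are connected
    when their center distance is within `tol` of one diameter.
    Returns the (n, n) adjacency matrix.
    """
    dist = np.linalg.norm(positions[:, None, :] - positions[None, :, :], axis=-1)
    contact = np.abs(dist - diameter) < tol
    np.fill_diagonal(contact, False)
    return contact.astype(float)

def gin_layer(adjacency: np.ndarray, features: np.ndarray, mlp, eps: float = 0.0):
    """One GIN update: h_v' = MLP((1 + eps) * h_v + sum of neighbor features)."""
    return mlp((1.0 + eps) * features + adjacency @ features)

# Hypothetical use: pool node features and classify whether the packing is valid.
# adjacency = packing_to_graph(centers)
# h = gin_layer(adjacency, node_feats, mlp=lambda x: np.maximum(x @ W, 0.0))
# validity_score = classifier(h.sum(axis=0))
```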

Finding Optimal Order Parameter for Monodisperse Systems

Eric Qu, Max Yueqian Lin, Kai Zhang

The order parameter of a particle system describes whether it is more crystal-like or glass-like. The packing state can be represented as a 3D point cloud. We proposed a novel Kernel Point Autoencoder using KPConv as the encoder and our Kernel Point Generator as the decoder; the bottleneck activation is then extracted as the order parameter.
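
A schematic version of the pipeline, with plain MLPs standing in for the KPConv encoder and the Kernel Point Generator decoder (both are placeholders); the point is that the bottleneck activation `z` is read out as the order parameter:

```python
import torch
import torch.nn as nn

class PointCloudAutoencoder(nn.Module):
    """Autoencoder over particle positions whose bottleneck is the order parameter.

    The encoder/decoder here are simple MLPs over flattened coordinates so the
    sketch stays runnable; the actual model uses KPConv and a Kernel Point
    Generator instead.
    """
    def __init__(self, n_points: int, bottleneck_dim: int = 1):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Linear(3 * n_points, 256), nn.ReLU(), nn.Linear(256, bottleneck_dim)
        )
        self.decoder = nn.Sequential(
            nn.Linear(bottleneck_dim, 256), nn.ReLU(), nn.Linear(256, 3 * n_points)
        )

    def forward(self, cloud: torch.Tensor):
        # cloud: (batch, n_points, 3) particle positions
        z = self.encoder(cloud.flatten(1))       # bottleneck activation
        recon = self.decoder(z).view_as(cloud)   # reconstructed point cloud
        return recon, z                          # z is used as the order parameter
```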

Class Notes for DKU MATH 408: Differential Geometry

Eric Qu

Well-organized class notes for differential geometry.

[Note]

Slides for Weekly Paper Reading Discussion

Eric Qu

Each week, we hold a paper reading discussion to present our own work or recent progress of interest. Here are some of my slides.

[HAEGAN] [Diffusion Models] [S4 Model]

Last Updated: 7/25/2023, 7:38:58 AM