This work presents enhancements to the stability and applicability of Cyclic DARTS (CDARTS), a Differentiable Architecture Search (DARTS)-based method for neural architecture search (NAS). CDARTS employs a cyclic feedback mechanism to train the search and evaluation networks concurrently, with the objective of encouraging the two networks to produce similar outputs. However, CDARTS suffers from a discrepancy between the loss functions used by the evaluation network during the search and retraining phases, so the evaluation network optimized during search is sub-optimal relative to the final evaluation network used during retraining. To address this issue, we propose ICDARTS, a revised approach that eliminates the dependence of the evaluation network weights on those of the search network. Additionally, we introduce a modified process for discretizing the search network's "zero" operations, enabling their retention in the final evaluation networks. We evaluate the effectiveness of these changes through ablation studies on ICDARTS' algorithm and network template. Furthermore, we explore techniques for expanding the search space of ICDARTS by enlarging its operation set and investigating alternative methods for discretizing its continuous search cells. These experiments yield networks with improved generalizability and introduce a novel approach for incorporating a dynamic search space into ICDARTS.
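The cyclic feedback idea can be illustrated with a minimal sketch. The code below is purely illustrative and is not the authors' implementation: it uses two toy one-parameter "networks" in place of the search and evaluation networks, and a quadratic feedback penalty in place of the paper's distillation objective. The names `task_loss`, `feedback_loss`, and `cyclic_step` are hypothetical.

```python
# Toy sketch of CDARTS-style cyclic training (illustrative only).
# Two scalar linear "networks" fit the same target while a feedback
# term penalizes disagreement between their outputs, so the joint
# optimization pulls them toward each other as well as toward the task.

def task_loss(pred, target):
    # Standard supervised objective for each network.
    return (pred - target) ** 2

def feedback_loss(search_out, eval_out):
    # Cyclic feedback: penalize disagreement between the two networks.
    return (search_out - eval_out) ** 2

def cyclic_step(w_search, w_eval, x, y, lr=0.01, lam=0.5):
    """One joint update of both networks on input x with target y."""
    s_out, e_out = w_search * x, w_eval * x
    # Analytic gradients of task_loss + lam * feedback_loss
    # with respect to each scalar weight.
    g_search = 2 * (s_out - y) * x + lam * 2 * (s_out - e_out) * x
    g_eval   = 2 * (e_out - y) * x + lam * 2 * (e_out - s_out) * x
    return w_search - lr * g_search, w_eval - lr * g_eval

# Start the two networks far apart; alternate joint updates.
w_s, w_e = 0.0, 2.0
for _ in range(500):
    w_s, w_e = cyclic_step(w_s, w_e, x=1.0, y=1.0)
# Both weights converge to the target (1.0) and to each other.
```

In the actual method, each "network" is a full architecture (continuous supernet vs. discretized evaluation network) and the feedback term is a distillation loss over logits, but the coupling structure is the same: each network's update depends on the other's current output.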