(i) pruning unnecessary neural network connections at the beginning of model setup, rather than waiting until we already have concrete training results
1) Title: We have been training neural networks all wrong
Source: https://go.technologyreview.com/weve-been-training-neural-networks-all-wrong
2) Title: A new way to build tiny neural networks could create powerful AI on your phone
Source: https://www.technologyreview.com/s/613514/a-new-way-to-build-tiny-neural-networks-could-create-powerful-ai-on-your-phone
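The articles above report on work suggesting that a large network contains a much smaller subnetwork that can be trained on its own if the right connections are kept. As a rough illustration only (not the researchers' actual method), the hypothetical function below builds a binary mask that zeroes out a chosen fraction of connections by smallest magnitude, a common pruning criterion; training would then proceed with only the surviving weights:

```python
import numpy as np

def prune_by_magnitude(weights, sparsity):
    """Return a 0/1 mask that drops the `sparsity` fraction of weights
    with the smallest absolute value (a common pruning criterion)."""
    k = int(weights.size * sparsity)
    if k == 0:
        return np.ones_like(weights)
    # Threshold at the k-th smallest absolute value.
    threshold = np.sort(np.abs(weights).ravel())[k - 1]
    return (np.abs(weights) > threshold).astype(weights.dtype)

# Example: prune 80% of a randomly initialised 4x4 weight matrix.
rng = np.random.default_rng(0)
w = rng.normal(size=(4, 4))
mask = prune_by_magnitude(w, 0.8)
sparse_w = w * mask  # only the surviving 20% of connections remain
```

The function name, the magnitude criterion, and the 80% sparsity level are illustrative choices, not details taken from the referenced research.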
(ii) using calculus equations in place of discrete layers
Title: A radical new neural network design (with calculus equations) could overcome big challenges in AI
Source: https://www.technologyreview.com/s/612561/a-radical-new-neural-network-design-could-overcome-big-challenges-in-ai/
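The design described in that article, often called a neural ordinary differential equation (ODE), replaces a discrete stack of layers with a continuous dynamics function h'(t) = f(h(t), t) that an ODE solver integrates. A minimal sketch of the solver idea, using fixed-step Euler integration and a toy dynamics function standing in for a trained network (both are illustrative assumptions, not the paper's implementation):

```python
import numpy as np

def odeint_euler(f, h0, t0, t1, steps=100):
    """Fixed-step Euler integration of dh/dt = f(h, t) from t0 to t1."""
    h, t = np.asarray(h0, dtype=float), t0
    dt = (t1 - t0) / steps
    for _ in range(steps):
        h = h + dt * f(h, t)  # one Euler step, analogous to one residual layer
        t += dt
    return h

# Toy dynamics: dh/dt = -h, whose exact solution is h(1) = h(0) * e^(-1).
decay = lambda h, t: -h
h1 = odeint_euler(decay, h0=[1.0], t0=0.0, t1=1.0)
```

Each Euler step here has the same form as a residual layer, h_next = h + f(h); the continuous formulation lets the solver choose how many such steps to take.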
What I observe is that there is still no technological breakthrough that frees AI from its dependence on data.
In my opinion, all the hype surrounding AI in recent years is built on data-intensive AI technologies (e.g., deep learning, transfer learning, reinforcement learning, and even the generative adversarial networks behind deepfakes). None of these technologies is capable of generating useful insights without meaningful training data.
In conclusion, I believe that data-intensive AI will not go away in the foreseeable future, and investing in a quality data pipeline is a safe bet for any organisation that wishes to develop AI engineering capability.
The fundamental baseline of this AI investment strategy can be summed up with a wise quote from William Clement Stone: "Aim for the moon. If you miss, you may hit a star." Even if an organisation that has invested heavily in a quality AI data pipeline fails to deliver the originally envisioned AI-driven applications at the end of an intensive product development life cycle, it will still be able to realise the commercial opportunity of "data as an asset" from the rich big-data pipeline it has built up over the years.