FUNCoin - FUNC

FUNCoin FUNC price, market cap & charts

Live FUNCoin prices from all markets and FUNC coin market capitalization. Stay up to date with the latest FUNCoin price movements and discussion. Check out our snapshot charts and see when there is an opportunity to buy or sell FUNCoin.

Price: BTC 0.00000084 / USD 0.01

Market Cap: USD 36,554

Change % (1H): 0.00 %

Change % (24H): 0.00 %

FUNCoin (FUNC) Historical Price & Volume Charts

What is FUNCoin?

FUNCoin is a Bitshares-based asset intended for use in various gaming projects, their promotional activities, and further development. FUNC's total supply is 100,000,000 tokens.


Website: https://funcoin.io/


FUNCoin Key Financial Information

Mkt. Cap: USD 36,554
Volume 24H: USD 0
Mkt. Share: 0.00 %
Available Supply: 5,848,581
Max Supply: 100,000,000
Total Supply: 100,000,000
Change % (1H): 0.00 %
Change % (24H): 0.00 %
Change % (7D): 0.00 %
Proof Algorithm: (not specified)
Updated: 1 year ago
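The figures above follow the usual relation market cap = price x circulating (available) supply. A quick sanity check in Python, using the values from the table (the USD price shown on the page is rounded to two decimals, so the unrounded price is derived from the market cap instead):

```python
# Sketch: cross-checking the page's figures against
# market cap = price x available supply.
market_cap_usd = 36_554          # "Mkt. Cap" from the table above
available_supply = 5_848_581     # "Available Supply" from the table above

# Implied unrounded USD price per FUNC
implied_price_usd = market_cap_usd / available_supply
print(f"implied FUNC price: ${implied_price_usd:.5f}")  # ~ $0.00625

# Rounded to two decimals, this matches the $0.01 shown on the page.
assert round(implied_price_usd, 2) == 0.01
```

Note the gap between available supply (5,848,581) and total supply (100,000,000): market cap is computed from the circulating portion only, which is why it is far below price times total supply.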

FUNCoin Historical Data

Date    Price    Volume

FUNCoin Videos

Which Activation Function Should I Use?
FunCoin $FUNC Gained 45% During the Past Day

FUNCoin (FUNC) Reviews & Critics

All neural networks use activation functions, but the reasons behind using them are never clear! Let's discuss what activation functions are, when they should be ....

  • Love it
  • Great explanation of activation functions. Now I need to tweak my model.
  • Why the anchor eye pooping out
  • Today I did some testing. I created a neural network with some dense layers, then some LSTM layers, then dense layers again, with linear units on top, since it was a regression problem. Then I tried algorithmic hyperparameter optimisation, which included an activation function search. As it turns out, the best-performing network used relu as the activation for the initial dense layers and tanh for the rest of the network. The second best (the loss was almost identical) used tanh first, then relu for the LSTM layers, then tanh again for the last dense layers. Sigmoid ruined pretty much everything, and relu didn't really work for the last dense layers. My guess is that relu doesn't work that well with negative numbers. I may be wrong.
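The behaviour the commenter guesses at can be seen directly from the functions' definitions: relu discards the sign of negative inputs, tanh preserves it, and sigmoid never produces a negative output at all, which matters for a regression head that must predict negative values. A minimal sketch using only the standard library (the sample inputs are illustrative):

```python
import math

def relu(x):
    # relu zeroes out every negative input, so sign information is lost
    return max(0.0, x)

def tanh(x):
    # tanh squashes inputs into (-1, 1) but preserves their sign
    return math.tanh(x)

def sigmoid(x):
    # sigmoid squashes inputs into (0, 1); its output is never negative
    return 1.0 / (1.0 + math.exp(-x))

for x in (-2.0, -0.5, 0.0, 0.5, 2.0):
    print(f"x={x:5.1f}  relu={relu(x):5.2f}  "
          f"tanh={tanh(x):5.2f}  sigmoid={sigmoid(x):.2f}")
```

For negative inputs relu returns exactly 0 and sigmoid returns values above 0, while tanh returns negative values, consistent with relu and sigmoid performing poorly in the final dense layers of a regression network whose targets can be negative.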
    Copyright © 2019 Coins Critics.