Subtleties of type inference

I am having difficulty understanding why the inferred type signature differs from the one I expect. Here is an example (I tried to keep it as short as possible):

import Control.Applicative
import Data.Word
import Text.ParserCombinators.Parsec
import Text.ParserCombinators.Parsec.Token
import Text.Parsec.Language (emptyDef)
import Text.Parsec.Prim
import Data.Functor.Identity

--parseUInt' :: Num b => ParsecT String u Identity b
parseUInt' = fromInteger <$> decimal (makeTokenParser emptyDef)
--parseUInt1 = fromInteger <$> decimal (makeTokenParser emptyDef)
--parseUInt2 = fromInteger <$> decimal (makeTokenParser emptyDef)

parsePairOfInts = do
    x <- parseUInt'
    char ','
    y <- parseUInt'
    return $ (x, y)

parseLine :: String -> Either ParseError (Word32, Word8)
parseLine = parse parsePairOfInts "(error)"

main = print . show $ parseLine "1,2"

This code does NOT compile:

test.hs:21:19:
    Couldn't match type ‘Word32’ with ‘Word8’
    Expected type: Parsec String () (Word32, Word8)
      Actual type: ParsecT String () Identity (Word32, Word32)
    In the first argument of ‘parse’, namely ‘parsePairOfInts’
    In the expression: parse parsePairOfInts "(error)"
Failed, modules loaded: none.

But if I uncomment the type signature of parseUInt', it compiles just fine.

At the same time, if I ask GHCi for the type of the expression, it looks like this:

λ>:t (fromInteger <$> decimal (makeTokenParser emptyDef))
(fromInteger <$> decimal (makeTokenParser emptyDef))
  :: Num b => ParsecT String u Identity b

But if I do NOT explicitly specify the type signature, the type b is somehow fixed to Word32.

If I replace parseUInt' with two separate functions parseUInt1 and parseUInt2 (identical implementations, just different names), the code compiles as well.
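
For clarity, this is the two-function variant I mean (the commented-out definitions from the example above, used in place of parseUInt'):

parseUInt1 = fromInteger <$> decimal (makeTokenParser emptyDef)
parseUInt2 = fromInteger <$> decimal (makeTokenParser emptyDef)

parsePairOfInts = do
    x <- parseUInt1
    char ','
    y <- parseUInt2
    return $ (x, y)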

I thought that if I did not specify the type of a function, the inferred type would be the least restrictive one (Num b => ...), but apparently it is not.

What am I missing here?



2 answers


I think it's the dreaded MonomorphismRestriction in action. If you do not provide a type signature, GHC tries to infer a concrete type for the binding once it is used at a concrete type elsewhere in the code. GHC sees that parseUInt' is used to parse a Word32 on the first line of parsePairOfInts and fixes it to that type before it reaches the second use of parseUInt' two lines down. This results in a type error, because the type has already been instantiated to Word32 while that second use requires Word8.
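
As a sketch of the same point (keeping the question's code otherwise unchanged): disabling the restriction lets parseUInt' keep its inferred polymorphic type Num b => ParsecT String u Identity b, so the two uses can be instantiated at Word32 and Word8 independently and the code should compile without the explicit signature.

{-# LANGUAGE NoMonomorphismRestriction #-}
import Control.Applicative
import Data.Word
import Text.ParserCombinators.Parsec
import Text.ParserCombinators.Parsec.Token
import Text.Parsec.Language (emptyDef)

-- With the restriction disabled, this binding keeps its inferred
-- polymorphic type: Num b => ParsecT String u Identity b
parseUInt' = fromInteger <$> decimal (makeTokenParser emptyDef)

-- parsePairOfInts is now polymorphic in both components;
-- parseLine below instantiates them at Word32 and Word8.
parsePairOfInts = do
    x <- parseUInt'
    char ','
    y <- parseUInt'
    return (x, y)

parseLine :: String -> Either ParseError (Word32, Word8)
parseLine = parse parsePairOfInts "(error)"

main :: IO ()
main = print (parseLine "1,2")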





It looks like the monomorphism restriction. You have defined something that does not "look" like a polymorphic value (there are no arguments on the left-hand side of the =), so the compiler inferred a monomorphic type for it.

This turns out not to be the type you want, so you have to make your intent to be polymorphic explicit by adding a type signature.
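
A tiny standalone illustration of what "does not look like a polymorphic value" means (the names are made up for the example, nothing to do with Parsec): a binding with an explicit argument is generalised, while an argument-less binding falls under the restriction and is pinned to a single type by its first use.

-- Has an explicit argument, so it is generalised:
-- looksLikeAFunction :: Show a => a -> String
looksLikeAFunction x = show x

-- No arguments: under the monomorphism restriction this binding gets
-- a single monomorphic type, fixed by its first use in main below.
doesNotLookLikeOne = show

main :: IO ()
main = do
  putStrLn (looksLikeAFunction (42 :: Int))  -- fine
  putStrLn (looksLikeAFunction True)         -- fine, still polymorphic
  putStrLn (doesNotLookLikeOne (42 :: Int))  -- fixes its type to Int -> String
  -- putStrLn (doesNotLookLikeOne True)      -- would not compile: already Int -> String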
